Users guide for FRCS: fuel reduction cost simulator software.
Roger D. Fight; Bruce R. Hartsough; Peter Noordijk
2006-01-01
The Fuel Reduction Cost Simulator (FRCS) spreadsheet application is public domain software used to estimate costs for fuel reduction treatments involving removal of trees of mixed sizes in the form of whole trees, logs, or chips from a forest. Equipment production rates were developed from existing studies. Equipment operating cost rates are from December 2002 prices...
Spacelab cost reduction alternatives study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1976-01-01
Alternative approaches to payload operations planning and control and flight crew training are defined for Spacelab payloads with the goal of: lowering FY77 and FY78 costs for new starts; lowering costs to achieve Spacelab operational capability; and minimizing the cost per Spacelab flight. These alternatives attempt to minimize duplication of hardware, software, and personnel, and the investment in supporting facilities and equipment. Of particular importance is the possible reduction of equipment, software, and manpower resources such as computational systems, trainers, and simulators.
Software analyzes feasibility of saw kerf reduction for hardwood mills
Philip H. Steele
2005-01-01
Reductions in saw kerf on head rigs and resaws can dramatically increase lumber recovery in hardwood sawmills. Research has shown that lumber sawing variation reduction will increase lumber recovery above that obtained solely from kerf reduction. Reductions in sawing machine kerf or variation always come at some cost in both capital and variable costs. Determining...
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
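COSTMODL's internal equations are not given in this record; purely as an illustration of what a COCOMO-class estimator computes (effort, calendar schedule, and a distribution across life-cycle phases), a minimal Python sketch with hypothetical coefficients:

```python
# Minimal sketch of a COCOMO-class estimate; coefficients a, b, c, d and
# the phase fractions are illustrative placeholders, not COSTMODL's own.
def estimate(ksloc, a=3.0, b=1.12, c=2.5, d=0.35):
    effort_pm = a * ksloc ** b          # effort in person-months
    schedule_mo = c * effort_pm ** d    # calendar schedule in months
    # Hypothetical phase split; a real tool distributes by life-cycle phase.
    phases = {"design": 0.25, "code": 0.45, "test": 0.30}
    return effort_pm, schedule_mo, {p: f * effort_pm for p, f in phases.items()}

effort, schedule, by_phase = estimate(50.0)   # a 50 KSLOC product
print(f"{effort:.0f} PM over {schedule:.1f} months", by_phase)
```

A real estimator calibrates such coefficients against historical project data rather than using fixed constants.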
Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities
NASA Technical Reports Server (NTRS)
Ross, Richard W.
2001-01-01
The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off the shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and to promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel that work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.
Quality Market: Design and Field Study of Prediction Market for Software Quality Control
ERIC Educational Resources Information Center
Krishnamurthy, Janaki
2010-01-01
Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…
A Formal Application of Safety and Risk Assessment in Software Systems
2004-09-01
characteristics of Software Engineering, Development, and Safety...against a comparison of planned and actual schedules, costs, and characteristics. Software Safety is focused on the reduction of unsafe incidents...they merely carry out the role for which they were anatomically designed. Software is characteristically like an anatomical cell as it merely
The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.
ERIC Educational Resources Information Center
Bontis, Nick; Chung, Honsan
2000-01-01
Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…
SEL's Software Process-Improvement Program
NASA Technical Reports Server (NTRS)
Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose
1995-01-01
The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 FDD projects, the results have guided the standards, management practices, technologies, and the training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process through the study of several measures, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
IPAD products and implications for the future
NASA Technical Reports Server (NTRS)
Miller, R. E., Jr.
1980-01-01
The betterment of productivity through the improvement of product quality and the reduction of cost is addressed. Productivity improvement is sought through (1) reduction of required resources, (2) improved task results through the management of such saved resources, (3) reduced downstream costs through manufacturing-oriented engineering, and (4) lowered risks in the making of product design decisions. The IPAD products are both hardware architecture and software distributed over a number of heterogeneous computers in this architecture. These IPAD products are described in terms of capability and engineering usefulness. The future implications of state-of-the-art IPAD hardware and software architectures are discussed in terms of their impact on the functions and on structures of organizations concerned with creating products.
2006-11-01
engines will involve a family of common components. It will consist of a real-time operating system and partitioned application software (AS...system will employ a standard hardware and software architecture. It will consist of a real-time operating system and partitioned application...Inputs - Enables Large Cost Reduction 3. Software - FAA Certified Auto Code - Real Time Operating System - Commercial
Reuse-Driven Software Processes Guidebook. Version 02.00.03
1993-11-01
a required system without unduly constraining the details of the solution. The Naval Research Laboratory Software Cost Reduction project developed...conventional manner. The emphasis is still on the development of "one-of-a-kind" systems and the phased completion and review of corresponding...Application Engineering to improve the life-cycle productivity of...the total software development enterprise. The
Cost as a technology driver [in aerospace R and D]
NASA Technical Reports Server (NTRS)
Fitzgerald, P. E., Jr.; Savage, M.
1976-01-01
Cost management as a guiding factor in the optimum development of technology, and the proper timing of cost-saving programs in the development of a system or technology with payoffs in development and operational advances, are discussed and illustrated. Advances enhancing the performance of hardware or software, and advances raising productivity or reducing cost, are outlined, with examples drawn from: thermochemical thrust maximization, development of cryogenic storage tanks, improvements in fuel cells for Space Shuttle, design of a spacecraft pyrotechnic initiator, cost cutting by reduction in the number of parts to be joined, and cost cutting by dramatic reductions in circuit component count with small-scale double-diffused integrated circuitry. Program-focused supporting research and technology models are devised to aid the judicious timing of cost-conscious research programs.
Integrated Design Tools Reduce Risk, Cost
NASA Technical Reports Server (NTRS)
2012-01-01
Thanks in part to an SBIR award with Langley Research Center, Phoenix Integration Inc., based in Wayne, Pennsylvania, modified and advanced software for process integration and design automation. For NASA, the tool has resulted in lower project costs and reductions in design time; clients of Phoenix Integration are experiencing the same rewards.
The nature of knowledge and how to account for it
NASA Astrophysics Data System (ADS)
Burgos, M. C. G.; De la Peña, F. D. E.
2014-10-01
One of the key challenges in science and technology is the evaluation of knowledge generated in either one or the other. This paper presents a case study where the use of acquired knowledge, in the form of lessons learned, applied to process improvement meets the target of reducing production costs for secondary (recycled) aluminium, and shows how an intangible (knowledge embodied in software) can become tangible (a reduction in production costs), and thus eligible to be reported on the company's balance sheet.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost, and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle: quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control, and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, D.
1997-11-01
This report contains viewgraphs on the CINC mobile alternate headquarters and the tech control automation, maintenance and support facility. Discussions of software cost reduction through reuse and of managing risk by assessment and mitigation are also given.
Using Open Source Software in Visual Simulation Development
2005-09-01
increased the use of the technology in training activities. Using open source/free software tools in the process can expand these possibilities...resulting in even greater cost reduction and allowing the flexibility needed in a training environment. This thesis presents a configuration and architecture...to be used when developing training visual simulations using both personal computers and open source tools. Aspects of the requirements needed in a
SMART-FDIR: Use of Artificial Intelligence in the Implementation of a Satellite FDIR
NASA Astrophysics Data System (ADS)
Guiotto, A.; Martelli, A.; Paccagnini, C.
Nowadays space activities are characterized by increased constraints in terms of on-board computing power and functional complexity, combined with reductions in cost and schedule. This scenario necessarily has an impact on the on-board software, with particular emphasis on the interfaces between on-board software and system/mission level requirements. The questions are: How can the effectiveness of Space System Software design be improved? How can we increase sophistication in the area of autonomy and failure tolerance, maintaining the necessary quality with acceptable risks?
Software engineering and Ada in design
NASA Technical Reports Server (NTRS)
Oneill, Don
1986-01-01
Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ian Metzger, Jesse Dean
2010-12-31
This software requires inputs of simple water fixture inventory information and calculates the water/energy and cost benefits of various retrofit opportunities. This tool includes water conservation measures for: Low-flow Toilets, Low-flow Urinals, Low-flow Faucets, and Low-flow Showerheads. This tool calculates water savings, energy savings, demand reduction, cost savings, and building life cycle costs including: simple payback, discounted payback, net-present value, and savings to investment ratio. In addition, this tool displays the environmental benefits of a project.
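The financial metrics named above follow standard formulas; a minimal sketch with hypothetical retrofit inputs (a $1,200 fixture package saving $300/year over 15 years at a 3% discount rate):

```python
# Simple payback, discounted payback, NPV, and savings-to-investment
# ratio for a retrofit measure; formulas are standard, inputs hypothetical.
def metrics(cost, annual_savings, years=15, rate=0.03):
    simple_payback = cost / annual_savings
    discounted = [annual_savings / (1 + rate) ** t for t in range(1, years + 1)]
    npv = sum(discounted) - cost
    sir = sum(discounted) / cost
    # Discounted payback: first year cumulative discounted savings cover cost.
    cum, dpb = 0.0, None
    for t, s in enumerate(discounted, start=1):
        cum += s
        if dpb is None and cum >= cost:
            dpb = t
    return simple_payback, dpb, npv, sir

print(metrics(cost=1200.0, annual_savings=300.0))
```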
NASA Technical Reports Server (NTRS)
Cowderoy, A. J. C.; Jenkins, John O.; Poulymenakou, A
1992-01-01
The tendency for software development projects to be completed over schedule and over budget has been documented extensively. Additionally, many projects are completed within budgetary and schedule targets only as a result of the customer agreeing to accept reduced functionality. In his classic book, The Mythical Man-Month, Fred Brooks exposes the fallacy that effort and schedule are freely interchangeable. All current cost models are produced on the assumption that there is very limited scope for schedule compression unless there is a corresponding reduction in delivered functionality. The Metrication and Resources Modeling Aid (MERMAID) project, partially financed by the Commission of the European Communities (CEC) as Project 2046, began in Oct. 1988 and its goals were as follows: (1) to improve understanding of the relationships between software development productivity and product and process metrics; (2) to facilitate widespread technology transfer from the Consortium to the European software industry; and (3) to facilitate the widespread uptake of cost estimation techniques by the provision of prototype cost estimation tools. MERMAID developed a family of methods for cost estimation, many of which have had tools implemented in prototypes. These prototypes are best considered as toolkits or workbenches.
A low cost Doppler system for vascular dialysis access surveillance.
Molina, P S C; Moraes, R; Baggio, J F R; Tognon, E A
2004-01-01
The National Kidney Foundation guidelines for vascular access recommend access surveillance to avoid morbidity among patients undergoing hemodialysis. Methods to detect access failure based on CW Doppler systems are being proposed to implement surveillance programs at lower cost. This work describes a low cost Doppler system implemented in a PC notebook designed to carry out this task. A Doppler board samples the blood flow velocity and delivers demodulated quadrature Doppler signals. These signals are sampled by a notebook sound card. Software for Windows OS (running on the notebook) applies a complex FFT (CFFT) to consecutive 11.6 ms intervals of the Doppler signal. The sonogram is presented on the screen in real time. The software also calculates the maximum and the intensity weighted mean frequency envelopes. Since similar systems employ DSP boards to process the Doppler signals, cost reduction was achieved. The Doppler board electronic circuits and routines to process the Doppler signals are presented.
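A minimal sketch of the described processing chain, using a synthetic quadrature (I/Q) signal in place of the Doppler board output and an assumed 44.1 kHz sound-card rate:

```python
import numpy as np

# Complex FFT over consecutive ~11.6 ms blocks of an I/Q signal, then the
# maximum and intensity-weighted mean frequency envelopes per column.
fs = 44100                             # sound-card sample rate (assumed)
block = int(fs * 0.0116)               # ~11.6 ms per block
t = np.arange(fs) / fs
iq = np.exp(2j * np.pi * 1500 * t)     # synthetic 1.5 kHz Doppler shift

freqs = np.fft.fftshift(np.fft.fftfreq(block, d=1 / fs))
max_env, mean_env = [], []
for k in range(len(iq) // block):
    spec = np.fft.fftshift(np.fft.fft(iq[k * block:(k + 1) * block]))
    power = np.abs(spec) ** 2          # one sonogram column
    max_env.append(freqs[np.argmax(power)])
    mean_env.append(np.sum(freqs * power) / np.sum(power))  # weighted mean

print(f"{max_env[0]:.0f} Hz peak, {mean_env[0]:.0f} Hz mean")
```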
Incubator Display Software Cost Reduction Toolset Software Requirements Specification
NASA Technical Reports Server (NTRS)
Moran, Susanne; Jeffords, Ralph
2005-01-01
The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.
Agile: From Software to Mission Systems
NASA Technical Reports Server (NTRS)
Trimble, Jay; Shirley, Mark; Hobart, Sarah
2017-01-01
To maximize efficiency and flexibility in Mission Operations System (MOS) design, we are evolving principles from agile and lean software methods to the complete mission system. This allows for reduced operational risk at reduced cost, and achieves a more effective design through early integration of operations into mission system engineering and flight system design. The core principles are assessment of capability through demonstration, risk reduction through targeted experiments, early test and deployment, and maturation of processes and tools through use.
NASA Astrophysics Data System (ADS)
Perry, Alexander R.
2002-06-01
Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.
ATE accomplishes receiver specification testing with increased speed and throughput
NASA Astrophysics Data System (ADS)
Moser, S. A.
1982-12-01
The use of automatic test equipment (ATE) for receiver specifications testing can result in a 90-95% reduction of test time, with a corresponding reduction of labor costs due both to the reduction of personnel numbers and a simplification of tasks that permits less skilled personnel to be employed. These benefits free high-level technicians for more challenging system management assignments. Accuracy and repeatability also improve with the adoption of ATE, since no possibility of human error can be introduced into the readings that are taken by the system. A massive and expensive software design and development effort is identified as the most difficult aspect of ATE implementation, since programming is both time-consuming and labor intensive. An attempt is therefore made by system manufacturers to conduct an integrated development program for both ATE system hardware and software.
SMI Compatible Simulation Scheduler Design for Reuse of Model Complying with Smp Standard
NASA Astrophysics Data System (ADS)
Koo, Cheol-Hea; Lee, Hoon-Hee; Cheon, Yee-Jin
2010-12-01
Software reusability is one of the key factors which impact cost and schedule on a software development project. It is also very crucial in satellite simulator development, since there are many commercial simulator models related to satellites and dynamics. If these models can be used in another simulator platform, a great deal of confidence and cost/schedule reduction would be achieved. Simulation model portability (SMP) is maintained by the European Space Agency and many models compatible with SMP/simulation model interface (SMI) are available. The Korea Aerospace Research Institute (KARI) is developing a hardware abstraction layer (HAL) supported satellite simulator to verify the on-board software of satellites. For the above reasons, KARI wants to port these SMI compatible models to the HAL supported satellite simulator. To that end, a simulation scheduler has been preliminarily designed according to the SMI standard.
NASA Technical Reports Server (NTRS)
Lu, George C.
2003-01-01
The purpose of the EXPRESS (Expedite the PRocessing of Experiments to Space Station) rack project is to provide a set of predefined interfaces for scientific payloads which allow rapid integration into a payload rack on the International Space Station (ISS). VxWorks was selected as the operating system for the rack and payload resource controller, primarily based on the proliferation of VME (Versa Module Eurocard) products. These products provide needed flexibility for future hardware upgrades to meet ever-changing science research rack configuration requirements. On the International Space Station, there are multiple science research rack configurations, including: 1) Human Research Facility (HRF); 2) EXPRESS ARIS (Active Rack Isolation System); 3) WORF (Window Observational Research Facility); and 4) HHR (Habitat Holding Rack). The RIC (Rack Interface Controller) connects payloads to the ISS bus architecture for data transfer between the payload and ground control. The RIC is a general purpose embedded computer which supports multiple communication protocols, including fiber optic communication buses, Ethernet buses, EIA-422, Mil-Std-1553 buses, SMPTE (Society of Motion Picture and Television Engineers)-170M video, and audio interfaces to payloads and the ISS. As a cost saving and software reliability strategy, the Boeing Payload Software Organization developed reusable common software where appropriate. These reusable modules included a set of low-level driver software interfaces to 1553B, RS232, RS422, Ethernet buses, HRDL (High Rate Data Link), video switch functionality, telemetry processing, and executive software hosted on the RIC computer. These drivers formed the basis for software development of the HRF, EXPRESS, EXPRESS ARIS, WORF, and HHR RIC executable modules. The reusable RIC common software has provided extensive benefits, including: 1) Significant reduction in development flow time; 2) Minimal rework and maintenance; 3) Improved reliability; and 4) Overall reduction in software life cycle cost. Due to the limited number of crew hours available on ISS for science research, operational efficiency is a critical customer concern. The current method of upgrading RIC software is a time consuming process; thus, an improved methodology for uploading RIC software is currently under evaluation.
ISS Operations Cost Reductions Through Automation of Real-Time Planning Tasks
NASA Technical Reports Server (NTRS)
Hall, Timothy A.; Clancey, William J.; McDonald, Aaron; Toschlog, Jason; Tucker, Tyson; Khan, Ahmed; Madrid, Steven (Eric)
2011-01-01
In 2007 the Johnson Space Center's Mission Operations Directorate (MOD) management team challenged their organizations to find ways to reduce the cost of operations for supporting the International Space Station (ISS) in the Mission Control Center (MCC). Each MOD organization was asked to define and execute projects that would help them attain cost reductions by 2012. The MOD Operations Division Flight Planning Branch responded to this challenge by launching several software automation projects that would allow them to greatly improve console operations, reduce ISS console staffing, and in turn reduce operating costs. These tasks ranged from improving the management and integration of mission plan changes, to automating the uploading and downloading of information to and from the ISS and the associated ground complex tasks that required multiple decision points. The software solutions leveraged several different technologies including customized web applications and implementation of industry standard web services architecture; as well as engaging a previously TRL 4-5 technology developed by Ames Research Center (ARC) that utilized an intelligent agent-based system to manage and automate file traffic flow, archive data, and generate console logs. These projects to date have allowed the MOD Operations organization to remove one full time (7 x 24 x 365) ISS console position in 2010; with the goal of eliminating a second full time ISS console support position by 2012. The team will also reduce one long range planning console position by 2014. When complete, these Flight Planning Branch projects will account for the elimination of 3 console positions and a reduction in staffing of 11 engineering personnel (EP) for ISS.
Launch vehicle operations cost reduction through artificial intelligence techniques
NASA Technical Reports Server (NTRS)
Davis, Tom C., Jr.
1988-01-01
NASA's Kennedy Space Center has attempted to develop AI methods in order to reduce the cost of launch vehicle ground operations as well as to improve the reliability and safety of such operations. Attention is presently given to cost savings estimates for systems involving launch vehicle firing-room software and hardware real-time diagnostics, as well as the nature of configuration control and the real-time autonomous diagnostics of launch-processing systems by these means. Intelligent launch decisions and intelligent weather forecasting are additional applications of AI being considered.
Workstation-Based Simulation for Rapid Prototyping and Piloted Evaluation of Control System Designs
NASA Technical Reports Server (NTRS)
Mansur, M. Hossein; Colbourne, Jason D.; Chang, Yu-Kuang; Aiken, Edwin W. (Technical Monitor)
1998-01-01
The development and optimization of flight control systems for modern fixed- and rotary-wing aircraft consume a significant portion of the overall time and cost of aircraft development. Substantial savings can be achieved if the time and cost required to develop and flight test the control system are reduced. To bring about such reductions, software tools such as Matlab/Simulink are being used to readily implement block diagrams and rapidly evaluate the expected responses of the completed system. Moreover, tools such as CONDUIT (CONtrol Designer's Unified InTerface) have been developed that enable controls engineers to optimize their control laws and ensure that all the relevant quantitative criteria are satisfied, all within a fully interactive, user-friendly, unified software environment.
Software for Probabilistic Risk Reduction
NASA Technical Reports Server (NTRS)
Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto
2004-01-01
A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
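The program's internal representation is not described in this record; a minimal sketch of the planning idea it names, selecting the suite of risk-mitigation actions that maximizes the expectation of success within a budget, with hypothetical risks, costs, and effects:

```python
from itertools import combinations

# Each action has a cost and a fractional reduction of specific failure
# modes; exhaustive search over affordable subsets. All numbers are
# hypothetical illustrations, not the tool's data model.
risks = {"design_flaw": 0.30, "integration": 0.20}        # P(failure mode)
actions = {   # action: (cost, {risk: fractional reduction})
    "peer_review": (40.0, {"design_flaw": 0.5}),
    "prototyping": (70.0, {"design_flaw": 0.6, "integration": 0.3}),
    "early_test":  (50.0, {"integration": 0.7}),
}

def p_success(chosen):
    p = 1.0
    for risk, pf in risks.items():
        remaining = pf
        for a in chosen:
            remaining *= 1.0 - actions[a][1].get(risk, 0.0)
        p *= 1.0 - remaining
    return p

budget = 100.0
best = max((subset for r in range(len(actions) + 1)
            for subset in combinations(actions, r)
            if sum(actions[a][0] for a in subset) <= budget),
           key=p_success)
print(best, round(p_success(best), 3))
```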
NASA Technical Reports Server (NTRS)
1976-01-01
Technologies required to support the stated OAST thrust to increase information return by a factor of 1000, while reducing costs by a factor of 10, are identified. The most significant driver is the need for an overall end-to-end data system management technology. Maximum use of LSI component technology and trade-offs between hardware and software are manifest in almost all considerations of technology needs. By far the greatest need for data handling technology was identified for the Space Exploration and Global Services themes. Major advances are needed in NASA's ability to provide cost-effective mass reduction of space data, and automated assessment of earth-looking imagery, with a concomitant reduction in cost per useful bit. A combined approach embodying end-to-end system analysis, with onboard data set selection, onboard data processing, highly parallel image processing (both ground and space), low cost, high capacity memories, and low cost user data distribution systems would be necessary.
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
rpe v5: an emulator for reduced floating-point precision in large numerical simulations
NASA Astrophysics Data System (ADS)
Dawson, Andrew; Düben, Peter D.
2017-06-01
This paper describes the rpe (reduced-precision emulator) library which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the result of their simulations without having to make extensive code changes or port the model onto specialized hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the community of weather and climate modelling, but the software could be used with numerical simulations from other domains.
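rpe itself is a Fortran library and its API is not reproduced here; a minimal Python sketch of the underlying emulation idea, rounding away significand bits so ordinary double-precision arithmetic behaves like a reduced-precision type:

```python
import numpy as np

# Round a float64 array so only `sbits` significand bits survive,
# mimicking a lower-precision type while computing in hardware floats.
# This is the emulation idea only, not the rpe library's interface.
def reduce_precision(x, sbits=10):     # 10 explicit bits ~ half precision
    x = np.asarray(x, dtype=np.float64)
    m, e = np.frexp(x)                 # x = m * 2**e with 0.5 <= |m| < 1
    m = np.round(m * 2 ** sbits) / 2 ** sbits
    return np.ldexp(m, e)

a = np.array([1.0, np.pi, 1e-7])
print(reduce_precision(a), reduce_precision(a) - a)   # values and rounding error
```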
NASA Astrophysics Data System (ADS)
Berlin, Julian; Bogaard, Thom; Van Westen, Cees; Bakker, Wim; Mostert, Eric; Dopheide, Emile
2014-05-01
Cost-benefit analysis (CBA) is a well-known method used widely for the assessment of investments in both the private and public sectors. In the context of risk mitigation and the evaluation of risk reduction alternatives for natural hazards, its use is very important to evaluate the effectiveness of such efforts in terms of avoided monetary losses. However, the current method has some disadvantages related to the spatial distribution of the costs and benefits, the geographical distribution of the avoided damage and losses, and the variation in areas that are benefited in terms of invested money and avoided monetary risk. Decision-makers are often interested in how the costs and benefits are distributed among different administrative units of a large area or region, so that they will be able to compare and analyse the costs and benefits per administrative unit as a result of the implementation of the risk reduction projects. In this work we first examined the cost-benefit procedure for natural hazards and how the costs are assessed for several structural and non-structural risk reduction alternatives; we also examined the current problems of the method, such as the inclusion of cultural and social considerations that are complex to monetize, the problem of discounting future values using a defined interest rate, and the spatial distribution of costs and benefits. We also examined the additional benefits and the indirect costs associated with the implementation of the risk reduction alternatives, such as the cost of having an ugly landscape (also called negative benefits). In the last part we examined the current tools and software used in natural hazards assessment with support for conducting CBA, and we propose design considerations for the implementation of the CBA module for the CHANGES-SDSS Platform, an initiative of the ongoing 7th Framework Programme "CHANGES" of the European Commission. Keywords: Risk management, Economics of risk mitigation, EU Flood Directive, resilience, prevention, cost benefit analysis, spatial distribution of costs and benefits
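As a minimal sketch of the spatially disaggregated accounting the authors argue for, computing a discounted benefit-cost ratio per administrative unit (the units, cost shares, and avoided annual losses are hypothetical):

```python
# Discounted benefits (avoided annual losses) and allocated project costs
# per administrative unit; all inputs are illustrative placeholders.
def npv(stream, rate):
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream, start=1))

rate, horizon = 0.04, 30
units = {  # unit: (share of project cost, avoided annual loss)
    "district_A": (0.6, 120_000.0),
    "district_B": (0.4, 40_000.0),
}
project_cost = 1_500_000.0

for unit, (share, avoided) in units.items():
    benefits = npv([avoided] * horizon, rate)
    cost = share * project_cost
    print(unit, round(benefits / cost, 2))   # benefit-cost ratio per unit
```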
Using the SCR Specification Technique in a High School Programming Course.
ERIC Educational Resources Information Center
Rosen, Edward; McKim, James C., Jr.
1992-01-01
Presents the underlying ideas of the Software Cost Reduction (SCR) approach to requirements specifications. Results of applying this approach to the teaching of programming to high school students indicate that students perform better in writing programs. An appendix provides two examples of how the method is applied to problem solving. (MDH)
Feasibility study for automatic reduction of phase change imagery
NASA Technical Reports Server (NTRS)
Nossaman, G. O.
1971-01-01
The feasibility of automatically reducing a form of pictorial aerodynamic heating data is discussed. The imagery, depicting the melting history of a thin coat of fusible temperature indicator painted on an aerodynamically heated model, was previously reduced by manual methods. Careful examination of various lighting theories and approaches led to an experimentally verified illumination concept capable of yielding high-quality imagery. Both digital and video image processing techniques were applied to reduction of the data, and it was demonstrated that either method can be used to develop superimposed contours. Mathematical techniques were developed to find the model-to-image and the inverse image-to-model transformation using six conjugate points, and methods were developed using these transformations to determine heating rates on the model surface. A video system was designed which is able to reduce the imagery rapidly, economically and accurately. Costs for this system were estimated. A study plan was outlined whereby the mathematical transformation techniques developed to produce model coordinate heating data could be applied to operational software, and methods were discussed and costs estimated for obtaining the digital information necessary for this software.
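The report's exact transformation technique is not reproduced in this record; as one plausible modern illustration, a least-squares fit of a 2-D affine model-to-image map, which has six parameters, so six conjugate points overdetermine it (all coordinates below are hypothetical):

```python
import numpy as np

# Least-squares 2-D affine transform from conjugate point pairs; the
# image points and model points below are made up for illustration.
img = np.array([[10, 12], [40, 15], [22, 60], [55, 58], [30, 33], [48, 40]], float)
mdl = np.array([[1.0, 1.2], [4.1, 1.4], [2.2, 6.1], [5.6, 5.9], [3.0, 3.3], [4.9, 4.0]])

A = np.hstack([img, np.ones((len(img), 1))])     # [x, y, 1] rows
coef, *_ = np.linalg.lstsq(A, mdl, rcond=None)   # 3x2 affine parameters

def to_model(p):                                 # image -> model coordinates
    return np.hstack([p, 1.0]) @ coef

print(to_model(np.array([30.0, 33.0])))          # maps near (3.0, 3.3)
```

The inverse (model-to-image) map follows from fitting the same system with the roles of the point sets swapped.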
System Risk Balancing Profiles: Software Component
NASA Technical Reports Server (NTRS)
Kelly, John C.; Sigal, Burton C.; Gindorf, Tom
2000-01-01
The Software QA / V&V guide will be reviewed and updated based on feedback from NASA organizations and others with a vested interest in this area. Hardware, EEE Parts, Reliability, and Systems Safety are a sample of the future guides that will be developed. Cost Estimates, Lessons Learned, Probability of Failure and PACTS (Prevention, Avoidance, Control or Test) are needed to provide a more complete risk management strategy. This approach to risk management is designed to help balance the resources and program content for risk reduction for NASA's changing environment.
Design of a stateless low-latency router architecture for green software-defined networking
NASA Astrophysics Data System (ADS)
Saldaña Cercós, Silvia; Ramos, Ramon M.; Ewald Eller, Ana C.; Martinello, Magnos; Ribeiro, Moisés. R. N.; Manolova Fagertun, Anna; Tafur Monroy, Idelfonso
2015-01-01
Expanding software defined networking (SDN) to transport networks requires new strategies to deal with the large number of flows that future core networks will have to face. New south-bound protocols within SDN have been proposed to benefit from having the control plane detached from the data plane, offering a cost- and energy-efficient forwarding engine. This paper presents an overview of a new approach named KeyFlow to simultaneously reduce latency, jitter, and power consumption in core network nodes. Results on an emulation platform indicate that round trip time (RTT) can be reduced by more than 50% compared to the reference protocol OpenFlow, especially when flow tables are densely populated. Jitter reduction has been demonstrated experimentally on a NetFPGA-based platform, and a 57.3% power consumption reduction has been achieved.
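The KeyFlow literature describes route labels built on a residue number system: each core node's ID acts as a modulus, and the label is assembled with the Chinese Remainder Theorem so that the label modulo a node's ID yields that node's output port, with no per-flow table lookup. A minimal sketch with illustrative node IDs and ports:

```python
from math import prod   # Python 3.8+

# CRT-based stateless forwarding sketch. Node IDs must be pairwise
# coprime, and each output port must be smaller than its node's ID.
def route_key(node_ids, out_ports):
    N = prod(node_ids)
    key = 0
    for n, port in zip(node_ids, out_ports):
        Ni = N // n
        key += port * Ni * pow(Ni, -1, n)   # modular inverse of Ni mod n
    return key % N

ids, ports = [7, 11, 13], [2, 5, 1]          # a path of three core nodes
k = route_key(ids, ports)
print([k % n for n in ids])                  # each node derives its port: [2, 5, 1]
```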
Taking the Observatory to the Astronomer
NASA Astrophysics Data System (ADS)
Bisque, T. M.
1997-05-01
Since 1992, Software Bisque's Remote Astronomy Software has been used by the Mt. Wilson Institute to allow interactive control of a 24" telescope and digital camera via modem. Software Bisque now introduces a comparable, relatively low-cost observatory system that allows powerful, yet "user-friendly" telescope and CCD camera control via the Internet. Utilizing software developed for the Windows 95/NT operating systems, the system offers point-and-click access to comprehensive celestial databases, extremely accurate telescope pointing, rapid download of digital CCD images by one or many users, and flexible image processing software for data reduction and analysis. Our presentation will describe how the power of the personal computer has been leveraged to provide professional-level tools to the amateur astronomer, and include a description of this system's software and hardware components. The system software includes TheSky Astronomy Software, CCDSoft CCD Astronomy Software, the TPoint Telescope Pointing Analysis System software, Orchestrate and, optionally, the RealSky CDs. The system hardware includes the Paramount GT-1100 Robotic Telescope Mount, as well as third party CCD cameras, focusers and optical tube assemblies.
Applications Development for a Parallel COTS Spaceborne Computer
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Springer, Paul L.; Granat, Robert; Turmon, Michael
2000-01-01
This presentation reviews the Remote Exploration and Experimentation Project (REE) program for utilization of scalable supercomputing technology in space. The implementation of REE will be the use of COTS hardware and software to the maximum extent possible, keeping overhead low. Since COTS systems will be used, with little or no special modification, there will be significant cost reduction.
A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System
NASA Astrophysics Data System (ADS)
Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.
2010-05-01
The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.
NASA Technical Reports Server (NTRS)
Garrocq, C. A.; Hurley, M. J.; Dublin, M.
1973-01-01
A baseline implementation plan, including alternative implementation approaches for critical software elements and variants to the plan, was developed. The basic philosophy was aimed at: (1) a progressive release of capability for three major computing systems, (2) an end product that was a working tool, (3) giving participation to industry, government agencies, and universities, and (4) emphasizing the development of critical elements of the IPAD framework software. The results of these tasks indicate an IPAD first release capability 45 months after go-ahead, a five year total implementation schedule, and a total developmental cost of 2027 man-months and 1074 computer hours. Several areas of operational cost increases were identified mainly due to the impact of additional equipment needed and additional computer overhead. The benefits of an IPAD system were related mainly to potential savings in engineering man-hours, reduction of design-cycle calendar time, and indirect upgrading of product quality and performance.
Evolution of Archival Storage (from Tape to Memory)
NASA Technical Reports Server (NTRS)
Ramapriyan, Hampapuram K.
2015-01-01
Over the last three decades, there has been a significant evolution in storage technologies supporting archival of remote sensing data. This section provides a brief survey of how these technologies have evolved. Three main technologies are considered - tape, hard disk and solid state disk. Their historical evolution is traced, summarizing how reductions in cost have helped being able to store larger volumes of data on faster media. The cost per GB of media is only one of the considerations in determining the best approach to archival storage. Active archives generally require faster response to user requests for data than permanent archives. The archive costs have to consider facilities and other capital costs, operations costs, software licenses, utilities costs, etc. For meeting requirements in any organization, typically a mix of technologies is needed.
Helicopter Maritime Environment Trainer: Software Test Document
2011-06-01
Toronto was requested by CF to investigate the potential use of low cost, virtual reality technologies for this purpose, following a successful...demonstration of these technologies for training ship handling skills and reductions of sea time. Landing on...Simulator Preliminary Specification (Updated) b. DRDC Toronto Report Helicopter Deck Landing Simulator: Technology Demonstrator by F.A
NASA Technical Reports Server (NTRS)
1975-01-01
The results of the third and final phase of a study undertaken to define means of optimizing the Spacelab experiment data system by interactively manipulating the flow of data are presented. A number of payload-applicable interactive techniques and an integrated interaction system for each of two possible payloads are described. These interaction systems have been functionally defined and are accompanied by block diagrams, hardware specifications, software sizing and speed requirements, operational procedures, and cost/benefit analysis data for both onboard and ground based system elements. It is shown that the accrued benefits are attributable to a reduction in data processing costs, obtained generally through a considerable reduction in the quantity of data that might otherwise be generated without interaction. An additional anticipated benefit is the increased scientific value obtained by the quicker return of all useful data.
Object-Oriented Technology-Based Software Library for Operations of Water Reclamation Centers
NASA Astrophysics Data System (ADS)
Otani, Tetsuo; Shimada, Takehiro; Yoshida, Norio; Abe, Wataru
SCADA systems in water reclamation centers have been constructed based on hardware and software that each manufacturer produced according to their own design. Although this approach was effective for realizing real-time, reliable execution, it is an obstacle to reducing the costs of system construction and maintenance. A promising solution to address the problem is to set specifications that can be used commonly. In terms of software, the information model approach has been adopted in SCADA systems in other fields, such as telecommunications and power systems. An information model is a piece of software specification that describes a physical or logical object to be monitored. In this paper, we propose information models for operations of water reclamation centers, which did not previously exist. In addition, we show the feasibility of the information model in terms of common use and processing performance.
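The paper's actual model definitions are not reproduced in this record; a minimal sketch of what an information model for one monitored object might look like, using a hypothetical pump with a single analog point:

```python
from dataclasses import dataclass, field

# Sketch of an information model: a named physical object plus the points
# a SCADA system would read from it. Names and values are hypothetical.
@dataclass
class AnalogPoint:
    name: str
    unit: str
    value: float = 0.0

@dataclass
class Pump:
    tag: str                       # plant-wide identifier
    running: bool = False
    points: dict = field(default_factory=dict)

    def add_point(self, p: AnalogPoint):
        self.points[p.name] = p

pump = Pump(tag="P-101")
pump.add_point(AnalogPoint("discharge_pressure", "kPa", 412.0))
print(pump.tag, pump.points["discharge_pressure"].value)
```

The point of such a model is that any vendor's software can monitor the object through the same declared structure.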
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
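The paper's calibrated models are not given in this summary; a purely illustrative sketch of the kind of relationship described, an injected-error estimate scaled by multipliers and a delivered error rate that falls as the IV&V share of project labor rises (all coefficients are hypothetical):

```python
import math

# Illustrative only: not the paper's models. Expected injected errors
# scale with size, criticality, environment, and competence multipliers;
# the delivered rate falls with the IV&V labor fraction.
def expected_errors(ksloc, criticality=1.0, environment=1.0, competence=1.0,
                    base_per_ksloc=6.0):
    return base_per_ksloc * ksloc * criticality * environment * competence

def delivered_error_rate(injected_per_ksloc, ivv_labor_fraction, k=4.0):
    # Hypothetical diminishing-returns removal curve.
    return injected_per_ksloc * math.exp(-k * ivv_labor_fraction)

print(expected_errors(100, criticality=1.3))       # errors injected
print(round(delivered_error_rate(6.0, 0.15), 2))   # errors/KSLOC delivered
```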
An Adaptable Power System with Software Control Algorithm
NASA Technical Reports Server (NTRS)
Castell, Karen; Bay, Mike; Hernandez-Pellerano, Amri; Ha, Kong
1998-01-01
A low cost, flexible and modular spacecraft power system design was developed in response to a call for an architecture that could accommodate multiple missions in the small to medium load range. Three upcoming satellites will use this design, with one launch date in 1999 and two in the year 2000. The design consists of modular hardware that can be scaled up or down, without additional cost, to suit missions in the 200 to 600 Watt orbital average load range. The design will be applied to satellite orbits that are circular, polar elliptical and a libration point orbit. Mission unique adaptations are accomplished in software and firmware. In designing this advanced, adaptable power system, the major goals were reductions in weight, volume, and cost. This power system design represents reductions in weight of 78 percent, volume of 86 percent and cost of 65 percent from previous comparable systems. The efforts to miniaturize the electronics without sacrificing performance have created streamlined power electronics with control functions residing in the system microprocessor. The power system design can handle any battery size up to 50 Amp-hours and any battery technology. The three current implementations will use both nickel cadmium and nickel hydrogen batteries ranging in size from 21 to 50 Amp-hours. Multiple batteries can be used by adding another battery module. Any solar cell technology can be used and various array layouts can be incorporated with no change in Power System Electronics (PSE) hardware. Other features of the design are the standardized interfaces between cards and subsystems and immunity to radiation effects up to 30 krad Total Ionizing Dose (TID) and 35 MeV/cm(exp 2)-kg for Single Event Effects (SEE). The control algorithm for the power system resides in a radiation-hardened microprocessor. A table driven software design allows for flexibility in mission specific requirements. By storing critical power system constants in memory, modifying the system code for other programs is simple. These constants can also be altered by ground command, or in response to an anomalous event. All critical power system functions have backup hardware functions to prevent a software or computer glitch from propagating. A number of battery charge control schemes can be implemented by selecting the proper control terms in the code. The architecture allows the design engineer to tune the system response to various system components and anticipated load profiles without costly alterations. A design trade was made with the size, weight and power dissipation of the electronics versus the performance of the power bus to load variations. Linear, fine control is maintained with a streamlined electronics design. This paper describes the hardware design as well as the software control algorithm. The challenges of closing the system control loop digitally are discussed. Control loop margin and power system performance are presented. Lab measurements are shown and compared to the system response of a hardware model running actual flight software.
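A minimal sketch of the table-driven idea described, in Python rather than flight code: critical constants and the selected charge-control scheme live in a table that a ground command could patch (all names and values are hypothetical):

```python
# Mission-specific constants live in a table rather than in the logic;
# "ground command" updates amount to editing the table. Values invented.
CONSTANTS = {
    "battery_ah": 50.0,
    "vt_limit": 34.2,            # taper voltage limit, volts
    "charge_scheme": "vt_taper",
}

def vt_taper(v_batt, i_avail, table):
    # Cut charge current once battery voltage reaches the limit.
    return 0.0 if v_batt >= table["vt_limit"] else i_avail

def trickle(v_batt, i_avail, table):
    return min(i_avail, 0.02 * table["battery_ah"])   # C/50 trickle

SCHEMES = {"vt_taper": vt_taper, "trickle": trickle}

def charge_command(v_batt, i_avail, table=CONSTANTS):
    return SCHEMES[table["charge_scheme"]](v_batt, i_avail, table)

print(charge_command(33.8, 5.0))          # below limit: full available current
CONSTANTS["charge_scheme"] = "trickle"    # a "ground command" swaps the scheme
print(charge_command(33.8, 5.0))
```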
Sneak Analysis Application Guidelines
1982-06-01
List of tables (extract): Hardware Program Change Cost Trend, Airborne Environment; Relative Software Program Change Costs; Derived Software Program Change Cost by Phase, Airborne Environment; Derived Software Program Change Cost by Phase, Ground/Water Environment; Total Software Program Change Costs; Sneak Analysis.
Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C
2001-01-01
Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference with TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes), and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).
Operations analysis (study 2.1): Shuttle upper stage software requirements
NASA Technical Reports Server (NTRS)
Wolfe, R. R.
1974-01-01
An investigation of software costs related to space shuttle upper stage operations with emphasis on the additional costs attributable to space servicing was conducted. The questions and problem areas include the following: (1) the key parameters involved with software costs; (2) historical data for extrapolation of future costs; (3) elements of the basic software development effort that are applicable to servicing functions; (4) the effect of multiple servicing on complexity of the operation; and (5) the significance of recurring software costs. The results address these questions and provide a foundation for estimating software costs based on the costs of similar programs and a series of empirical factors.
Which factors affect software projects maintenance cost more?
Dehaghani, Sayed Mehdi Hejazi; Hajrahimi, Nafiseh
2013-03-01
The software industry has had significant progress in recent years. The entire life of software includes two phases: production and maintenance. Software maintenance cost is growing steadily, and estimates show that about 90% of software life cost is related to the maintenance phase. Extracting and considering the factors affecting software maintenance cost helps to estimate the cost and to reduce it by controlling those factors. In this study, the factors affecting software maintenance cost were determined and then ranked by priority, after which effective ways to reduce maintenance costs were presented. This paper is a research study. Fifteen software systems related to health care center information systems at Isfahan University of Medical Sciences and its hospitals were studied in the years 2010 to 2011. Among medical software maintenance team members, 40 were selected as the sample. After interviews with experts in this field, the factors affecting maintenance cost were determined. To prioritize the factors using the Analytic Hierarchy Process (AHP), the measurement criteria (the identified factors) were first weighted by members of the maintenance team and then prioritized with the help of EC software. Based on the results of this study, 32 factors were obtained, which were classified into six groups. "Project" was ranked as the factor with the greatest effect on maintenance cost. By taking into account major elements such as careful feasibility studies for IT projects, full documentation, and keeping the designers involved in the maintenance phase, maintenance costs can be reduced and the longevity of the software increased.
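The AHP ranking step described above can be sketched briefly. The pairwise comparison matrix below is illustrative, not the study's data; the priority vector is the normalized principal eigenvector, which is the standard AHP weighting.

```python
import numpy as np

# Minimal AHP sketch: derive priority weights for cost factors from a
# pairwise comparison matrix (illustrative values, not the study's data).
# Entry A[i, j] says how much more factor i matters than factor j.
factors = ["project", "personnel", "documentation"]
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
weights = principal / principal.sum()

for name, w in sorted(zip(factors, weights), key=lambda t: -t[1]):
    print(f"{name}: {w:.3f}")
```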
Evaluation of Inventory Reduction Strategies: Balad Air Base Case Study
2012-03-01
produced by conducting individual simulations using a unique random seed generated by the default AnyLogic© random number generator. The...develops an agent-based simulation model of the sustainment supply chain supporting Balad AB during its closure using the software AnyLogic®. The...research. The goal of USAF Stockage Policy is to maximize customer support while minimizing inventory costs (DAF, 2011:1). USAF stocking decisions
NASA Astrophysics Data System (ADS)
Vijayashree, M.; Uthayakumar, R.
2017-09-01
Lead time is one of the major limits that affect planning at every stage of the supply chain system. In this paper, we study a continuous review inventory model in which ordering cost reductions depend on lead time. The study addresses a two-echelon supply chain problem consisting of a single vendor and a single buyer. Its main contribution is that the integrated total cost of the vendor-buyer system is analyzed under two different (linear and logarithmic) forms of lead-time-dependent ordering cost reduction. For each case, we develop an effective solution procedure, presented as an algorithm, for finding the optimal solution: the order quantity, ordering cost, lead time, and number of deliveries from the vendor to the buyer in one production run that together minimize the integrated total cost. Ordering cost reduction is the main aspect of the proposed model. The mathematical model is solved analytically by minimizing the integrated total cost, and numerical examples, solved using Matlab software, validate the model and illustrate the results. A sensitivity analysis with respect to the major parameters of the system is also included. The results reveal that the proposed integrated inventory model is well suited to supply chain manufacturing systems. Finally, graphical representations illustrate the proposed model, and a computer flowchart is included for each model.
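As a rough illustration of the kind of optimization described above (not the paper's actual model), the sketch below minimizes a stylized integrated cost in which the ordering cost falls logarithmically as lead time is crashed. All symbols and parameter values are assumptions invented for the example.

```python
import math

# Stylized integrated cost (illustrative only): ordering cost A(L) shrinks
# logarithmically as lead time L is crashed below the baseline L0, while
# crashing itself costs money per cycle. Parameters are assumptions.
D, h = 1000.0, 5.0            # annual demand, holding cost /unit/year
A0, L0 = 200.0, 8.0           # baseline ordering cost, baseline lead time (weeks)
beta, crash = 40.0, 15.0      # log reduction strength, crashing cost /week/cycle
sigma, k = 12.0, 1.65         # demand std dev per sqrt(week), safety factor

def total_cost(Q, L):
    A = max(20.0, A0 - beta * math.log(L0 / L))   # logarithmic ordering-cost reduction
    per_cycle = A + crash * (L0 - L)              # ordering + lead-time crashing
    return (D / Q) * per_cycle + h * Q / 2 + h * k * sigma * math.sqrt(L)

# Coarse grid search for the joint optimum (a stand-in for the paper's
# analytical solution procedure).
best = min((total_cost(Q, L), Q, L)
           for Q in range(50, 501, 10)
           for L in [1, 2, 3, 4, 5, 6, 7, 8])
print(f"cost={best[0]:.1f} at Q={best[1]}, L={best[2]} weeks")
```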
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
... 215, 234, 242, and 252 Defense Federal Acquisition Regulation Supplement; Cost and Software Data... Regulation Supplement (DFARS) to set forth DoD Cost and Software Data Reporting system requirements for major... set forth the DoD requirement for offerors to: Describe the standard Cost and Software Data Reporting...
NASA Astrophysics Data System (ADS)
Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.
2016-03-01
The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm: a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow roadmap speed and requirements. For example: managing defects and CD in a one-step algorithm, customizing algorithms and output features for each R&D team environment, and providing software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data, and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronics materials such as Directed Self-Assembly features. We will show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classification, CD and roughness measurements, with high-throughput performance compatible with HVM. In a second part, we will assess the reliability, the customization of algorithms, and the capability of the software platform to meet new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.
COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL
NASA Technical Reports Server (NTRS)
Roush, G. B.
1994-01-01
The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to any organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse-sensitive, with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo Professional 5.0 for recompilation. An executable is provided on the distribution diskettes. COSTMODL requires 512K RAM. The standard distribution medium for COSTMODL is three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. COSTMODL was developed in 1991. IBM PC is a registered trademark of International Business Machines. Borland and Turbo Pascal are registered trademarks of Borland International, Inc. Turbo Professional is a trademark of TurboPower Software. MS-DOS is a registered trademark of Microsoft Corporation.
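For reference, the Basic COCOMO relationships that COSTMODL incorporates among its five algorithms can be sketched as follows. The coefficients are the published Basic-model values; the example project size is arbitrary, and COSTMODL itself layers calibration and the other models on top of this form.

```python
# Basic COCOMO (Boehm, 1981): effort and schedule from size in KSLOC.
# mode: (a, b, c, d) with effort = a * KLOC**b (person-months) and
# schedule = c * effort**d (calendar months).
MODES = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = MODES[mode]
    effort = a * kloc ** b            # person-months
    schedule = c * effort ** d        # calendar months
    return effort, schedule, effort / schedule  # average staffing

effort, months, staff = basic_cocomo(32.0, "semi-detached")
print(f"{effort:.0f} person-months over {months:.1f} months (~{staff:.1f} staff)")
```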
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2012 CFR
2012-10-01
... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: Cost and Software Data...
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2014 CFR
2014-10-01
... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: Cost and Software Data...
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: COST AND SOFTWARE DATA...
48 CFR 252.234-7004 - Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2013 CFR
2013-10-01
... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Cost and Software Data... AND CONTRACT CLAUSES Text of Provisions And Clauses 252.234-7004 Cost and Software Data Reporting System. (NOV 2010) As prescribed in 234.7101(b)(1), use the following clause: Cost and Software Data...
Assuring Software Cost Estimates: Is it an Oxymoron?
NASA Technical Reports Server (NTRS)
Hihn, Jarius; Tregre, Grant
2013-01-01
The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper especially focuses on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.
Eleven quick tips for architecting biomedical informatics workflows with cloud computing.
Cole, Brian S; Moore, Jason H
2018-03-01
Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
How to Compare the Security Quality Requirements Engineering (SQUARE) Method with Other Methods
2007-08-01
Table-of-contents excerpt: Attack Trees for Modeling and Analysis; Misuse and Abuse Cases; Formal Methods (including Software Cost Reduction). ...modern or efficient techniques. • Requirements analysis typically is either not performed at all (identified requirements are directly specified without any analysis or modeling) or analysis is restricted to functional requirements and ignores quality requirements and other nonfunctional requirements
An approach to software cost estimation
NASA Technical Reports Server (NTRS)
Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.
1984-01-01
A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of resource estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
Clinical process analysis and activity-based costing at a heart center.
Ridderstolpe, Lisa; Johansson, Andreas; Skau, Tommy; Rutberg, Hans; Ahlfeldt, Hans
2002-08-01
Cost studies; productivity, efficiency, and quality-of-care measures; and the links between resources and patient outcomes are fundamental issues for hospital management today. This paper describes the implementation of a model for process analysis and activity-based costing (ABC)/management at a Heart Center in Sweden as a tool for administrative cost information, strategic decision-making, quality improvement, and cost reduction. A commercial software package (QPR) containing two interrelated parts, ProcessGuide and CostControl, was used. All processes at the Heart Center were mapped and graphically outlined. Processes and activities such as health care procedures, research, and education were identified together with their causal relationship to costs and products/services. The construction of the ABC model in CostControl was time-consuming. However, after the ABC/management system was created, it opened the way for new possibilities, including process and activity analysis, simulation, and price calculations. Cost analysis showed large variations in the costs obtained for individual patients undergoing coronary artery bypass grafting (CABG) surgery. We conclude that a process-based costing system is applicable and has the potential to be useful in hospital management.
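A minimal illustration of the ABC mechanics described above (not the QPR package's model): overhead is pooled per activity, converted to a rate via an activity driver, and charged to each patient case by consumption. All figures below are hypothetical.

```python
# Minimal activity-based costing sketch (hypothetical figures, not the
# Heart Center's data): pool costs per activity, divide by driver volume
# to get a rate, then charge each case for the drivers it consumes.
activity_cost = {"pre-op assessment": 90_000.0,
                 "surgery hour": 1_200_000.0,
                 "ICU day": 600_000.0}
driver_volume = {"pre-op assessment": 600,
                 "surgery hour": 2_000,
                 "ICU day": 1_500}

rates = {a: activity_cost[a] / driver_volume[a] for a in activity_cost}

def case_cost(consumption):
    """consumption: drivers used by one patient case (assessments, hours, days)."""
    return sum(rates[a] * q for a, q in consumption.items())

cabg = {"pre-op assessment": 1, "surgery hour": 4, "ICU day": 2}
print(f"CABG case cost: {case_cost(cabg):,.0f}")
```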
Florence, Curtis; Shepherd, Jonathan; Brennan, Iain; Simon, Thomas R
2014-04-01
To assess the costs and benefits of a partnership between health services, police and local government shown to reduce violence-related injury. Benefit-cost analysis. Anonymised information sharing and use led to a reduction in wounding recorded by the police that reduced the economic and social costs of violence by £6.9 million in 2007 compared with the costs the intervention city, Cardiff UK, would have experienced in the absence of the programme. This includes a gross cost reduction of £1.25 million to the health service and £1.62 million to the criminal justice system in 2007. By contrast, the costs associated with the programme were modest: setup costs of software modifications and prevention strategies were £107 769, while the annual operating costs of the system were estimated as £210 433 (2003 UK pound). The cumulative social benefit-cost ratio of the programme from 2003 to 2007 was £82 in benefits for each pound spent on the programme, including a benefit-cost ratio of 14.8 for the health service and 19.1 for the criminal justice system. Each of these benefit-cost ratios is above 1 across a wide range of sensitivity analyses. An effective information-sharing partnership between health services, police and local government in Cardiff, UK, led to substantial cost savings for the health service and the criminal justice system compared with 14 other cities in England and Wales designated as similar by the UK government where this intervention was not implemented.
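The benefit-cost arithmetic can be sketched with the cost figures quoted in the abstract. The year-by-year benefit stream below is an assumption (only the 2007 benefit is given), so the printed ratio is illustrative rather than a reproduction of the study's £82 result.

```python
# Benefit-cost ratio sketch using the cost figures quoted above; the
# annual benefit stream is assumed (the abstract reports only 2007's
# 6.9M GBP reduction), so the result is illustrative only.
setup_cost = 107_769.0          # software modifications + prevention strategies
annual_operating = 210_433.0    # per year, 2003 GBP
years = [2003, 2004, 2005, 2006, 2007]

assumed_benefits = {2003: 1.0e6, 2004: 2.5e6, 2005: 4.0e6,
                    2006: 5.5e6, 2007: 6.9e6}   # GBP, invented except 2007

total_cost = setup_cost + annual_operating * len(years)
total_benefit = sum(assumed_benefits[y] for y in years)
print(f"benefit-cost ratio = {total_benefit / total_cost:.1f} : 1")
```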
The cost of software fault tolerance
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1982-01-01
The proposed use of software fault tolerance techniques as a means of reducing software costs in avionics and as a means of addressing the issue of system unreliability due to faults in software is examined. A model is developed to provide a view of the relationships among cost, redundancy, and reliability which suggests strategies for software development and maintenance which are not conventional.
Akar, Zeynep; Küçük, Murat; Doğan, Hacer
2017-12-01
2,2-Diphenyl-1-picrylhydrazyl (DPPH•) radical scavenging, the most commonly used antioxidant method, cited in more than seventeen thousand articles, is very practical; however, as with most assays, it has the major disadvantage of dependence on a spectrophotometer. To overcome this drawback, a colorimetric determination of antioxidant activity using a scanner and the freely available ImageJ software was developed. In this new method, mixtures of DPPH• solution and standard antioxidants or extracts of common medicinal herbs were dropped onto TLC plates after an incubation period. The spot images were evaluated with ImageJ software to determine CSC50 values, the sample concentrations providing 50% colour reduction, which were very similar to the SC50 values obtained with the spectrophotometric method. The advantages of the new method are the use of lower amounts of reagents and solvents, no need for costly spectrophotometers, and thus significantly lowered costs, and convenient implementation in any environment and situation.
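The CSC50 computation lends itself to a short sketch: given mean spot intensities measured in ImageJ, percent colour reduction is computed against a DPPH•-only control spot and CSC50 is found by interpolation. The intensity values below are invented, not measured data.

```python
import numpy as np

# Sketch of the scanner/ImageJ evaluation: percent colour reduction per
# concentration, then CSC50 by linear interpolation. All numbers are
# invented for illustration.
concs = np.array([5.0, 10.0, 20.0, 40.0, 80.0])      # sample conc., ug/mL
spot = np.array([210.0, 185.0, 150.0, 110.0, 70.0])  # mean spot intensity
control, blank = 230.0, 40.0   # DPPH-only spot and fully bleached spot

reduction = 100.0 * (control - spot) / (control - blank)  # % colour reduction
csc50 = np.interp(50.0, reduction, concs)  # reduction is increasing, as required
print(f"CSC50 = {csc50:.1f} ug/mL")
```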
NASA Astrophysics Data System (ADS)
Lou, Kuo-Ren; Wang, Lu
2016-05-01
The seller frequently offers the buyer trade credit to settle the purchase amount. From the seller's perspective, granting trade credit increases not only the opportunity cost (i.e., the interest loss on the buyer's purchase amount during the credit period) but also the default risk (i.e., the rate at which the buyer will be unable to pay off his/her debt obligations). On the other hand, granting trade credit increases sales volume and revenue. Consequently, trade credit is an important strategy to increase the seller's profitability. In this paper, we assume that the seller uses trade credit and the number of shipments in a production run as decision variables to maximise his/her profit, while the buyer determines his/her replenishment cycle time and capital investment as decision variables to reduce his/her ordering cost and achieve his/her maximum profit. We then derive the non-cooperative Nash solution and the cooperative integrated solution in a just-in-time inventory system, in which granting trade credit increases not only the demand but also the opportunity cost and default risk, and the relationship between the capital investment and the ordering cost reduction is logarithmic. We then use software to solve and compare these two distinct solutions. Finally, we use sensitivity analysis to obtain some managerial insights.
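The logarithmic investment/ordering-cost relationship mentioned above is commonly written in the Porteus form I(A) = B ln(A0/A). As a hedged illustration only (not this paper's full trade-credit model), the sketch below trades amortized investment against ordering-cost savings with invented parameters.

```python
import math

# Porteus-style logarithmic relation between capital investment I and the
# reduced ordering cost A: I(A) = B * ln(A0 / A), for 0 < A <= A0.
# Parameter values are invented for illustration.
D, h = 2400.0, 4.0       # annual demand, holding cost /unit/year
A0, B = 300.0, 120.0     # initial ordering cost, investment scale
theta = 0.15             # annual opportunity cost of capital

def annual_cost(A):
    Q = math.sqrt(2 * D * A / h)                 # EOQ for ordering cost A
    invest = theta * B * math.log(A0 / A)        # amortized investment
    return (D / Q) * A + h * Q / 2 + invest

best_A = min((a / 10 for a in range(10, 3001)), key=annual_cost)
print(f"optimal ordering cost = {best_A:.1f}, annual cost = {annual_cost(best_A):.1f}")
```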
ERIC Educational Resources Information Center
Thankachan, Briju; Moore, David Richard
2017-01-01
The use of Free and Open Source Software (FOSS), a subset of Information and Communication Technology (ICT), can reduce the cost of purchasing software. Despite the benefit in the initial purchase price of software, deploying software requires total cost that goes beyond the initial purchase price. Total cost is a silent issue of FOSS and can only…
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to the performance of COCOMO II, linear regression, and K-nearest-neighbor prediction models on the same data set.
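The analogy/clustering idea can be illustrated with a minimal k-nearest-neighbor estimator over a toy project table. The feature set, scaling choice, and data below are invented, not the NASA model or its data set.

```python
import numpy as np

# Toy analogy-based (k-NN) effort estimator: find the k most similar past
# projects in normalized feature space and average their actual efforts.
#                  KSLOC, team size, heritage (0-1); all values invented
past_X = np.array([[ 20,  5, 0.8], [120, 20, 0.3], [ 60, 10, 0.5],
                   [200, 35, 0.2], [ 15,  4, 0.9], [ 90, 15, 0.4]], float)
past_effort = np.array([80, 900, 350, 1800, 55, 600], float)  # person-months

def knn_estimate(x_new, k=3):
    lo, hi = past_X.min(axis=0), past_X.max(axis=0)
    scale = np.where(hi > lo, hi - lo, 1.0)        # min-max normalization
    d = np.linalg.norm((past_X - x_new) / scale, axis=1)
    nearest = np.argsort(d)[:k]                    # k closest analogies
    return past_effort[nearest].mean()

print(f"estimated effort: {knn_estimate(np.array([70, 12, 0.5])):.0f} person-months")
```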
Designing informed game-based rehabilitation tasks leveraging advances in virtual reality.
Lange, Belinda; Koenig, Sebastian; Chang, Chien-Yen; McConnell, Eric; Suma, Evan; Bolas, Mark; Rizzo, Albert
2012-01-01
This paper details a brief history and rationale for the use of virtual reality (VR) technology for clinical research and intervention, and then focuses on game-based VR applications in the area of rehabilitation. An analysis of the match between rehabilitation task requirements and the assets available with VR technology is presented. Low-cost camera-based systems capable of tracking user behavior at sufficient levels for game-based virtual rehabilitation activities are currently available for in-home use. Authoring software is now being developed that aims to provide clinicians with a usable toolkit for leveraging this technology. This will facilitate informed professional input on software design, development and application to ensure safe and effective use in the rehabilitation context. The field of rehabilitation generally stands to benefit from the continual advances in VR technology, concomitant system cost reductions and an expanding clinical research literature and knowledge base. Home-based activity within VR systems that are low-cost, easy to deploy and maintain, and meet the requirements for "good" interactive rehabilitation tasks could radically improve users' access to care, adherence to prescribed training and subsequently enhance functional activity in everyday life in clinical populations.
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
2006-02-01
technology for cost and risk reduction of products, software, and processes; long-term, multi-Service needs; and disruptive technologies, both...initiatives and for disruptive technologies, the Office of the Secretary of Defense (OSD) can better promote the importance and value of the program...multi-Service programs, research in "disruptive" technologies, and SBIR programs. Balance current, near-term, and future needs as well as small and
Impact of Agile Software Development Model on Software Maintainability
ERIC Educational Resources Information Center
Gawali, Ajay R.
2012-01-01
Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…
Effort Drivers Estimation for Brazilian Geographically Distributed Software Development
NASA Astrophysics Data System (ADS)
Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio
To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams for reasons of cost reduction and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and have possible differences in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis shows drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.
Software Solution Saves Dollars
ERIC Educational Resources Information Center
Trotter, Andrew
2004-01-01
This article discusses computer software that can give classrooms and computer labs the capabilities of costly PCs at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…
2010-09-01
NNWC) was used to calculate major cost components—labor, hardware, software, and transport, while a VMware tool was used to calculate power and...cooling costs for both solutions. In addition, VMware provided a cost estimate for the upfront hardware and software licensing costs needed to support...cost per seat (CPS) model developed by Naval Network Warfare Command (NNWC) was used to calculate major cost components—labor, hardware, software, and
NASA Technical Reports Server (NTRS)
Hops, J. M.; Sherif, J. S.
1994-01-01
A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of the expected cost of software maintenance, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
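Ranking modules by a relative complexity score can be sketched generically. The metric definition below (a weighted combination of z-scored raw metrics) and all module data are stand-ins for the article's parsimonious models, not its exact formulation.

```python
import statistics as st

# Rank modules by a relative complexity score: z-score each raw metric
# across the system, then combine with weights. Weights, metrics, and
# module data are invented for illustration.
modules = {
    "telemetry": {"sloc": 1200, "cyclomatic": 35, "fan_out": 12},
    "guidance":  {"sloc": 4500, "cyclomatic": 90, "fan_out": 25},
    "ui":        {"sloc": 800,  "cyclomatic": 20, "fan_out": 6},
    "scheduler": {"sloc": 2100, "cyclomatic": 60, "fan_out": 18},
}
weights = {"sloc": 0.3, "cyclomatic": 0.5, "fan_out": 0.2}

def zscores(metric):
    vals = [m[metric] for m in modules.values()]
    mu, sd = st.mean(vals), st.pstdev(vals)
    return {name: (m[metric] - mu) / sd for name, m in modules.items()}

z = {metric: zscores(metric) for metric in weights}
score = {name: sum(weights[k] * z[k][name] for k in weights) for name in modules}

for name, s in sorted(score.items(), key=lambda t: -t[1]):
    print(f"{name:10s} relative complexity {s:+.2f}")  # top = most maintenance-prone
```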
Spacelab experiment computer study. Volume 1: Executive summary (presentation)
NASA Technical Reports Server (NTRS)
Lewis, J. L.; Hodges, B. C.; Christy, J. O.
1976-01-01
A quantitative cost for various Spacelab flight hardware configurations is provided along with varied software development options. A cost analysis of Spacelab computer hardware and software is presented, based on utilization of a central experiment computer with optional auxiliary equipment. The ground rules and assumptions used in deriving the costing methods for all options in the Spacelab experiment study are presented and analyzed, and the options, along with their cost considerations, are discussed. It is concluded that Spacelab program cost for software development and maintenance is independent of experimental hardware and software options, that the distributed standard computer concept simplifies software integration without a significant increase in cost, and that decisions on flight computer hardware configurations should not be made until payload selection for a given mission and a detailed analysis of the mission requirements are completed.
New Design for Rapid Prototyping of Digital Master Casts for Multiple Dental Implant Restorations
Romero, Luis; Jiménez, Mariano; Espinosa, María del Mar; Domínguez, Manuel
2015-01-01
Aim: This study proposes the replacement of all the physical devices used in the manufacturing of conventional prostheses with digital tools, such as 3D scanners, CAD design software, 3D implant files, rapid prototyping machines, and reverse engineering software, in order to develop laboratory working models from which to finish coatings for dental prostheses. Different types of dental prosthetic structures are used, which were adjusted by a non-rotatory threaded fixing system. Method: Through a digital process, the relative positions of the dental implants, soft tissue, and adjacent teeth of edentulous or partially edentulous patients have been captured, and a master working model that accurately replicates data relating to the patient's oral cavity has been produced through the processing of three-dimensional digital data. Results: Compared with the conventional master cast, the results show significant cost savings in attachments, as well as an increase in the quality of reproduction and accuracy of the master cast, with a consequent reduction in the number of patient consultation visits. The combination of software and hardware three-dimensional tools allows the optimization of the planning protocol for dental implant-supported rehabilitations, improving the predictability of clinical treatments and reducing the production cost of master casts for restorations upon implants. PMID:26696528
Manned/Unmanned Common Architecture Program (MCAP) net centric flight tests
NASA Astrophysics Data System (ADS)
Johnson, Dale
2009-04-01
Properly architected avionics systems can reduce the costs of periodic functional improvements, maintenance, and obsolescence. With this in mind, the U.S. Army Aviation Applied Technology Directorate (AATD) initiated the Manned/Unmanned Common Architecture Program (MCAP) in 2003 to develop an affordable, high-performance embedded mission processing architecture for potential application to multiple aviation platforms. MCAP analyzed Army helicopter and unmanned air vehicle (UAV) missions, identified supporting subsystems, surveyed advanced hardware and software technologies, and defined computational infrastructure technical requirements. The project selected a set of modular open systems standards and market-driven commercial-off-the-shelf (COTS) electronics and software, and developed experimental mission processors, network architectures, and software infrastructures supporting the integration of new capabilities, interoperability, and life cycle cost reductions. MCAP integrated the new mission processing architecture into an AH-64D Apache Longbow and participated in Future Combat Systems (FCS) network-centric operations field experiments in 2006 and 2007 at White Sands Missile Range (WSMR), New Mexico and at the Nevada Test and Training Range (NTTR) in 2008. The MCAP Apache also participated in PM C4ISR On-the-Move (OTM) Capstone Experiments 2007 (E07) and 2008 (E08) at Ft. Dix, NJ and conducted Mesa, Arizona local area flight tests in December 2005, February 2006, and June 2008.
The costs of avoiding environmental impacts from shale-gas surface infrastructure.
Milt, Austin W; Gagnolet, Tamara D; Armsworth, Paul R
2016-12-01
Growing energy demand has increased the need to manage conflicts between energy production and the environment. As an example, shale-gas extraction requires substantial surface infrastructure, which fragments habitats, erodes soils, degrades freshwater systems, and displaces rare species. Strategic planning of shale-gas infrastructure can reduce trade-offs between economic and environmental objectives, but the specific nature of these trade-offs is not known. We estimated the cost of avoiding impacts from land-use change on forests, wetlands, rare species, and streams from shale-energy development within leaseholds. We created software for optimally siting shale-gas surface infrastructure to minimize its environmental impacts at reasonable construction cost. We visually assessed sites before infrastructure optimization to test whether such inspection could be used to predict whether impacts could be avoided at the site. On average, up to 38% of aggregate environmental impacts of infrastructure could be avoided for 20% greater development costs by spatially optimizing infrastructure. However, we found trade-offs between environmental impacts and costs among sites. In visual inspections, we often distinguished between sites that could be developed to avoid impacts at relatively low cost (29%) and those that could not (20%). Reductions in a metric of aggregate environmental impact could be largely attributed to potential displacement of rare species, sedimentation, and forest fragmentation. Planners and regulators can estimate and use heterogeneous trade-offs among development sites to create industry-wide improvements in environmental performance and do so at reasonable costs by, for example, leveraging low-cost avoidance of impacts at some sites to offset others. This could require substantial effort, but the results and software we provide can facilitate the process. © 2016 Society for Conservation Biology.
Software Cost-Estimation Model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1985-01-01
Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.
Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frothingham, David; Barker, Michelle; Buechi, Steve
2013-07-01
Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective of further delineating the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base soil volume. This refinement resulted in a $64.3M reduction in the estimated project cost-to-complete, through reductions in the uncertainty of the contaminated soil volume estimate and the associated contingency costs. (authors)
Custom hip prostheses by integrating CAD and casting technology
NASA Astrophysics Data System (ADS)
Silva, Pedro F.; Leal, Nuno; Neto, Rui J.; Lino, F. Jorge; Reis, Ana
2012-09-01
Total Hip Arthroplasty (THA) is a surgical intervention that has been achieving high rates of success, leaving room for research on long-run durability, patient comfort, and cost reduction. Even so, up to the present, little research has been done to improve the method of manufacturing customized prostheses. Common customized prostheses are made by full machining. This document presents a different methodology, which combines the study of medical images through CAD (Computer Aided Design) software, SL additive manufacturing, ceramic shell manufacture, precision foundry with titanium alloys, and Computer Aided Manufacturing (CAM). The goal is to achieve the best comfort for the patient, the best stress distribution, and the maximum lifetime of the prosthesis produced by this integrated methodology. The way to achieve this desideratum is to make custom hip prostheses which are adapted to each patient's needs and natural physiognomy. Not only is the process reliable, but it also represents a cost reduction compared to conventional fully machined custom hip prostheses.
Code of Federal Regulations, 2010 CFR
2010-04-01
... computer hardware or software, or both, the cost of contracting for those services, or the cost of... operating budget. At the HA's option, the cost of the computer software may include service contracts to...
48 CFR 1845.7101-3 - Unit acquisition cost.
Code of Federal Regulations, 2010 CFR
2010-10-01
... services for designs, plans, specifications, and surveys. (6) Acquisition and preparation costs of... acquisition cost is under $100,000, it shall be reported as under $100,000. (g) Software acquisition costs include software costs incurred up through acceptance testing and material internal costs incurred to...
ERIC Educational Resources Information Center
Mitchell, Susan Marie
2012-01-01
Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…
An experience of qualified preventive screening: shiraz smart screening software.
Islami Parkoohi, Parisa; Zare, Hashem; Abdollahifard, Gholamreza
2015-01-01
Computerized preventive screening software is a cost-effective intervention tool to address non-communicable chronic diseases. Shiraz Smart Screening Software (SSSS) was developed as an innovative tool for qualified screening. It allows simultaneous smart screening of several high-burden chronic diseases and supports reminder notification functionality. The extent to which SSSS affects screening quality is also described. Following software development, preventive screening and annual health examinations of 261 school staff (Medical School of Shiraz, Iran) were carried out in a software-assisted manner. To evaluate the quality of the software-assisted screening, we used a quasi-experimental study design and determined coverage, irregular attendance, and inappropriateness proportions for both the manual and software-assisted screening, as well as the corresponding number of requested tests. With the manual screening method, 27% of employees were covered (with 94% irregular attendance), while with software-assisted screening the coverage proportion was 79% (attendance status will become clear after the specified time). The frequency of inappropriate screening test requests before the software implementation was 41.37% for fasting plasma glucose, 41.37% for lipid profile, 0.84% for occult blood, 0.19% for flexible sigmoidoscopy/colonoscopy, 35.29% for Pap smear, 19.20% for mammography, and 11.2% for prostate-specific antigen. All of the above were corrected by the software application. In total, 366 manual screening tests and 334 software-assisted screening tests were requested. SSSS is an innovative tool to improve the quality of preventive screening plans in terms of increased screening coverage and reductions in inappropriateness and the total number of requested tests.
DOT National Transportation Integrated Search
2013-04-01
The Rural Road Upgrade Inventory and Cost Estimation Software is designed by the AUTC : research team to help the Fairbanks North Star Borough (FNSB) estimate the cost of upgrading : rural roads located in the Borough's Service Areas. The Software pe...
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia; Holmes, Bruce J.; Hahn, Andrew S.
2016-01-01
We report on an examination of potential benefits of infusing wireless technologies into various areas of aircraft and airspace operations. The analysis is done in support of a NASA seedling project Efficient Reconfigurable Cockpit Design and Fleet Operations Using Software Intensive, Network Enabled Wireless Architecture (ECON). The study has two objectives. First, we investigate one of the main benefit hypotheses of the ECON proposal: that the replacement of wired technologies with wireless would lead to significant weight reductions on an aircraft, among other benefits. Second, we advance a list of wireless technology applications and discuss their system benefits. With regard to the primary hypothesis, we conclude that the promise of weight reduction is premature. Specificity of the system domain and aircraft, criticality of components, reliability of wireless technologies, the weight of replacement or augmentation equipment, and the cost of infusion must all be taken into account among other considerations, to produce a reliable estimate of weight savings or increase.
ISS Operations Cost Reductions Through Automation of Real-Time Planning Tasks
NASA Technical Reports Server (NTRS)
Hall, Timothy A.
2011-01-01
In 2008 the Johnson Space Center's Mission Operations Directorate (MOD) management team challenged their organization to find ways to reduce the costs of International Space Station (ISS) console operations in the Mission Control Center (MCC). Each MOD organization was asked to identify projects that would help them attain a goal of a 30% reduction in operating costs by 2012. The MOD Operations and Planning organization responded to this challenge by launching several software automation projects that would allow them to greatly improve ISS console operations and reduce staffing and operating costs. These projects have to date allowed the MOD Operations organization to remove one full-time (7 x 24 x 365) ISS console position in 2010, with the plan of eliminating two full-time ISS console support positions by 2012. This will account for an overall 10 EP reduction in staffing for the Operations and Planning organization. These automation projects focused on utilizing software to automate many administrative and often repetitive tasks involved with processing ISS planning and daily operations information. This information was exchanged between the ground flight control teams in Houston and around the globe, as well as with the ISS astronaut crew. These tasks ranged from managing mission plan changes from around the globe, to uploading and downloading information to and from the ISS crew, to even more complex tasks that required multiple decision points to process the data, track approvals, and deliver it to the correct recipient across network and security boundaries. The software solutions leveraged several different technologies, including customized web applications and an implementation of industry-standard web services architecture between several planning tools, as well as engaging a previously research-level technology (TRL 2-3) developed by Ames Research Center (ARC) that utilized an intelligent-agent-based system to manage and automate file traffic flow, archiving of data, and generation of console logs. This technology, called OCAMS (OCA (Orbital Communication System) Management System), is now considered TRL 9 and is in daily use in the Mission Control Center in support of ISS operations. These solutions have not only allowed for improved efficiency on console; since many of the previously manual data transfers are now automated, many of the human-error-prone steps have been removed, and the quality of the planning products has improved tremendously. This has also allowed Planning Flight Controllers more time to focus on the abstract areas of the job (like the complexities of planning a mission for 6 international crew members with a global planning team), instead of being burdened with the administrative tasks that took significant time each console shift to process. The resulting automation solutions have allowed the Operations and Planning organization to realize significant cost savings for the ISS program through 2020, and many of these solutions could be a viable
Development of Methodology for Programming Autonomous Agents
NASA Technical Reports Server (NTRS)
Erol, Kutluhan; Levy, Renato; Lang, Lun
2004-01-01
A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to specify such interfaces unambiguously.
Software Support for Transiently Powered Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Der Woude, Joel Matthew
With the continued reduction in the size and cost of computing, power becomes an increasingly heavy burden on system designers for embedded applications. While energy harvesting techniques are an increasingly desirable solution for many deeply embedded applications where size and lifetime are a priority, previous work has shown that energy harvesting provides insufficient power for long-running computation. We present Ratchet, which to the authors' knowledge is the first automatic, software-only checkpointing system for energy harvesting platforms. We show that Ratchet provides a means to extend computation across power cycles consistent with those experienced by energy harvesting devices. We demonstrate the correctness of our system under frequent failures and show that it has an average overhead of 58.9% across a suite of benchmarks representative of embedded applications.
Component Prioritization Schema for Achieving Maximum Time and Cost Benefits from Software Testing
NASA Astrophysics Data System (ADS)
Srivastava, Praveen Ranjan; Pareek, Deepak
Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results. Defining the end of software testing is a crucial part of any software development project. A premature release involves risks such as undetected bugs, the cost of fixing faults later, and discontented customers. Any software organization wants to achieve the maximum possible benefit from software testing with minimum resources. Testing time and cost need to be optimized to achieve a competitive edge in the market. In this paper, we propose a schema, called the Component Prioritization Schema (CPS), to achieve an effective and uniform prioritization of software components. This schema serves as an extension to the Non-Homogeneous Poisson Process (NHPP) based Cumulative Priority Model. We also introduce an approach for handling time-intensive versus cost-intensive projects.
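For orientation, the following is a minimal sketch of the Goel-Okumoto form of an NHPP reliability-growth model, the model family underlying the Cumulative Priority Model; the parameters a and b are hypothetical, and the CPS schema itself is not reproduced here:

```python
import math

# Goel-Okumoto NHPP: the expected number of defects found by time t is
# m(t) = a * (1 - exp(-b t)), where a is the total expected defect count
# and b the detection rate. Both values here are hypothetical.

def expected_defects_found(t, a, b):
    """Mean value function m(t) = a * (1 - exp(-b t))."""
    return a * (1.0 - math.exp(-b * t))

def release_time(a, b, remaining_target):
    """Earliest t at which expected remaining defects a*exp(-b t)
    falls below remaining_target."""
    return math.log(a / remaining_target) / b

a, b = 120.0, 0.05                                  # hypothetical calibration
print(expected_defects_found(40.0, a, b))           # ~103.8 defects by t = 40
print(release_time(a, b, remaining_target=5.0))     # ~63.6 time units
```

A release-time criterion of this kind is one way testing time and cost get traded against residual defect risk.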
Current state and future direction of computer systems at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Rogers, James L. (Editor); Tucker, Jerry H. (Editor)
1992-01-01
Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost-performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.
Benefit Evaluation of Implementing BIM in Construction Projects
NASA Astrophysics Data System (ADS)
Chou, Hui-Yu; Chen, Pei-Yu
2017-10-01
Since 2014, public construction projects in Taiwan have progressively undertaken steps to promote the use of Building Information Modelling (BIM) technology; the use of BIM has therefore become a necessity for contractors. However, issues such as the high upfront costs of software and hardware setup and BIM user training, combined with the difficulties of incorporating BIM into existing workflow operations and management systems, remain a challenge to contractors. Consequently, the benefits stemming from BIM implementation will in turn affect the activeness and enthusiasm of contractors in implementing BIM. While previous studies abroad have calculated and numerically quantified the benefits of BIM implementation, a benefit evaluation index must take regional industry practices and characteristics into consideration. This study established a benefit evaluation index and method for the implementation of BIM suitable for contractors in Taiwan. The three principal indexes are: (1) RCR, the effect of reducing costs associated with rework; (2) SDR & DPR, the effects of mitigating delays that occur due to construction interface coordination or rework, and of reducing the penalty costs associated with overdue delivery; (3) AQE, the effect of improving the ability to estimate the amounts of building materials and resources. This study also performed a benefit evaluation calculation for a real-world construction project using the first two established indexes. The results showed a 0.16% reduction in rework costs, a 6.49% reduction in delays from construction interface coordination or rework, and a 5.0% reduction in penalty costs associated with overdue deliveries. The results demonstrated the applicability of the established benefit evaluation index to real-world construction projects.
32 CFR 37.550 - May I accept intellectual property as cost sharing?
Code of Federal Regulations, 2013 CFR
2013-07-01
... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for...
32 CFR 37.550 - May I accept intellectual property as cost sharing?
Code of Federal Regulations, 2011 CFR
2011-07-01
... offer the use of commercially available software for which there is an established license fee for use of the product. The costs of the development of the software would not be a reasonable basis for... software) as cost sharing, because: (1) It is difficult to assign values to these intangible contributions...
Development of the ASTRI heliostat
NASA Astrophysics Data System (ADS)
Coventry, Joe; Arjomandi, Maziar; Barry, John; Blanco, Manuel; Burgess, Greg; Campbell, Jonathan; Connor, Phil; Emes, Matthew; Fairman, Philip; Farrant, David; Ghanadi, Farzin; Grigoriev, Victor; Hall, Colin; Koltun, Paul; Lewis, David; Martin, Scott; Nathan, Graham; Pye, John; Qiu, Ang; Stuart, Wayne; Tang, Youhong; Venn, Felix; Yu, Jeremy
2016-05-01
The Australian Solar Thermal Research Initiative (ASTRI) aims to develop a high optical quality heliostat with a target cost (manufactured, installed, and operational) of 90 AUD/m2. Three different heliostat design concepts are described, each with features identified during a prior scoping study as having the potential to contribute to cost reduction compared to the current state of the art. The three concepts being developed will be down-selected to a single concept for testing in late 2016. The heliostat concept development work is supported by technology development streams that are developing novel sandwich panel mirror facet structures, analysing and testing wind loads on heliostats in both stow and operating positions, and developing new heliostat field layouts and software tools for optical analysis of heliostat design concepts.
RiskScape: a new tool for comparing risk from natural hazards (Invited)
NASA Astrophysics Data System (ADS)
Stirling, M. W.; King, A.
2010-12-01
RiskScape is a New Zealand joint venture between GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard risk and impact analysis. It has basic GIS functionality, with import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcano (ash), tsunami, and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example, damage and replacement costs, casualties, economic losses, disruption, and the number of people affected. It can therefore be used to assist with risk management, land use planning, building codes and design, risk identification, prioritization of risk reduction and mitigation, determination of “best use” risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions, each exposed to a different mix of natural hazards, have been used to develop and trial RiskScape in New Zealand. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape development has thus far focused on scenario-based risk, future development will advance the software into providing probabilistic solutions.
Knowledge-based assistance in costing the space station DMS
NASA Technical Reports Server (NTRS)
Henson, Troy; Rone, Kyle
1988-01-01
The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
Guidance and Control Software,
1980-05-01
commitments of function, cost, and schedule. The phrase "software engineering" was intended to contrast with the phrase "computer science"; the latter aims... the software problems of cost, delivery schedule, and quality were gradually being recognized at the highest management levels. Thus, in a project... schedule dates. Although the analysis of software problems indicated that the entire software development process (figure 1) needed new methods, only
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2012 CFR
2012-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: Notice of Cost and Software Data Reporting System (NOV 2010) (a) This solicitation includes— (1) The...
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2014 CFR
2014-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: Notice of Cost and Software Data Reporting System (NOV 2010) (a) This solicitation includes— (1) The...
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2011 CFR
2011-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: NOTICE OF COST AND SOFTWARE DATA REPORTING SYSTEM (NOV 2010) (a) This solicitation includes— (1) The...
48 CFR 252.234-7003 - Notice of Cost and Software Data Reporting System. (NOV 2010)
Code of Federal Regulations, 2013 CFR
2013-10-01
... Software Data Reporting System. (NOV 2010) 252.234-7003 Section 252.234-7003 Federal Acquisition... Software Data Reporting System. (NOV 2010) As prescribed in 234-7101(a)(1), use the following provision: Notice of Cost and Software Data Reporting System (NOV 2010) (a) This solicitation includes— (1) The...
Software Cuts Homebuilding Costs, Increases Energy Efficiency
NASA Technical Reports Server (NTRS)
2015-01-01
To sort out the best combinations of technologies for a crewed mission to Mars, NASA Headquarters awarded grants to MIT's Department of Aeronautics and Astronautics to develop an algorithm-based software tool that highlights the most reliable and cost-effective options. Utilizing the software, Professor Edward Crawley founded Cambridge, Massachusetts-based Ekotrope, which helps homebuilders choose cost- and energy-efficient floor plans and materials.
Using SCR methods to analyze requirements documentation
NASA Technical Reports Server (NTRS)
Callahan, John; Morrison, Jeffery
1995-01-01
Software Cost Reduction (SCR) methods are being utilized to analyze and verify selected parts of NASA's EOS-DIS Core System (ECS) requirements documentation. SCR is being used as a spot-inspection tool. This formal and systematic application of the SCR requirements methods yields insights into whether the requirements are internally inconsistent or incomplete as the scenarios of intended usage evolve in the OC (Operations Concept) documentation. Thus, by modelling the scenarios and requirements as mode charts using the SCR methods, we have been able to identify problems within and between the documents.
Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.
Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh
2018-01-01
Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with user needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire and analyzed using SPSS version 16.0. In the staff's view, advantages of the coding software such as reduced coding time had the highest average rating (mean = 3.82), while cost reduction had the lowest (mean = 3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of using coding software. In general, the results of this study showed that coding software has deficiencies in some cases. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, help in selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the ability to provide brief and detailed reports in different formats.
Modelling Electrical Energy Consumption in Automotive Paint Shop
NASA Astrophysics Data System (ADS)
Oktaviandri, Muchamad; Safiee, Aidil Shafiza Bin
2018-03-01
Industry players are seeking ways to reduce operational costs to survive a challenging economic trend. One key aspect is energy cost reduction. However, implementing an energy reduction strategy often runs into obstructions that slow down its realization. Discrete event simulation is an approach actively discussed in current research for overcoming such obstructions because of its flexibility and comprehensiveness. In the automotive industry, the paint shop is considered the largest energy consumer, reported to consume about 50%-70% of an automotive plant's overall consumption. Hence, this project aims at providing a tool to model and simulate energy consumption in the paint shop area through a case study at XYZ Company, an automotive company located at Pekan, Pahang. The simulation model was developed using Tecnomatix Plant Simulation software version 13. After validation against the real system, the model was accurate to within ±5% for energy consumption and ±15% for maximum demand. Two energy saving scenarios were tested. Scenario 1 was based on a production scheduling approach under low demand and yielded energy savings of up to 30% of consumption. Scenario 2 was based on substituting a high-power compressor with a lower-power one; the results were an energy consumption saving of approximately 1.42% and a maximum demand reduction of about 1.27%. This approach would help managers and engineers justify the investment needed to implement the reduction strategies.
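A worked version of the scenario arithmetic, where only the reported percentages come from the study and the baseline load is a hypothetical stand-in:

```python
# Worked example of the scenario comparison: percentage savings relative to
# the baseline simulated consumption. The baseline kWh figure is
# hypothetical; only the reported percentages come from the text.

baseline_kwh = 1_000_000.0                 # hypothetical monthly paint-shop load
scenario1 = baseline_kwh * (1 - 0.30)      # scheduling under low demand
scenario2 = baseline_kwh * (1 - 0.0142)    # lower-power compressor substitution

for name, kwh in [("baseline", baseline_kwh),
                  ("scenario 1", scenario1),
                  ("scenario 2", scenario2)]:
    print(f"{name}: {kwh:,.0f} kWh ({(1 - kwh / baseline_kwh):.1%} saved)")
```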
Incorporating cost-benefit analyses into software assurance planning
NASA Technical Reports Server (NTRS)
Feather, M. S.; Sigal, B.; Cornford, S. L.; Hutchinson, P.
2001-01-01
The objective is to use cost-benefit analyses to identify, for a given project, optimal sets of software assurance activities. Towards this end we have incorporated cost-benefit calculations into a risk management framework.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
Code of Federal Regulations, 2014 CFR
2014-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
Code of Federal Regulations, 2012 CFR
2012-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
Code of Federal Regulations, 2011 CFR
2011-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
Code of Federal Regulations, 2013 CFR
2013-10-01
... DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION Cost and Software Data Reporting 234.7100 Policy. (a) The cost and software data reporting (CSDR) requirement is mandatory for major defense... data reporting and software resources data reporting. (b) Prior to contract award, contracting officers...
1986-09-01
point here Is that the capital cost of design and development (including the cost of software tools and/or CAD/CAM programs which aided in the development...and capitalization , software Is in many ways more Ike a hardware component than it is Ike the tech- nical documentation which supports the hardware...Invoked, the owner of intelectual property rights in software may attach appropriate copyright notices to software delivered under this contract. 2.2.2
Cost benefits of advanced software: A review of methodology used at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla N.
1993-01-01
To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process, which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that a vicious circle is operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC break out of this cycle.
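One building block of such a formal methodology is an explicit discounted cash-flow comparison; the sketch below, with hypothetical cash flows and discount rate, shows the net-present-value calculation an explicit cost-benefit study would rest on:

```python
# Minimal net-present-value sketch of a multi-year software investment.
# All cash flows and the discount rate are hypothetical.

def npv(benefits, costs, rate):
    """NPV of yearly (benefit, cost) streams; year 0 is the first element."""
    return sum((b - c) / (1.0 + rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

benefits = [0, 50_000, 80_000, 80_000]        # $/yr once deployed
costs = [120_000, 10_000, 10_000, 10_000]     # upfront purchase + maintenance
print(f"NPV: ${npv(benefits, costs, rate=0.07):,.0f}")   # positive -> invest
```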
1988-09-01
[Garbled table-of-contents fragment; recoverable content: the report surveys software cost estimation models including SLIM, SoftCost-R, SPQR/20, PRICE-S, and System-3, with appendices covering SoftCost-R input values, SoftCost-R resource estimates, and SPQR.]
Integrated testing and verification system for research flight software
NASA Technical Reports Server (NTRS)
Taylor, R. N.
1979-01-01
The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced life-cycle costs as foremost goals.
Reusable experiment controllers, case studies
NASA Astrophysics Data System (ADS)
Buckley, Brian A.; Gaasbeck, Jim Van
1996-03-01
Congress has given NASA and the science community a reality check. Tight and ever-shrinking budgets are trimming the fat from many space science programs. No longer can a Principal Investigator (PI) afford to waste development dollars on re-inventing spacecraft controllers, experiment/payload controllers, ground control systems, or test sets. Inheritance of the Ground Support Equipment (GSE) from one program to another is not, by itself, a significant re-use of technology in developing a science mission in these times. Reduction of operational staff and highly autonomous experiments are needed to reduce the sustaining cost of a mission. Re-use of an infrastructure from one program to another is needed to truly attain the required cost and time savings. Interface and Control Systems, Inc. (ICS) has a long history of re-usable software; Navy, Air Force, and NASA programs have benefited from the re-use of a common control system from program to program. Several standardization efforts in the AIAA have adopted the Spacecraft Command Language (SCL) architecture as a point solution to satisfy requirements for re-use and autonomy. The Environmental Research Institute of Michigan (ERIM) has been a long-standing customer of ICS and is working on its 4th-generation system using SCL. Much of the hardware and software infrastructure has been re-used from mission to mission with little cost for re-hosting a new experiment. The same software infrastructure has been used successfully on Clementine, and an end-to-end system is being deployed for the Far Ultraviolet Spectroscopic Explorer (FUSE) for Johns Hopkins University. Case studies of the ERIM programs, Clementine, and FUSE are detailed in this paper.
Magnetic resonance angiography: current status and future directions
2011-01-01
With recent improvement in hardware and software techniques, magnetic resonance angiography (MRA) has undergone significant changes in technique and approach. The advent of 3.0 T magnets has allowed reduction in exogenous contrast dose without compromising overall image quality. The use of novel intravascular contrast agents substantially increases the image windows and decreases contrast dose. Additionally, the lower risk and cost in non-contrast enhanced (NCE) MRA has sparked renewed interest in these methods. This article discusses the current state of both contrast-enhanced (CE) and NCE-MRA. New CE-MRA methods take advantage of dose reduction at 3.0 T, novel contrast agents, and parallel imaging methods. The risks of gadolinium-based contrast media, and the NCE-MRA methods of time-of-flight, steady-state free precession, and phase contrast are discussed. PMID:21388544
1979-12-01
team programming in reducing software development costs relative to ad hoc approaches and improving software product quality relative to... are interpreted as demonstrating the advantages of disciplined team programming in reducing software development costs relative to ad hoc approaches... is due partially to the cost and impracticality of a valid experimental setup within a production environment. Thus the question remains, are
Software for Tracking Costs of Mars Projects
NASA Technical Reports Server (NTRS)
Wong, Alvin; Warfield, Keith
2003-01-01
The Mars Cost Tracking Model is a computer program that administers a system set up for tracking the costs of future NASA projects that pertain to Mars. Previously, no such tracking system existed, and documentation was written in a variety of formats and scattered in various places. It was difficult to justify costs or even track the history of costs of a spacecraft mission to Mars. The present software enables users to maintain all cost-model definitions, documentation, and justifications of cost estimates in one computer system that is accessible via the Internet. The software provides sign-off safeguards to ensure the reliability of information entered into the system. This system may eventually be used to track the costs of projects other than only those that pertain to Mars.
Advanced ground station architecture
NASA Technical Reports Server (NTRS)
Zillig, David; Benjamin, Ted
1994-01-01
This paper describes a new station architecture for NASA's Ground Network (GN). The architecture makes efficient use of emerging technologies to provide dramatic reductions in size, operational complexity, and operational and maintenance costs. The architecture, which is based on recent receiver work sponsored by the Office of Space Communications Advanced Systems Program, allows integration of both GN and Space Network (SN) modes of operation in the same electronics system. It is highly configurable through software and the use of charge-coupled device (CCD) technology to provide a wide range of operating modes. Moreover, it affords modularity of features which are optional depending on the application. The resulting system incorporates advanced RF, digital, and remote control technology capable of introducing significant operational, performance, and cost benefits to a variety of NASA communications and tracking applications.
The Hidden Cost of Buying a Computer.
ERIC Educational Resources Information Center
Johnson, Michael
1983-01-01
In order to process data in a computer, application software must be either developed or purchased. Costs for modifications of the software package and maintenance are often hidden. The decision to buy or develop software packages should be based upon factors of time and maintenance. (MLF)
Smart energy management system
NASA Astrophysics Data System (ADS)
Desai, Aniruddha; Singh, Jugdutt
2010-04-01
Peak and average energy usage in domestic and industrial environments is growing rapidly, and the absence of detailed energy consumption metrics is making systematic reduction of energy usage very difficult. The smart energy management system aims at providing a cost-effective solution for managing soaring energy consumption and its impact on greenhouse gas emissions and climate change. The solution is based on seamless integration of existing wired and wireless communication technologies, combined with smart context-aware software, offering a complete solution for automation of energy measurement and device control. The persuasive software presents users with easy-to-assimilate visual cues identifying problem areas and time periods, and encourages behavioural change to conserve energy. The system allows analysis of real-time and statistical consumption data, with the ability to drill down into detailed analysis of power consumption, CO2 emissions, and cost. The system generates intelligent projections and suggests potential methods (e.g. reducing standby, tuning heating/cooling temperature, etc.) of reducing energy consumption. The user interface is accessible using web-enabled devices such as PDAs and PCs, or via SMS, email, and instant messaging. A successful real-world trial of the system has demonstrated the potential to save 20 to 30% of energy consumption on average. Low deployment cost and the ability to easily manage consumption from various web-enabled devices give this system high penetration and impact capability, offering a sustainable solution to act on climate change today.
Developmental approach towards high resolution optical coherence tomography for glaucoma diagnostics
NASA Astrophysics Data System (ADS)
Kemper, Björn; Ketelhut, Steffi; Heiduschka, Peter; Thorn, Marie; Larsen, Michael; Schnekenburger, Jürgen
2018-02-01
Glaucoma is caused by a pathological rise in the intraocular pressure, which results in a progressive loss of vision through damage to retinal cells and the optic nerve head. Early detection of pressure-induced damage is thus essential to allow reduction of eye pressure and to prevent severe incapacity or blindness. Within the new European project GALAHAD (Glaucoma Advanced, Label-free High Resolution Automated OCT Diagnostics), we will develop a new low-cost, high-resolution OCT system for the early detection of glaucoma. The device is designed to improve diagnosis based on a new system of optical coherence tomography. Although OCT systems are at present available in ophthalmology centres, high-resolution devices are extremely expensive. The novelty of the new GALAHAD system is its super-wideband light source, which achieves high image resolution at a reasonable cost. Proof-of-concept experiments with cell and tissue glaucoma test standards and animal models are planned to test the new optical components and the performance of new algorithms for identifying glaucoma-associated cell and tissue structures. Intensive training of the software with various samples should result in increased sensitivity and specificity of the OCT software system.
Technology-driven dietary assessment: a software developer’s perspective
Buday, Richard; Tapia, Ramsey; Maze, Gary R.
2015-01-01
Dietary researchers need new software to improve nutrition data collection and analysis, but creating information technology is difficult. Software development projects may be unsuccessful due to inadequate understanding of needs, management problems, technology barriers or legal hurdles. Cost overruns and schedule delays are common. Barriers facing scientific researchers developing software include workflow, cost, schedule, and team issues. Different methods of software development and the role that intellectual property rights play are discussed. A dietary researcher must carefully consider multiple issues to maximize the likelihood of success when creating new software. PMID:22591224
IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.
ERIC Educational Resources Information Center
Sheehan, Mark C.; Williams, James G.
1987-01-01
Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)
Yuki, I; Kambayashi, Y; Ikemura, A; Abe, Y; Kan, I; Mohamed, A; Dahmani, C; Suzuki, T; Ishibashi, T; Takao, H; Urashima, M; Murayama, Y
2016-02-01
Combination of high-resolution C-arm CT and novel metal artifact reduction software may contribute to the assessment of aneurysms treated with stent-assisted coil embolization. This study aimed to evaluate the efficacy of a novel Metal Artifact Reduction prototype software combined with the currently available high spatial-resolution C-arm CT prototype implementation by using an experimental aneurysm model treated with stent-assisted coil embolization. Eight experimental aneurysms were created in 6 swine. Coil embolization of each aneurysm was performed by using a stent-assisted technique. High-resolution C-arm CT with intra-arterial contrast injection was performed immediately after the treatment. The obtained images were processed with Metal Artifact Reduction. Five neurointerventional specialists reviewed the image quality before and after Metal Artifact Reduction. Observational and quantitative analyses (via image analysis software) were performed. Every aneurysm was successfully created and treated with stent-assisted coil embolization. Before Metal Artifact Reduction, coil loops protruding through the stent lumen were not visualized due to the prominent metal artifacts produced by the coils. These became visible after Metal Artifact Reduction processing. Contrast filling in the residual aneurysm was also visualized after Metal Artifact Reduction in every aneurysm. Both the observational (P < .0001) and quantitative (P < .001) analyses showed significant reduction of the metal artifacts after application of the Metal Artifact Reduction prototype software. The combination of high-resolution C-arm CT and Metal Artifact Reduction enables differentiation of the coil mass, stent, and contrast material on the same image by significantly reducing the metal artifacts produced by the platinum coils. This novel image technique may improve the assessment of aneurysms treated with stent-assisted coil embolization. © 2016 by American Journal of Neuroradiology.
MYRaf: A new Approach with IRAF for Astronomical Photometric Reduction
NASA Astrophysics Data System (ADS)
Kilic, Y.; Shameoni Niaei, M.; Özeren, F. F.; Yesilyaprak, C.
2016-12-01
In this study, the design and development of the MYRaf software for astronomical photometric reduction are presented. MYRaf is an easy-to-use, reliable, and fast GUI tool for IRAF aperture photometry. MYRaf is an important step toward an automated software process for robotic telescopes; it uses IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor with the general-purpose, high-level programming language Python and the Qt framework.
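MYRaf itself drives IRAF through PyRAF; purely as an illustration of the aperture photometry step, here is a sketch using the independent photutils package on a synthetic image (photutils is not part of the MYRaf stack):

```python
import numpy as np
from photutils.aperture import CircularAperture, aperture_photometry

# Illustrative aperture photometry on a synthetic frame: a flat, noisy sky
# background with one bright fake star injected near the center.
rng = np.random.default_rng(0)
image = rng.normal(100.0, 5.0, size=(256, 256))   # synthetic sky background
image[120:126, 120:126] += 500.0                  # a fake star

positions = [(122.5, 122.5)]                      # (x, y) source position
aperture = CircularAperture(positions, r=5.0)     # 5-pixel circular aperture
table = aperture_photometry(image, aperture)
print(table["aperture_sum"][0])                   # summed counts in aperture
```

In a real pipeline the background would be estimated and subtracted before the sum is converted to an instrumental magnitude.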
Integrated computer-aided design using minicomputers
NASA Technical Reports Server (NTRS)
Storaasli, O. O.
1980-01-01
Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational database management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum of 4800 bits/sec transfer rate to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capabilities, CAD/CAM provides options to produce automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.
Code of Federal Regulations, 2010 CFR
2010-04-01
... increases. (b) At the owner's option, the cost of the computer software may include service contracts to... requirements. (c) The source of funds for the purchase of hardware or software, or contracting for services for... formatted data, including either the purchase and maintenance of computer hardware or software, or both, the...
A low-cost PC-based telemetry data-reduction system
NASA Astrophysics Data System (ADS)
Simms, D. A.; Butterfield, C. P.
1990-04-01
The Solar Energy Research Institute's (SERI) Wind Research Branch is using Pulse Code Modulation (PCM) telemetry data-acquisition systems to study horizontal-axis wind turbines. PCM telemetry systems are used in test installations that require accurate multiple-channel measurements taken from a variety of different locations. SERI has found them ideal for use in tests requiring concurrent acquisition of many channels, and has developed a PC-based data-reduction system to facilitate quick, in-the-field multiple-channel data analysis. Called the PC-PCM System, it consists of two basic components. First, AT-compatible hardware boards are used for decoding and combining PCM data streams. Up to four hardware boards can be installed in a single PC, which provides the capability to combine data from four PCM streams directly to PC disk or memory. Each stream can have up to 62 data channels. Second, a software package written for the DOS operating system was developed to simplify data-acquisition control and management. The software provides a quick, easy-to-use interface between the PC and the PCM data streams. Called the Quick-Look Data Management Program, it is a comprehensive menu-driven package used to organize, acquire, process, and display information from incoming PCM data streams. This paper describes both hardware and software aspects of the SERI PC-PCM system, concentrating on features that make it useful in a test environment for quickly examining and verifying incoming data. Also discussed are problems and techniques associated with PC-based telemetry data acquisition, processing, and real-time display.
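As an illustration of the decoding step such a board performs, here is a sketch with a hypothetical frame layout (a 16-bit 0xEB90 sync word followed by 62 big-endian 16-bit channels, matching the per-stream channel count above); the actual PC-PCM frame format is not documented here:

```python
import struct

# Hypothetical PCM frame layout for illustration only: a 16-bit sync word
# followed by 62 big-endian 16-bit data channels.
SYNC = 0xEB90
CHANNELS = 62
FRAME_BYTES = 2 + 2 * CHANNELS

def decode_frame(frame: bytes):
    """Return the list of 62 channel values, or None if sync fails."""
    (sync,) = struct.unpack_from(">H", frame, 0)
    if sync != SYNC:
        return None
    return list(struct.unpack_from(f">{CHANNELS}H", frame, 2))

# Example: build one synthetic frame and decode it.
frame = struct.pack(f">H{CHANNELS}H", SYNC, *range(CHANNELS))
print(decode_frame(frame)[:5])   # -> [0, 1, 2, 3, 4]
```

A real decoder would additionally hunt for the sync pattern in a continuous bit stream and tolerate occasional corrupted frames.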
Clinical impact and value of workstation single sign-on.
Gellert, George A; Crouch, John F; Gibson, Lynn A; Conklin, George S; Webster, S Luke; Gillean, John A
2017-05-01
CHRISTUS Health began implementation of computer workstation single sign-on (SSO) in 2015. SSO technology utilizes a badge reader placed at each workstation where clinicians swipe or "tap" their identification badges. The goal was to assess the impact of SSO implementation in reducing clinician time logging in to various clinical software programs, and the financial savings from migrating to a thin client that enabled replacement of traditional hard drive computer workstations. Following implementation of SSO, a total of 65,202 logins were sampled systematically during a 7-day period among 2,256 active clinical end users in 6 facilities and compared to pre-implementation login times. Dollar values were assigned to the time saved by 3 groups of clinical end users: physicians, nurses, and ancillary service providers. The reduction of total clinician login time over the 7-day period showed a net gain of 168.3 hours per week of clinician time, or 28.1 hours (2.3 shifts) per facility per week. Annualized, 1,461.2 hours of mixed physician and nursing time is liberated per facility per annum (121.8 twelve-hour shifts per year). The annual dollar savings from this reduction in login time is $92,146 per hospital per annum, and $1,658,745 per annum in the first-phase implementation of 18 hospitals. Computer hardware savings due to desktop virtualization increase annual savings to $2,333,745. Qualitative contributions to clinician satisfaction, reduction in staff turnover, facilitation of adoption of EHR applications, and other benefits of SSO are discussed. SSO had a positive impact on clinician efficiency and productivity in the 6 hospitals evaluated, and is an effective and cost-effective method to liberate clinician time from repetitive and time-consuming logins to clinical software applications. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
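A back-of-envelope reconstruction of the annualization is shown below; the blended hourly rate is a hypothetical figure chosen to land near the reported per-facility savings, not a value from the study:

```python
# Reconstruction of the reported annualization: 28.1 hours of clinician time
# saved per facility per week. The blended $/h rate is hypothetical.

hours_per_week = 28.1
weeks = 52
hours_per_year = hours_per_week * weeks       # 1,461.2 h (matches the paper)
blended_rate = 63.0                           # $/h, illustrative assumption
annual_savings = hours_per_year * blended_rate
print(f"{hours_per_year:.1f} h/yr -> ${annual_savings:,.0f}/yr per facility")
```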
Software Products for Temperature Data Reduction of Platinum Resistance Thermometers (PRT)
NASA Technical Reports Server (NTRS)
Sherrod, Jerry K.
1998-01-01
The main objective of this project is to create user-friendly personal computer (PC) software for the reduction and analysis of platinum resistance thermometer (PRT) data. Software products were designed and created to help users of PRT data apply the Callendar-Van Dusen method. Sample runs are illustrated in this report.
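For reference, a minimal sketch of the Callendar-Van Dusen conversion such software performs, using the standard IEC 60751 coefficients for industrial Pt100 sensors; the project's actual coefficients and calibration may differ:

```python
import math

# Callendar-Van Dusen equation with standard IEC 60751 Pt100 coefficients;
# a particular PRT's calibrated coefficients may differ from these.
R0 = 100.0            # ohms at 0 degC (Pt100)
A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12        # the C term applies only below 0 degC

def pt_resistance(t_c):
    """Resistance of a Pt100 at temperature t_c (degC)."""
    if t_c >= 0.0:
        return R0 * (1.0 + A * t_c + B * t_c**2)
    return R0 * (1.0 + A * t_c + B * t_c**2 + C * (t_c - 100.0) * t_c**3)

def pt_temperature(r_ohm):
    """Invert the t >= 0 branch via the quadratic formula."""
    disc = A * A - 4.0 * B * (1.0 - r_ohm / R0)
    return (-A + math.sqrt(disc)) / (2.0 * B)

print(pt_resistance(100.0))        # ~138.5 ohm
print(pt_temperature(138.5055))    # ~100.0 degC
```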
Cost considerations in automating the library.
Bolef, D
1987-01-01
The purchase price of a computer and its software is but a part of the cost of any automated system. There are many additional costs, including one-time costs of terminals, printers, multiplexors, microcomputers, consultants, workstations and retrospective conversion, and ongoing costs of maintenance and maintenance contracts for the equipment and software, telecommunications, and supplies. This paper examines those costs in an effort to produce a more realistic picture of an automated system. PMID:3594021
Examining the Effect of the Die Angle on Tool Load and Wear in the Extrusion Process
NASA Astrophysics Data System (ADS)
Nowotyńska, Irena; Kut, Stanisław
2014-04-01
Tool durability is a crucial factor in any manufacturing process, including extrusion. Striving for higher product quality should be accompanied by long tool life and reduced production costs. This article presents comparative research on the load and wear of dies with various working cone angles during concurrent extrusion. Numerical calculations of the tool load during extrusion were performed with the MSC MARC software using the finite element method (FEM). The Archard model, implemented in the software via FEM, was used to determine and compare die wear. Tool deformations and stress distributions were determined from the analyses, and the die wear depth at various working cone angles was computed. A properly shaped die not only affects the extruded material properties but also controls loads, elastic deformation, and tool life.
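The Archard law itself is compact; a minimal sketch follows, with illustrative inputs rather than the die and material parameters of the article:

```python
# Archard wear law: wear volume V = K * F * s / H, where K is a
# dimensionless wear coefficient, F the normal load, s the sliding
# distance, and H the hardness of the softer surface. Inputs are
# illustrative, not the article's die/material parameters.

def archard_wear_volume(K, load_N, sliding_m, hardness_Pa):
    """Wear volume in m^3 predicted by the Archard model."""
    return K * load_N * sliding_m / hardness_Pa

V = archard_wear_volume(K=1e-4, load_N=5e4, sliding_m=2.0, hardness_Pa=6e9)
print(f"Wear volume: {V:.3e} m^3")   # -> 1.667e-09 m^3
```

In an FEM implementation the same relation is applied per contact element, with local pressure and incremental sliding distance replacing the global load and distance.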
A cross-validation package driving Netica with python
Fienen, Michael N.; Plant, Nathaniel G.
2014-01-01
Bayesian networks (BNs) are powerful tools for probabilistically simulating natural systems and emulating process models. Cross-validation is a technique to avoid the overfitting that results from overly complex BNs; overfitting reduces predictive skill. Cross-validation for BNs is known but rarely implemented, due partly to a lack of software tools designed to work with available BN packages. CVNetica is open-source, written in Python, and extends the Netica software package to perform cross-validation and to read, rebuild, and learn BNs from data. Insights gained from cross-validation, and implications for prediction versus description, are illustrated with a data-driven oceanographic application and a model-emulation application. These examples show that overfitting occurs when BNs become more complex than the supporting data allow, and that overfitting incurs computational costs as well as a reduction in prediction skill. CVNetica evaluates overfitting using several complexity metrics (we used level of discretization) and its impact on performance metrics (we used skill).
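The cross-validation bookkeeping CVNetica automates looks roughly like the following sketch; this is generic k-fold splitting, not the CVNetica or Netica API:

```python
import numpy as np

# Generic k-fold cross-validation split: each record appears in exactly one
# test fold, and the complementary records form the training set.

def kfold_indices(n_cases, k, seed=0):
    """Yield (train_idx, test_idx) pairs over n_cases records."""
    idx = np.random.default_rng(seed).permutation(n_cases)
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        yield train, fold

for train, test in kfold_indices(n_cases=100, k=5):
    # Train a BN on `train`, score predictive skill on `test`, then compare
    # skill across levels of node discretization to detect overfitting.
    print(len(train), len(test))   # -> 80 20 for each fold
```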
Laboratory cost control and financial management software.
Mayer, M
1998-02-09
Economic constraints within the health care system advocate the introduction of tighter cost control in clinical laboratories. Detailed cost information forms the basis for cost control and financial management. Based on this cost information, proper decisions regarding priorities, procedure choices, personnel policies, and investments can be made. This presentation outlines some principles of cost analysis, describes common limitations of cost analysis, and exemplifies the use of software to achieve optimized cost control. One commercially available cost analysis package, LabCost, is described in some detail. In addition to providing cost information, LabCost also serves as a general management tool for resource handling, accounting, inventory management, and billing. The application of LabCost in the selection of a new high-throughput analyzer for a large clinical chemistry service is taken as an example of decisions that can be assisted by cost evaluation. It is concluded that laboratory management that wisely utilizes cost analysis to support the decision-making process will have a clear advantage over laboratories that fail to employ cost considerations to guide their actions.
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
1977-05-01
(C3I) programs; (4) simulator/trainer programs; and (5) automatic test equipment software. Each of these five types of software represents a problem... coded in the same source language, say JOVIAL, then source-language statements would be a better measure, since that would automatically compensate... whether done at no (visible) cost or by renegotiation of the contract. Fig. 2.3 illustrates these with solid lines. It is conjectured that the change
Secure it now or secure it later: the benefits of addressing cyber-security from the outset
NASA Astrophysics Data System (ADS)
Olama, Mohammed M.; Nutaro, James
2013-05-01
The majority of funding for research and development (R&D) in cyber-security is focused on the end of the software lifecycle, where systems have been deployed or are nearing deployment. Recruiting of cyber-security personnel is similarly focused on end-of-life expertise. By emphasizing cyber-security at these late stages, security problems are found and corrected when it is most expensive to do so, thus increasing the cost of owning and operating complex software systems. Worse, expenditures on expensive security measures often mean less money for innovative developments. These unwanted increases in cost and potential slowing of innovation are unavoidable consequences of an approach to security that finds and remediates faults after software has been implemented. We argue that software security can be improved and the total cost of a software system substantially reduced by an appropriate allocation of resources to the early stages of a software project. With a similar allocation of R&D funds to the early stages of the software lifecycle, we propose that the costs of cyber-security can be better controlled and, consequently, the positive effects of this R&D on industry will be much more pronounced.
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.
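Since the task builds on the COCOMO variable set, a minimal sketch of the basic COCOMO effort equation is shown below, using the published COCOMO-81 coefficients rather than any NASA-calibrated values:

```python
# Basic COCOMO-81 effort equation: effort (person-months) = a * KLOC**b.
# Coefficients are the published basic-mode values, not NASA's calibration.

MODES = {"organic": (2.4, 1.05),
         "semidetached": (3.0, 1.12),
         "embedded": (3.6, 1.20)}

def cocomo_effort(kloc, mode="embedded"):
    """Estimated effort in person-months for a project of `kloc` KSLOC."""
    a, b = MODES[mode]
    return a * kloc ** b

print(f"{cocomo_effort(50):.0f} person-months")   # 50 KSLOC embedded -> ~394
```

Intermediate and later COCOMO variants multiply this nominal effort by a product of cost-driver effort multipliers, which is where the "minimum number of cost drivers" question above comes in.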
Differential maneuvering simulator data reduction and analysis software
NASA Technical Reports Server (NTRS)
Beasley, G. P.; Sigman, R. S.
1972-01-01
A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.
Chalazonitis, A N; Koumarianos, D; Tzovara, J; Chronopoulos, P
2003-06-01
Over the past decade, the technology that permits images to be digitized and the reduction in the cost of digital equipment have allowed quick digital transfer of any conventional radiological film. Images can then be transferred to a personal computer, and several software programs are available to manipulate their digital appearance. In this article, the fundamentals of digital imaging are discussed, as well as the wide variety of optional adjustments that the Adobe Photoshop 6.0 (Adobe Systems, San Jose, CA) program offers to present radiological images with satisfactory digital image quality.
NASA Technical Reports Server (NTRS)
2000-01-01
Genoa is a software product that predicts progressive aging and failure in a variety of materials. It is the result of an SBIR contract between the Glenn Research Center and Alpha Star Corporation. Genoa allows designers to determine whether the materials they plan to apply to a structure are up to the task or whether alternate materials should be considered. Genoa's two featured applications are progressive failure simulation and test verification. It allows for a reduction in inspection frequency, rapid design solutions, and manufacturing with low-cost materials. It will benefit the aerospace, airline, and automotive industries, with future applications for other uses.
NASA Technical Reports Server (NTRS)
1996-01-01
Teledyne Brown developed a computer-based interactive multimedia training system for use with the Crystal Growth Furnace in the U.S. Microgravity Laboratory-2 mission on the Space Shuttle. Teledyne Brown commercialized the system and customized it for PPG Industries Aircraft Products. The system challenges learners with role-playing scenarios and software-driven simulations, engaging all the senses with text, video, animation, voice, sounds, and music. The transfer of this technology to commercial industrial process training has resulted in significant improvements in effectiveness, standardization, and quality control, as well as cost reductions over the usual classroom and on-the-job training approaches.
Design Activity in the Software Cost Reduction Project.
1986-08-18
[Figure residue from the source report (NRL Report 8974); recoverable content: abbreviation key (PM = Physical Model, SG = System Generation, SS = Shared Services, SU = System Utilities) and the captions "Fig. 7 - Shared services activities" and "Fig. 13 - Shared services design activities," charting activity by month from Jan 1978 to Jan 1985.]
General aviation design synthesis utilizing interactive computer graphics
NASA Technical Reports Server (NTRS)
Galloway, T. L.; Smith, M. R.
1976-01-01
Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.
Designing an optimal software intensive system acquisition: A game theoretic approach
NASA Astrophysics Data System (ADS)
Buettner, Douglas John
The development of schedule-constrained software-intensive space systems is challenging. Case study data from national security space programs developed at the U.S. Air Force Space and Missile Systems Center (USAF SMC) provide evidence of a strong desire by contractors to skip or severely reduce software design and early defect detection methods in these schedule-constrained environments. The research findings suggest recommendations to address these issues at numerous levels; however, the observations lead us to investigate modeling and theoretical methods to understand what motivated this behavior in the first place. As a result, Madachy's inspection-based system dynamics model is modified to include unit testing and an integration test feedback loop. This Modified Madachy Model (MMM) is used as a tool to investigate the consequences of this behavior on the observed defect dynamics for two remarkably different case study software projects. Latin Hypercube sampling of the MMM with sample distributions for quality-, schedule-, and cost-driven strategies demonstrates that the higher-cost, higher-effort quality-driven strategies provide consistently better schedule performance than the schedule-driven up-front effort-reduction strategies. Game-theoretic reasoning for schedule-driven engineers cutting corners on inspections and unit testing is based on the case study evidence and Austin's agency model of the observed phenomena. Game theory concepts are then used to argue that the source of the problem, and hence the solution to developers cutting corners on quality in schedule-driven system acquisitions, ultimately lies with the government. The game theory arguments also suggest that a multi-player dynamic Nash bargaining game provides a solution to the observed "lack of quality" game between the government (the acquirer) and large-corporation software developers. A note argues that this multi-player dynamic Nash bargaining game also addresses Freeman Dyson's problem of finding a way to label systems as good or bad.
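The Latin Hypercube step described above can be sketched as follows; the three parameter ranges are hypothetical stand-ins for the quality-, schedule-, and cost-driven strategy inputs to the MMM:

```python
from scipy.stats import qmc

# Latin Hypercube sampling of strategy parameters for repeated simulation
# runs. The parameter names and ranges are hypothetical stand-ins, not the
# actual MMM inputs.

sampler = qmc.LatinHypercube(d=3, seed=42)
unit = sampler.random(n=200)                 # 200 points in the unit cube
# columns: inspection effort frac, unit-test effort frac, schedule pressure
lo, hi = [0.0, 0.0, 0.8], [0.25, 0.30, 1.2]
samples = qmc.scale(unit, lo, hi)
print(samples[:3])    # feed each row to one simulation run of the MMM
```

Compared with plain Monte Carlo, the Latin Hypercube design stratifies each parameter axis, so the strategy space is covered evenly with far fewer simulation runs.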
Software-Enabled Project Management Techniques and Their Relationship to the Triple Constraints
ERIC Educational Resources Information Center
Elleh, Festus U.
2013-01-01
This study investigated the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). There is a dearth of academic literature focused on this relationship. Based on the gap…
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
48 CFR 234.7101 - Solicitation provision and contract clause.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Software Data Reporting 234.7101 Solicitation provision and contract clause. (a)(1) Use the provision at 252.234-7003, Notice of Cost and Software Data Reporting System, in all solicitations that include the clause at 252.234-7004, Cost and Software Data Reporting. (2) Use the provision with its Alternate I when...
An Exploratory Study of Software Cost Estimating at the Electronic Systems Division.
1976-07-01
action to improve the software cost estimating process. While this research was limited to the ESD environment, the same types of problems may exist... Methods in Social Science. New York: Random House, 1969. 57. Smith, Ronald L. Structured Programming Series (Vol. XI) - Estimating Software Project
ERIC Educational Resources Information Center
Lowe, John B.
1988-01-01
If the CD-ROM revolution is likened to gambling, players are information providers and consumers; the stakes are development, production, distribution, hardware, and software costs; and betting is represented by the costs of updating disks and hardware and software maintenance, and by pricing. Strategy should take into account cost savings,…
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan
1993-01-01
Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.
Automating Risk Analysis of Software Design Models
Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
Automating risk analysis of software design models.
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
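The identification-tree idea can be made concrete with a small sketch. The structure below is an assumed, minimal rendering of a tree whose nodes test properties of a data-flow-diagram element and emit candidate threats; the element fields and threat names are invented for illustration and are not AutSEC's actual schema.

    from dataclasses import dataclass, field

    @dataclass
    class IdentificationNode:
        """One node of a hypothetical identification tree: a predicate on a
        DFD element plus the threats implied when the predicate holds."""
        predicate: object                     # callable: element -> bool
        threats: list = field(default_factory=list)
        children: list = field(default_factory=list)

        def identify(self, element):
            """Collect threats from this node and its matching children."""
            if not self.predicate(element):
                return []
            found = list(self.threats)
            for child in self.children:
                found.extend(child.identify(element))
            return found

    # Toy DFD element: an unencrypted data flow crossing a trust boundary.
    flow = {"kind": "data_flow", "crosses_boundary": True, "encrypted": False}
    tree = IdentificationNode(
        predicate=lambda e: e["kind"] == "data_flow",
        children=[IdentificationNode(
            predicate=lambda e: e["crosses_boundary"] and not e["encrypted"],
            threats=["eavesdropping", "tampering"])])
    print(tree.identify(flow))  # ['eavesdropping', 'tampering']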
Analyzing the costs to deliver medication therapy management services.
Rupp, Michael T
2011-01-01
To provide pharmacy managers and consultant pharmacists with a step-by-step approach for analyzing the costs of delivering medication therapy management (MTM) services, and to describe the use of a free online software application for determining the costs of delivering MTM. The process described is applicable to community pharmacies and to consultant pharmacists who provide MTM services from nonpharmacy settings. The PharmAccount Service Cost Calculator is an Internet-based software application that uses a guided online interview to collect the information needed to conduct a comprehensive cost analysis of any specialized pharmacy service. In addition to direct variable and fixed costs, the software automatically allocates indirect and overhead costs to the service and generates an itemized report that details the components of service delivery costs. The service cost calculator is sufficiently flexible to support the analysis of virtually any specialized pharmacy service, irrespective of whether the service is delivered from a physical pharmacy. The software application allows users to perform sensitivity analysis to quickly determine the potential impact that alternate scenarios would have on service delivery cost. It is therefore particularly well suited to assist in the design and planning of a new pharmacy service. Good management requires that the cost implications of service delivery decisions be known and considered. Analyzing the cost of an MTM service is an important step in developing a sustainable business model.
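As a rough sketch of the kind of roll-up such a calculator performs, the snippet below combines direct variable cost, direct fixed cost, and a share of an overhead pool into a per-encounter service cost, then varies the encounter volume as a simple sensitivity analysis. All figures and the allocation basis are illustrative, not the PharmAccount calculator's actual method.

    def service_cost(direct_variable, direct_fixed, overhead_pool,
                     service_share, volume):
        """Per-encounter cost: direct variable cost per encounter, plus
        fixed and allocated overhead spread over the yearly volume."""
        allocated_overhead = overhead_pool * service_share
        return direct_variable + (direct_fixed + allocated_overhead) / volume

    # Simple sensitivity analysis: how does cost respond to volume?
    for volume in (400, 800, 1600):
        cost = service_cost(direct_variable=18.50, direct_fixed=6000.0,
                            overhead_pool=120000.0, service_share=0.05,
                            volume=volume)
        print(volume, round(cost, 2))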
NASA Astrophysics Data System (ADS)
Demuzere, Matthias; Kassomenos, P.; Philipp, A.
2011-08-01
In the framework of the COST733 Action "Harmonisation and Applications of Weather Types Classifications for European Regions", a new circulation type classification software (hereafter referred to as the cost733class software) has been developed. The cost733class software contains a variety of (European) classification methods and is flexible with respect to the domain of interest, input variables, time step, number of circulation types, sequencing and (weighted) target variables. This work introduces the capabilities of the cost733class software, in which the resulting circulation types (CTs) from various circulation type classifications (CTCs) are applied to observed summer surface ozone concentrations in Central Europe. Firstly, the main characteristics of the CTCs in terms of circulation pattern frequencies are addressed using the baseline COST733 catalogue (cat 2.0), at present the latest product of the new cost733class software. In a second step, the probabilistic Brier skill score is used to quantify the explanatory power of all classifications in terms of the maximum 8-hourly mean ozone concentrations exceeding the 120-μg/m3 threshold; this was based on ozone concentrations from 130 Central European measurement stations. Evaluation results averaged over all stations indicate generally higher performance for CTCs with a higher number of types. Within the subset of methodologies with a similar number of types, the results suggest that CTCs based on optimisation algorithms perform slightly better than those based on other algorithms (predefined thresholds, principal component analysis and leader algorithms). The results are further elaborated by exploring additional capabilities of the cost733class software. Sensitivity experiments are performed using different domain sizes, input variables, seasonally based classifications and multiple-day sequencing. As an illustration, CTCs which are also conditioned on temperature with various weights are derived and tested similarly. All results are given a physical interpretation by adopting the environment-to-circulation approach, providing more detailed information on the specific synoptic conditions prevailing on days with high surface ozone concentrations. This research does not intend to put forward a favourite classification methodology or to construct a statistical ozone forecasting tool, but should be seen as an introduction to the possibilities of the cost733class software. In this respect, the results presented here can provide basic user support for the cost733class software and for the development of a more user- or application-specific CTC approach.
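The evaluation statistic can be sketched directly. The snippet below computes a Brier skill score for exceedance forecasts against a climatological reference, using each day's circulation-type exceedance frequency as the forecast probability; the toy data are invented, and this particular skill-score formulation is a common one rather than necessarily the study's exact setup.

    import numpy as np

    def brier_skill_score(p_forecast, observed):
        """Brier skill score of probabilistic exceedance forecasts against
        a climatological reference (the mean exceedance frequency)."""
        p_forecast = np.asarray(p_forecast, float)
        observed = np.asarray(observed, float)     # 1 if threshold exceeded
        bs = np.mean((p_forecast - observed) ** 2)
        climatology = observed.mean()
        bs_ref = np.mean((climatology - observed) ** 2)
        return 1.0 - bs / bs_ref

    # Forecast for each day: the historical exceedance frequency of that
    # day's circulation type.
    types = np.array([0, 1, 1, 2, 0, 2, 1, 0])     # CT assigned to each day
    exceed = np.array([0, 1, 0, 1, 0, 1, 1, 0])    # ozone > 120 ug/m3?
    freq = {t: exceed[types == t].mean() for t in set(types.tolist())}
    p = np.array([freq[t] for t in types.tolist()])
    print(brier_skill_score(p, exceed))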
Optimizing Performance Parameters of Chemically-Derived Graphene/p-Si Heterojunction Solar Cell.
Batra, Kamal; Nayak, Sasmita; Behura, Sanjay K; Jani, Omkar
2015-07-01
Chemically-derived graphene has been synthesized by a modified Hummers method and reduced using sodium borohydride. To explore the potential for photovoltaic applications, graphene/p-silicon (Si) heterojunction devices were fabricated using a simple and cost-effective technique called spin coating. SEM analysis shows the formation of graphene oxide (GO) flakes which become smooth after reduction. The absence of oxygen-containing functional groups, as observed in FT-IR spectra, reveals the reduction of GO, i.e., reduced graphene oxide (rGO). This was further confirmed by Raman analysis, which shows a slight reduction in G-band intensity with respect to the D-band. Hall effect measurement confirmed the n-type nature of rGO. Therefore, an effort has been made to simulate the rGO/p-Si heterojunction device using one-dimensional solar cell capacitance software, considering the experimentally derived parameters. A detailed analysis of the effects of Si thickness, graphene thickness and temperature on the performance of the device is presented.
Software Cost Estimation Using a Decision Graph Process: A Knowledge Engineering Approach
NASA Technical Reports Server (NTRS)
Stukes, Sherry; Spagnuolo, John, Jr.
2011-01-01
This paper is not a description per se of the efforts of two software cost analysts. Rather, it is an outline of the methodology used for FSW cost analysis, presented in a form that can serve as a foundation from which others may gain insight into how to perform FSW cost analyses for their own problems.
1988-12-01
The software development scene is often characterized by schedule and cost estimates that are grossly inaccurate... c. SPQR Model (Jones): T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model. The basic approach is similar to that of Boehm's... The time T (in seconds) is simply derived from the effort E by dividing by the Stroud number S: T = E/S. d. COPMO (Thebaut)...
A Decision Support System for Planning, Control and Auditing of DoD Software Cost Estimation.
1986-03-01
is frequently used in U.S. Air Force software cost estimates. Barry Boehm's Constructive Cost Model (COCOMO) was recently selected for use... are considered basic to the proper development of software. Pressman [Ref. 11] addresses these basic elements in a manner which attempts to integrate... H., Jr., and Carlson, Eric D., Building Effective Decision Support Systems, Prentice-Hall, Englewood Cliffs, NJ, 1982. 11. Pressman, Roger S., Software Engineering: A Practitioner's Approach...
Integrated Design and Implementation of Embedded Control Systems with Scilab
Ma, Longhua; Xia, Feng; Peng, Zhe
2008-01-01
Embedded systems are playing an increasingly important role in control engineering. Despite their popularity, embedded systems are generally subject to resource constraints and it is therefore difficult to build complex control systems on embedded platforms. Traditionally, the design and implementation of control systems are often separated, which causes the development of embedded control systems to be highly time-consuming and costly. To address these problems, this paper presents a low-cost, reusable, reconfigurable platform that enables integrated design and implementation of embedded control systems. To minimize the cost, free and open source software packages such as Linux and Scilab are used. Scilab is ported to the embedded ARM-Linux system. The drivers for interfacing Scilab with several communication protocols including serial, Ethernet, and Modbus are developed. Experiments are conducted to test the developed embedded platform. The use of Scilab enables implementation of complex control algorithms on embedded platforms. With the developed platform, it is possible to perform all phases of the development cycle of embedded control systems in a unified environment, thus facilitating the reduction of development time and cost. PMID:27873827
Integrated Design and Implementation of Embedded Control Systems with Scilab.
Ma, Longhua; Xia, Feng; Peng, Zhe
2008-09-05
Embedded systems are playing an increasingly important role in control engineering. Despite their popularity, embedded systems are generally subject to resource constraints and it is therefore difficult to build complex control systems on embedded platforms. Traditionally, the design and implementation of control systems are often separated, which causes the development of embedded control systems to be highly time-consuming and costly. To address these problems, this paper presents a low-cost, reusable, reconfigurable platform that enables integrated design and implementation of embedded control systems. To minimize the cost, free and open source software packages such as Linux and Scilab are used. Scilab is ported to the embedded ARM-Linux system. The drivers for interfacing Scilab with several communication protocols including serial, Ethernet, and Modbus are developed. Experiments are conducted to test the developed embedded platform. The use of Scilab enables implementation of complex control algorithms on embedded platforms. With the developed platform, it is possible to perform all phases of the development cycle of embedded control systems in a unified environment, thus facilitating the reduction of development time and cost.
Software cost/resource modeling: Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. J.
1980-01-01
A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
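A generic sketch of such a prompted, multiplier-based parametric model is given below. The nominal effort curve, question names, and multiplier values are invented for illustration; they are not the actual JPL/DSN model parameters.

    import math

    def estimate_effort(base_kloc, responses, weights):
        """Questionnaire-driven parametric estimate (a sketch, not the
        actual JPL model): a nominal effort, scaled by one multiplier per
        prompted response."""
        nominal = 2.8 * base_kloc ** 1.1        # assumed nominal curve
        multiplier = math.prod(weights[q][a] for q, a in responses.items())
        return nominal * multiplier

    weights = {  # illustrative question/answer multipliers
        "team_experience": {"low": 1.3, "nominal": 1.0, "high": 0.8},
        "requirements_volatility": {"low": 0.9, "nominal": 1.0, "high": 1.25},
    }
    responses = {"team_experience": "high",
                 "requirements_volatility": "nominal"}
    print(round(estimate_effort(32.0, responses, weights), 1),
          "person-months")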
The Future of Data Reduction at UKIRT
NASA Astrophysics Data System (ADS)
Economou, F.; Bridger, A.; Wright, G. S.; Rees, N. P.; Jenness, T.
The Observatory Reduction and Acquisition Control (ORAC) project is a comprehensive re-implementation of all existing instrument user interfaces and data handling software involved at the United Kingdom Infrared Telescope (UKIRT). This paper addresses the design of the data reduction part of the system. Our main aim is to provide data reduction facilities for the new generation of UKIRT instruments of a similar standard to our current software packages, which have enjoyed success because of their science-driven approach. Additionally we wish to use modern software techniques in order to produce a system that is portable, flexible and extensible so as to have modest maintenance requirements, both in the medium and the longer term.
Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.
Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil
2015-07-17
In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.
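The headline figures can be checked with simple arithmetic. In the sketch below the total run cost is an assumption chosen to reproduce the reported per-plasmid price; only the $3-per-plasmid and 20-fold figures come from the abstract.

    # Back-of-envelope check of the reported figures.
    run_cost = 12000.0          # assumed total MiSeq run cost, USD
    plasmids_per_run = 4000
    ngs_cost_each = run_cost / plasmids_per_run
    print(ngs_cost_each)        # 3.0 USD/plasmid at 15x coverage
    sanger_cost_each = 20 * ngs_cost_each
    print(sanger_cost_each)     # ~60 USD/plasmid at 2x coverage (20-fold)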
A Programmable Plug & Play Sensor Interface for WSN Applications
Vera, Sergio D.; Bayo, Alberto; Medrano, Nicolás; Calvo, Belén; Celma, Santiago
2011-01-01
Cost reduction in wireless sensor networks (WSN) becomes a priority when extending their application to fields where a great number of sensors is needed, such as habitat monitoring, precision agriculture or diffuse greenhouse emission measurement. In these cases, the use of smart sensors is expensive, consequently requiring the use of low-cost sensors. The solution to convert such generic low-cost sensors into intelligent ones leads to the implementation of a versatile system with enhanced processing and storage capabilities to attain a plug and play electronic interface able to adapt to all the sensors used. This paper focuses on this issue and presents a low-voltage plug & play reprogrammable interface capable of adapting to different sensor types and achieving an optimum reading performance for every sensor. The proposed interface, which includes both electronic and software elements so that it can be easily integrated in WSN nodes, is described and experimental test results to validate its performance are given. PMID:22164118
Expanding Software Productivity and Power while Reducing Costs.
ERIC Educational Resources Information Center
Winer, Ellen N.
1988-01-01
Microcomputer efficiency and software economy can be achieved through file transfer and data sharing. Costs can be reduced by purchasing computer systems that allow for expansion and portability of data. (MLF)
PDT - PARTICLE DISPLACEMENT TRACKING SOFTWARE
NASA Technical Reports Server (NTRS)
Wernet, M. P.
1994-01-01
Particle Imaging Velocimetry (PIV) is a quantitative velocity measurement technique for measuring instantaneous planar cross sections of a flow field. The technique offers very high precision (1%) directionally resolved velocity vector estimates, but its use has been limited by high equipment costs and complexity of operation. Particle Displacement Tracking (PDT) is an all-electronic PIV data acquisition and reduction procedure which is simple, fast, and easily implemented. The procedure uses a low power, continuous wave laser and a Charged Coupled Device (CCD) camera to electronically record the particle images. A frame grabber board in a PC is used for data acquisition and reduction processing. PDT eliminates the need for photographic processing, system costs are moderately low, and reduced data are available within seconds of acquisition. The technique results in velocity estimate accuracies on the order of 5%. The software is fully menu-driven from the acquisition to the reduction and analysis of the data. Options are available to acquire a single image or 5- or 25-field series of images separated in time by multiples of 1/60 second. The user may process each image, specifying its boundaries to remove unwanted glare from the periphery and adjusting its background level to clearly resolve the particle images. Data reduction routines determine the particle image centroids and create time history files. PDT then identifies the velocity vectors which describe the particle movement in the flow field. Graphical data analysis routines are included which allow the user to graph the time history files and display the velocity vector maps, interpolated velocity vector grids, iso-velocity vector contours, and flow streamlines. The PDT data processing software is written in FORTRAN 77 and the data acquisition routine is written in C-Language for 80386-based IBM PC compatibles running MS-DOS v3.0 or higher. Machine requirements include 4 MB RAM (3 MB Extended), a single or multiple frequency RGB monitor (EGA or better), a math co-processor, and a pointing device. The printers supported by the graphical analysis routines are the HP Laserjet+, Series II, and Series III with at least 1.5 MB memory. The data acquisition routines require the EPIX 4-MEG video board and optional 12.5MHz oscillator, and associated EPIX software. Data can be acquired from any CCD or RS-170 compatible video camera with pixel resolution of 600hX400v or better. PDT is distributed on one 5.25 inch 360K MS-DOS format diskette. Due to the use of required proprietary software, executable code is not provided on the distribution media. Compiling the source code requires the Microsoft C v5.1 compiler, Microsoft QuickC v2.0, the Microsoft Mouse Library, EPIX Image Processing Libraries, the Microway NDP-Fortran-386 v2.1 compiler, and the Media Cybernetics HALO Professional Graphics Kernal System. Due to the complexities of the machine requirements, COSMIC strongly recommends the purchase and review of the documentation prior to the purchase of the program. The source code, and sample input and output files are provided in PKZIP format; the PKUNZIP utility is included. PDT was developed in 1990. All trade names used are the property of their respective corporate owners.
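The core reduction steps, centroid finding and frame-to-frame matching, can be sketched compactly. The snippet below uses intensity-weighted centroids and nearest-neighbor matching, a generic scheme in the spirit of PDT rather than a transcription of its FORTRAN routines; the threshold, frame spacing, and pixel size are illustrative.

    import numpy as np
    from scipy import ndimage

    def particle_centroids(image, threshold):
        """Intensity-weighted centroids of bright blobs above a
        background threshold."""
        mask = image > threshold
        labels, n = ndimage.label(mask)
        return np.array(ndimage.center_of_mass(image, labels,
                                               range(1, n + 1)))

    def velocities(c0, c1, dt, pixel_size):
        """Match each particle in frame 0 to its nearest neighbor in
        frame 1 and convert the displacement to velocity."""
        vel = []
        for p in c0:
            q = c1[np.argmin(np.linalg.norm(c1 - p, axis=1))]
            vel.append((q - p) * pixel_size / dt)
        return np.array(vel)

    # e.g., frames 1/60 s apart, 10 micrometres per pixel:
    # v = velocities(particle_centroids(f0, 40), particle_centroids(f1, 40),
    #                dt=1/60, pixel_size=10e-6)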
NASA Astrophysics Data System (ADS)
Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika
2011-08-01
Verification and validation is an important part of software development and accounts for a significant share of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing, and reducing costs so that the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.
Lucid Provides Energy Reduction Software to DC Government
Earlier this year, Lucid, a 2014 U.S. EPA SBIR award recipient, announced that the D.C. government was using Lucid's BuildingOS® software to reach an energy reduction target of 20 percent in 20 months.
CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 6, June 2007
2007-06-01
California. He has co-authored the book Software Cost Estimation With COCOMO II with Barry Boehm and others. Clark helped define the COCOMO II model...Software Engineering at the University of Southern California. She worked with Barry Boehm and Chris Abts to develop and calibrate a cost-estimation...2003/02/schorsch.html>. 2. See "Software Engineering: A Practitioner's Approach" by Roger Pressman for a good description of coupling, cohesion
48 CFR 242.503-2 - Post-award conference procedure.
Code of Federal Regulations, 2014 CFR
2014-10-01
... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...
48 CFR 242.503-2 - Post-award conference procedure.
Code of Federal Regulations, 2012 CFR
2012-10-01
... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...
48 CFR 242.503-2 - Post-award conference procedure.
Code of Federal Regulations, 2011 CFR
2011-10-01
... contracts that include the clause at 252.234-7004, Cost and Software Data Reporting, postaward conferences shall include a discussion of the contractor's standard cost and software data reporting (CSDR) process...
Methods for cost estimation in software project management
NASA Astrophysics Data System (ADS)
Briciu, C. V.; Filip, I.; Indries, I. I.
2016-02-01
The speed with which the processes used in the software development field have changed makes forecasting the overall costs for a software project very difficult. Many researchers have considered this task unachievable, but a group of scientists holds that it can be solved using already known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic proposal for a mathematical model of genetic programming is made, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking the software product life cycle as a basis along with the current challenges and innovations in the software development area. Based on the author's experience and the analysis of existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
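A minimal sketch of such a genetic approach is shown below: a chromosome is a pair (a, b) in the effort equation effort = a * KLOC^b, and the fitness is the mean magnitude of relative error on a handful of (size, effort) points. The data points, operators, and fitness choice are assumptions for illustration, not the paper's exact design.

    import random

    # Illustrative (KLOC, person-month) points, standing in for a
    # PROMISE/COCOMO-81 dataset.
    data = [(46.2, 96.0), (31.5, 79.0), (12.8, 26.9), (10.5, 10.3)]

    def fitness(ch):
        """Mean magnitude of relative error of effort = a * KLOC^b
        (lower is better)."""
        a, b = ch
        return sum(abs(a * k ** b - e) / e for k, e in data) / len(data)

    def evolve(pop=40, gens=200, seed=1):
        rng = random.Random(seed)
        chroms = [(rng.uniform(1, 5), rng.uniform(0.8, 1.3))
                  for _ in range(pop)]
        for _ in range(gens):
            chroms.sort(key=fitness)
            parents = chroms[: pop // 2]        # truncation selection
            children = []
            for _ in range(pop - len(parents)):
                (a1, b1), (a2, b2) = rng.sample(parents, 2)
                children.append(((a1 + a2) / 2 + rng.gauss(0, 0.1),
                                 (b1 + b2) / 2 + rng.gauss(0, 0.02)))
            chroms = parents + children         # crossover + mutation
        return min(chroms, key=fitness)

    print(evolve())  # calibrated (a, b)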
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-08-01
An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply-chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Engineering (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as assemble new tools on demand, from existing tools and architecture design repositories.
NASA Technical Reports Server (NTRS)
Shell, Elaine M.; Lue, Yvonne; Chu, Martha I.
1999-01-01
Flight software is a mission-critical element of spacecraft functionality and performance. When ground operations personnel interface to a spacecraft, they are typically dealing almost entirely with the capabilities of onboard software. This software, even more than critical ground/flight communications systems, is expected to perform perfectly during all phases of spacecraft life. Because it can be reprogrammed on-orbit to accommodate degradations or failures in flight hardware, new insights into spacecraft characteristics, new control options which permit enhanced science options, etc., the on-orbit flight software maintenance team is usually significantly responsible for the long-term success of a science mission. Failure of flight software to perform as needed can result in very expensive operations work-around costs and lost science opportunities. There are three basic approaches to maintaining spacecraft software--namely, using the original developers, using the mission operations personnel, or assembling a center of excellence for multi-spacecraft software maintenance. Not planning properly for flight software maintenance can lead to unnecessarily high on-orbit costs and/or unacceptably long delays, or errors, in patch installations. A common approach for flight software maintenance is to access the original development staff. The argument for utilizing the development staff is that the people who developed the software will be the best people to modify it on-orbit. However, it can quickly become a challenge to obtain the services of these key people. They may no longer be available to the organization. They may have a more urgent job to perform, quite likely on another project under different project management. If they haven't worked on the software for a long time, they may need precious time for refamiliarization with the software, testbeds and tools. Further, a lack of insight into issues related to flight software in its on-orbit environment may find the developer unprepared for the challenges. The second approach is to train a member of the flight operations team to maintain the spacecraft software. This can prove to be a costly and inflexible solution. The person assigned to this duty may not have enough work to do during a problem-free period and may have too much to do when a problem arises. If the person is a talented software engineer, he or she may not enjoy the limited software opportunities available in this position, and may eventually leave for newer-technology computer science opportunities. Training replacement flight software personnel can be a difficult and lengthy process. The third approach is to assemble a center of excellence for on-orbit spacecraft software maintenance. Personnel in this specialty center can be managed to support the flight software of multiple missions at once. The variety of challenges among a set of on-orbit missions can result in a dedicated, talented staff which is fully trained and available to support each mission's needs. Such staff are not software developers but are rather spacecraft software systems engineers. The cost to any one mission is extremely low because the software staff works on, and charges to, missions with no current operations issues only minimally; and their professional insight into on-orbit software troubleshooting and maintenance methods ensures low-risk, effective and minimal-cost solutions to on-orbit issues.
Advanced software development workstation project: Engineering scripting language. Graphical editor
NASA Technical Reports Server (NTRS)
1992-01-01
Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.
SNPmplexViewer--toward a cost-effective traceability system
2011-01-01
Background: Beef traceability has become mandatory in many regions of the world and is typically achieved through the use of unique numerical codes on ear tags and animal passports. DNA-based traceability uses the animal's own DNA code to identify it and the products derived from it. Using SNaPshot, a primer-extension-based method, a multiplex of 25 SNPs in a single reaction has been used to reduce the expense of genotyping a panel of SNPs useful for identity control. Findings: To further decrease SNaPshot's cost, we introduced the Perl script SNPmplexViewer, which facilitates the analysis of trace files for reactions performed without fluorescent size standards. SNPmplexViewer automatically aligns reference and target trace electropherograms, run with and without fluorescent size standards, respectively. SNPmplexViewer produces a modified target trace file containing a normalised trace in which the reference size standards are embedded. SNPmplexViewer also outputs aligned images of the two electropherograms together with a difference profile. Conclusions: Modified trace files generated by SNPmplexViewer enable genotyping of SNaPshot reactions performed without fluorescent size standards, using common fragment-sizing software packages. SNPmplexViewer's normalised output may also improve the genotyping software's performance. Thus, SNPmplexViewer is a general free tool enabling the reduction of SNaPshot's cost as well as fast viewing and comparison of trace electropherograms for fragment analysis. SNPmplexViewer is available at http://cowry.agri.huji.ac.il/cgi-bin/SNPmplexViewer.cgi. PMID:21600063
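The alignment step can be illustrated with a generic approach. The sketch below estimates the shift between a reference and a target trace channel by cross-correlation; SNPmplexViewer's actual alignment procedure may differ, and the synthetic peaks are invented for the example.

    import numpy as np

    def align_traces(reference, target):
        """Estimate the lag that best aligns a target electropherogram
        channel with a reference channel via cross-correlation."""
        ref = (reference - reference.mean()) / reference.std()
        tgt = (target - target.mean()) / target.std()
        corr = np.correlate(ref, tgt, mode="full")
        # Negative lag: target features arrive later than the reference.
        return np.argmax(corr) - (len(tgt) - 1)

    t = np.linspace(0, 10, 500)
    ref = np.exp(-((t - 4.0) ** 2) / 0.05)     # synthetic reference peak
    tgt = np.exp(-((t - 4.4) ** 2) / 0.05)     # same peak, shifted later
    print(align_traces(ref, tgt))               # ~ -20 samples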
Antonuzzo, A; Vasile, E; Sbrana, A; Lucchesi, M; Galli, L; Brunetti, I M; Musettini, G; Farnesi, A; Biasco, E; Virgili, N; Falcone, A; Ricci, S
2017-01-01
Supportive care in oncology is a primary need for every oncology department nowadays. In 2012, in our institution, a dedicated supportive care service (SCS) was created in order to deal with any need our on-treatment patients might have (e.g., tumour-related or treatment-related symptoms). We hypothesized that this service had a positive impact on the number of unplanned hospitalizations; to confirm our hypothesis, we decided to review admission data from 2011 and 2012. Using our internal software, we compared admission data in 2011 (the year before the dedicated service was created) and 2012 (the service began in April of that year). We also evaluated the costs of these hospitalizations. Despite an increase in the number of patients treated in our day hospital (+6.5%), the number of unplanned hospital admissions decreased by 3.2% (from 17.3 to 14.1%). The proportion of patients accessing the emergency room went from 66 to 61% (a reduction of 5%). The costs of these hospitalizations were reduced by 2.2%. The introduction of the dedicated SCS in our oncology department produced a net reduction of 3.2% in the number of unplanned hospitalizations of on-treatment cancer patients.
ERIC Educational Resources Information Center
Tan, Andrea; Ferreira, Aldónio
2012-01-01
This study investigates the influence of the use of accounting software in teaching activity-based costing (ABC) on the learning process. It draws upon the Theory of Planned Behaviour and uses the end-user computer satisfaction (EUCS) framework to examine students' satisfaction with the ABC software. The study examines students' satisfaction with…
1982-05-13
Size Of The Software. A favourite measure for software system size is lines of operational code, or deliverable code (operational code plus...regression models, these conversions are either derived from productivity measures using the "cost per instruction" type of equation or they are...appropriate to different development organisations, different project types, different sets of units for measuring e and s, and different items
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III
1991-01-01
Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta vision of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.
Agile Development Methods for Space Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay; Webster, Chris
2012-01-01
Main stream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state of the art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).
Software management tools: Lessons learned from use
NASA Technical Reports Server (NTRS)
Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.
1985-01-01
Experience with inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.
Simplifier: a web tool to eliminate redundant NGS contigs.
Ramos, Rommel Thiago Jucá; Carneiro, Adriana Ribeiro; Azevedo, Vasco; Schneider, Maria Paula; Barh, Debmalya; Silva, Artur
2012-01-01
Modern genomic sequencing technologies produce a large amount of data with reduced cost per base; however, these data consist of short reads. This reduction in the size of the reads, compared to those obtained with previous methodologies, presents new challenges, including a need for efficient algorithms for the assembly of genomes from short reads and for resolving repetitions. Additionally, after ab initio assembly, curation of the hundreds or thousands of contigs generated by assemblers demands considerable time and computational resources. We developed Simplifier, a stand-alone software that selectively eliminates redundant sequences from the collection of contigs generated by ab initio assembly of genomes. Application of Simplifier to data generated by assembly of the genome of Corynebacterium pseudotuberculosis strain 258 reduced the number of contigs generated by ab initio methods from 8,004 to 5,272, a reduction of 34.14%; in addition, N50 increased from 1 kb to 1.5 kb. Processing the contigs of Escherichia coli DH10B with Simplifier reduced the mate-paired library by 17.47% and the fragment library by 23.91%. Simplifier removed redundant sequences from datasets produced by assemblers, thereby reducing the effort required for finalization of genome assembly in tests with data from prokaryotic organisms. Simplifier is available at http://www.genoma.ufpa.br/rramos/softwares/simplifier.xhtml. It requires Sun JDK 6 or higher.
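Two of the quantities above are easy to sketch. The snippet below computes N50 and removes contigs that appear verbatim inside longer ones, a simplified stand-in for Simplifier's redundancy filtering (the real tool's matching criteria are likely more sophisticated).

    def n50(contigs):
        """N50: the length L such that contigs of length >= L cover at
        least half of the total assembly length."""
        lengths = sorted((len(c) for c in contigs), reverse=True)
        half, run = sum(lengths) / 2, 0
        for L in lengths:
            run += L
            if run >= half:
                return L

    def drop_contained(contigs):
        """Remove contigs contained verbatim in a longer (or equal) kept
        contig; exact duplicates are removed as a side effect."""
        kept = []
        for c in sorted(contigs, key=len, reverse=True):
            if not any(c in k for k in kept):
                kept.append(c)
        return kept

    contigs = ["ATCGGATCCA", "GGATCC", "TTACG", "TTACG"]
    print(drop_contained(contigs))        # ['ATCGGATCCA', 'TTACG']
    print(n50(drop_contained(contigs)))   # 10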
Kouzani, Abbas Z; Kale, Rajas P; Zarate-Garza, Pablo Patricio; Berk, Michael; Walder, Ken; Tye, Susannah J
2017-09-01
Deep brain stimulation (DBS) devices deliver electrical pulses to neural tissue through an electrode. To study the mechanisms and therapeutic benefits of deep brain stimulation, murine preclinical research is necessary. However, conducting naturalistic long-term, uninterrupted animal behavioral experiments can be difficult with bench-top systems. The reduction of size, weight, power consumption, and cost of DBS devices can assist the progress of this research in animal studies. A low power, low weight, miniature DBS device is presented in this paper. This device consists of electronic hardware and software components including a low-power microcontroller, an adjustable current source, an n-channel metal-oxide-semiconductor field-effect transistor, a coin-cell battery, electrode wires and a software program to operate the device. Evaluation of the performance of the device in terms of battery lifetime and device functionality through bench and in vivo tests was conducted. The bench test revealed that this device can deliver continuous stimulation current pulses of strength [Formula: see text], width [Formula: see text], and frequency 130 Hz for over 22 days. The in vivo tests demonstrated that chronic stimulation of the nucleus accumbens (NAc) with this device significantly increased psychomotor activity, together with a dramatic reduction in anxiety-like behavior in the elevated zero-maze test.
Templet Web: the use of volunteer computing approach in PaaS-style cloud
NASA Astrophysics Data System (ADS)
Vostokin, Sergei; Artamonov, Yuriy; Tsarev, Daniil
2018-03-01
This article presents the Templet Web cloud service. The service is designed for high-performance scientific computing automation. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of "on-demand" access; (b) source code deployment management; (c) high-performance computing programs development automation. The distinctive feature of the service is the approach mainly used in the field of volunteer computing, when a person who has access to a computer system delegates his access rights to the requesting user. We developed an access procedure, algorithms, and software for utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.
Improving Software Engineering on NASA Projects
NASA Technical Reports Server (NTRS)
Crumbley, Tim; Kelly, John C.
2010-01-01
Software Engineering Initiative: reduces the risk of software failure and increases mission safety; yields more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.
Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio
1997-01-01
In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
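A small stand-in for the classification step is sketched below. C4.5 itself is not in scikit-learn, so the example fits the closely related CART algorithm on invented component measures to predict a high/low rework-cost class; the feature names and data are illustrative, not the GSS library measures.

    from sklearn.tree import DecisionTreeClassifier, export_text

    # Toy component measures: [modules_changed, complexity, fan_out]
    X = [[1, 4, 2], [6, 18, 9], [2, 7, 3],
         [8, 25, 12], [1, 3, 1], [7, 20, 10]]
    y = ["low", "high", "low", "high", "low", "high"]  # rework-cost class

    # CART as an accessible stand-in for the C4.5 tree inducer.
    model = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(export_text(
        model, feature_names=["modules_changed", "complexity", "fan_out"]))
    print(model.predict([[5, 15, 8]]))  # -> likely 'high'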
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Evaluating foodservice software: a suggested approach.
Fowler, K D
1986-09-01
In an era of cost containment, the computer has become a viable management tool. Its use in health care has demonstrated accelerated growth in recent years, and a literature review supports an increased trend in this direction. Foodservice, which is a major cost center, is no exception to this predicted trend. Because software has proliferated, foodservice managers and dietitians are experiencing growing concern about how to evaluate the numerous software packages from which to choose. A suggested approach to evaluating software is offered to dietitians and managers alike to lessen the confusion in software selection and to improve the system satisfaction level post-purchase. Steps of the software evaluatory approach include: delineation of goals, assessment of needs, assignment of value weight factors, development of a vendor checklist, survey of vendors by means of the vendor checklist and elimination of inappropriate systems, thorough development of the request for proposal (RFP) for submission to the selected vendors, an analysis of the returned RFPs in terms of system features and cost factors, and selection of the system(s) for implementation.
Why and how Mastering an Incremental and Iterative Software Development Process
NASA Astrophysics Data System (ADS)
Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe
2004-06-01
One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process that must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are going to be presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages: (1) it permits systematic management and incorporation of requirements changes over the development cycle at minimal cost; as far as possible, the most dimensioning requirements are analyzed and developed first, validating the architecture concept very early without the details; (2) a software prototype is available very quickly, which improves communication between system and software teams, as it enables a very early and efficient check of their common understanding of the system requirements; (3) it allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...); this is particularly important when the team is new or when the environment has changed since the previous development, and in any case it greatly improves the learning curve of the software team. These advantages seem very attractive, but mastering an iterative development process efficiently is not so easy and raises many difficulties, such as: how to freeze one configuration of the system definition as a development baseline while most of the system requirements are completely and naturally unstable; how to distinguish stable/unstable and dimensioning/standard requirements; how to plan the development of each increment; and how to link classical waterfall development milestones with an iterative approach (when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? etc.). Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and a technological point of view: how the MELANIE EADS ST internal methodology improves concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way; how the CMM approach can help by better formalizing Requirements Management and Planning processes; and how Automatic Code Generation with "certified" tools (SCADE) can further dramatically shorten the development cycle. The presentation will conclude with an evaluation of the cost and planning reduction, based on a pilot application, comparing figures from two similar projects: one using the classical waterfall process, the other an iterative and incremental approach.
RT-Syn: A real-time software system generator
NASA Technical Reports Server (NTRS)
Setliff, Dorothy E.
1992-01-01
This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs not only by reducing development time, but more importantly facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which are currently dominating life cycle costs.
Efficient experimental design for uncertainty reduction in gene regulatory networks.
Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R
2015-01-01
An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
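As a rough illustration of the MOCU calculation described in this abstract, the sketch below computes the MOCU of a small, invented uncertainty class and the expected remaining MOCU of a hypothetical experiment. The cost matrix, probabilities and outcome partition are assumptions for demonstration, not data from the paper.

```python
import numpy as np

def mocu(cost, probs):
    """MOCU = expected cost of the robust intervention minus the expected
    cost of the (unknowable) model-specific optimum.
    cost[i, m]: cost of intervention i applied to candidate model m."""
    expected = cost @ probs                     # expected cost of each intervention
    robust = expected.min()                     # best intervention on average
    oracle = (cost.min(axis=0) * probs).sum()   # per-model optimum, averaged
    return robust - oracle

def expected_remaining_mocu(cost, probs, partition):
    """Expected MOCU after an experiment whose outcomes split the
    candidate models into the index groups listed in `partition`."""
    total = 0.0
    for group in partition:
        p_out = probs[group].sum()              # probability of this outcome
        if p_out == 0:
            continue
        post = probs[group] / p_out             # posterior over surviving models
        total += p_out * mocu(cost[:, group], post)
    return total

# toy uncertainty class: 4 candidate networks, 3 candidate interventions
cost = np.array([[1.0, 4.0, 2.0, 5.0],
                 [3.0, 1.0, 4.0, 2.0],
                 [2.0, 3.0, 1.0, 3.0]])
probs = np.array([0.4, 0.3, 0.2, 0.1])
# hypothetical experiment distinguishing models {0,1} from {2,3}
print(mocu(cost, probs), expected_remaining_mocu(cost, probs, [[0, 1], [2, 3]]))
```

The experiment whose expected remaining MOCU is lowest is the one the method would schedule first.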
NASA Astrophysics Data System (ADS)
Johansson, Björn
During recent years great attention has been paid to outsourcing, as well as to the reverse, insourcing (Dibbern et al., 2004). There has been a strong focus on how the management of software applications and information and communication technology (ICT), expressed as ICT management versus ICT governance, should be carried out (Grembergen, 2004). The maintenance and operation of software applications and ICT consume a large share of the resources spent on ICT in organizations today (Bearingpoint, 2004), and managers are asked to increase the business benefits of these investments (Weill & Ross, 2004). That is, they are asked to improve the usage of ICT and to develop new business-critical solutions supported by ICT. It also means that investments in ICT and software applications need to be shown to be worthwhile. Basically, there are two considerations to take into account with ICT usage: cost reduction and improving business value. How the governance and management of ICT and software applications are organized is important, which means that improved control of maintenance and operation may be of interest to executives. ICT usage depends on how it is organized. So, if increased ICT governance amounts to having well-organized ICT resources, could this be seen as the first step organizations take when striving for external provision of ICT? This question is dealt with to some degree in this paper.
[Software for illustrating a cost-quality balance carried out by clinical laboratory practice].
Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi
2010-09-01
We have no proper reference indicating the quality of clinical laboratory practice, one that clearly illustrates that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline for evaluating good laboratory practice. Following the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to take the characteristics of a clinical laboratory into account. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. To stimulate discussion, the working software will be put on the Society's homepage for trial.
JPSS Science Data Services for the Direct Readout Community
NASA Technical Reports Server (NTRS)
Chander, Gyanesh; Lutz, Bob
2014-01-01
The Suomi National Polar-orbiting Partnership (S-NPP) and Joint Polar Satellite System (JPSS) High Rate Data (HRD) link provides Direct Broadcast data to users in real time, utilizing their own remote field terminals. The Field Terminal Support (FTS) element provides the resources the Direct Readout community needs, namely software, documentation, and periodic updates, to produce data products from S-NPP and JPSS. The FTS distribution server will also provide the ancillary and auxiliary data needed for processing the broadcasts, as well as orbital data to assist in locating the satellites of interest. In addition, FTS provides development support for the algorithms and software through the GSFC Direct Readout Laboratory (DRL) International Polar Orbiter Processing Package (IPOPP) and the University of Wisconsin (UWISC) Community Satellite Processing Package (CSPP), enabling users to integrate the algorithms into their remote terminals. The support the JPSS Program provides to the institutions developing and maintaining these two software packages demonstrates the ability to produce ready-to-use products from the HRD link and provides a risk reduction effort at minimal cost. This paper discusses the key functions and system architecture of FTS.
PREMER: a Tool to Infer Biological Networks.
Villaverde, Alejandro F; Becker, Kolja; Banga, Julio R
2017-10-04
Inferring the structure of unknown cellular networks is a main challenge in computational biology. Data-driven approaches based on information theory can determine the existence of interactions among network nodes automatically. However, the elucidation of certain features - such as distinguishing between direct and indirect interactions or determining the direction of a causal link - requires estimating information-theoretic quantities in a multidimensional space. This can be a computationally demanding task, which acts as a bottleneck for the application of elaborate algorithms to large-scale network inference problems. The computational cost of such calculations can be alleviated by the use of compiled programs and parallelization. To this end we have developed PREMER (Parallel Reverse Engineering with Mutual information & Entropy Reduction), a software toolbox that can run in parallel and sequential environments. It uses information theoretic criteria to recover network topology and determine the strength and causality of interactions, and allows incorporating prior knowledge, imputing missing data, and correcting outliers. PREMER is a free, open source software tool that does not require any commercial software. Its core algorithms are programmed in FORTRAN 90 and implement OpenMP directives. It has user interfaces in Python and MATLAB/Octave, and runs on Windows, Linux and OSX (https://sites.google.com/site/premertoolbox/).
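As a loose illustration of the information-theoretic core of tools like PREMER (a minimal sketch only; PREMER itself is FORTRAN 90/OpenMP and adds entropy reduction for direction inference, missing-data imputation and outlier correction), the following fragment estimates pairwise mutual information from expression profiles and keeps the strongest pairs as candidate edges. The bin count, threshold and synthetic data are assumptions for demonstration.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y, bins=8):
    """Plug-in MI estimate (in nats) from a 2D histogram of two profiles."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                               # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def infer_edges(data, threshold=0.2):
    """Return gene pairs whose MI exceeds a threshold.
    data: (n_genes, n_samples) expression matrix."""
    edges = []
    for i, j in combinations(range(data.shape[0]), 2):
        mi = mutual_information(data[i], data[j])
        if mi > threshold:
            edges.append((i, j, mi))
    return edges

rng = np.random.default_rng(0)
g0 = rng.normal(size=500)
data = np.vstack([g0, g0 + 0.3 * rng.normal(size=500), rng.normal(size=500)])
print(infer_edges(data))   # gene 1 tracks gene 0, so edge (0, 1) survives
```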
Human performance cognitive-behavioral modeling: a benefit for occupational safety.
Gore, Brian F
2002-01-01
Human Performance Modeling (HPM) is a computer-aided job analysis software methodology used to generate predictions of complex human-automation integration and system flow patterns with the goal of improving operator and system safety. The use of HPM tools has recently been increasing due to reductions in computational cost, augmentations in the tools' fidelity, and usefulness in the generated output. An examination of an Air Man-machine Integration Design and Analysis System (Air MIDAS) model evaluating complex human-automation integration currently underway at NASA Ames Research Center will highlight the importance to occupational safety of considering both cognitive and physical aspects of performance when researching human error.
Babcock, Hazen P
2018-01-29
This work explores the use of industrial grade CMOS cameras for single molecule localization microscopy (SMLM). We show that industrial grade CMOS cameras approach the performance of scientific grade CMOS cameras at a fraction of the cost. This makes it more economically feasible to construct high-performance imaging systems with multiple cameras that are capable of a diversity of applications. In particular we demonstrate the use of industrial CMOS cameras for biplane, multiplane and spectrally resolved SMLM. We also provide open-source software for simultaneous control of multiple CMOS cameras and for the reduction of the movies that are acquired to super-resolution images.
Industrial applications of automated X-ray inspection
NASA Astrophysics Data System (ADS)
Shashishekhar, N.
2015-03-01
Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.
Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat
2013-01-01
Systems with high-dimensional input spaces require long processing times and large memory usage. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are eliminated by the feature reduction software developed here, which uses a new modified selection mechanism that adds solution candidates from the middle region. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm based soft computing methods, locking onto local solutions is also a problem, which the developed software eliminates. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attribute sets) with seven, six, and five elements. The results show that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
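To make the selection and recombination steps concrete, here is a minimal genetic-algorithm skeleton for attribute (reduct) selection with roulette wheel selection over bit masks. It is an illustration only, not the authors' implementation: the fitness function, its size penalty, and the one-point stand-in for the paper's linear order crossover operator are all invented for demonstration.

```python
import random

def fitness(mask, data_quality):
    # hypothetical score: reward classification quality, penalize reduct size
    return data_quality(mask) - 0.05 * sum(mask)

def roulette_select(population, scores):
    """Classic roulette-wheel (fitness-proportionate) selection."""
    low = min(scores)
    weights = [s - low + 1e-9 for s in scores]   # shift so all weights > 0
    return random.choices(population, weights=weights, k=len(population))

def crossover(a, b):
    """One-point recombination over bit masks (a simple stand-in for the
    paper's linear order crossover)."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

n_attrs = 12   # twelve candidate input attributes, as in the urological data
quality = lambda m: sum(m[i] for i in (0, 2, 5, 7)) / 4.0   # hypothetical
pop = [[random.randint(0, 1) for _ in range(n_attrs)] for _ in range(20)]
for _ in range(50):
    scores = [fitness(ind, quality) for ind in pop]
    parents = roulette_select(pop, scores)
    pop = [child for i in range(0, 20, 2)
           for child in crossover(parents[i], parents[i + 1])]
best = max(pop, key=lambda ind: fitness(ind, quality))
print(best, sum(best))   # small reduct retaining the informative attributes
```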
The Architecture and Application of RAMSES, a CCSDS and ECSS PUS Compliant Test and Control System
NASA Astrophysics Data System (ADS)
Battelino, Milan; Svard, Christian; Carlsson, Anna; Carlstedt-Duke, Theresa; Tornqvist, Marcus
2010-08-01
SSC, Swedish Space Corporation, has more than 30 years of experience in developing test and control systems for sounding rockets, experimental test modules and satellites. The increasing number of ongoing projects led SSC to consider developing a test and control system conformant to CCSDS (Consultative Committee for Space Data Systems) and ECSS (European Cooperation for Space Standardization) that could be reused across separate projects and products with little effort and cost. The foreseen reduction in cost and development time for future space-related projects made such a reusable control system desirable. This paper describes the ideas behind the RAMSES (Rocket and Multi-Satellite EMCS Software) system, its architecture, and how it has been and is being used in a variety of applications at SSC, such as the multi-satellite mission PRISMA and the sounding rocket project MAXUS-8.
Operations analysis (study 2.1). Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Wolfe, R. R.
1975-01-01
Subjects related to future STS operations concepts were investigated. The majority of the effort was directed at assessing the benefits of automated space servicing concepts as related to improvements in payload procurement and shuttle utilization. Another subject concerned shuttle upper stage software development and recurring costs relative to total program projections. Space servicing of automated payloads is addressed by examining the broad spectrum of payload applications, with the belief that shared logistic operations will be a major contributor to the reduction of future program costs. However, certain requirements for support of payload operations, such as availability of the payload, may place demands upon the shuttle fleet. Because future projections of the NASA Mission Model are only representative of the payload traffic, it is important to recognize that it is the general character of operations that is significant rather than service to any single payload program.
An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reid, Michael R.; Powers, Edward I. (Technical Monitor)
2000-01-01
The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.
NASA Technical Reports Server (NTRS)
Faust, N.; Jordon, L.
1981-01-01
Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis has moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs to process LANDSAT data for use as one element in a geographic data base were added, along with programs for training field selection, supervised and unsupervised classification, and image enhancement. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.
A review of video security training and assessment-systems and their applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cellucci, J.; Hall, R.J.
1991-01-01
This paper reports that during the last 10 years computer-aided video data collection and playback systems have been used as nuclear facility security training and assessment tools with varying degrees of success. These mobile systems have been used by trained security personnel for response force training, vulnerability assessment, force-on-force exercises and crisis management. Typically, synchronous recordings from multiple video cameras, communications audio, and digital sensor inputs are played back to the exercise participants and then edited for training and briefing. Factors that have influenced user acceptance include: frequency of use, the demands placed on security personnel, fear of punishment, user training requirements and equipment cost. The introduction of S-VHS video and new software for scenario planning, video editing and data reduction should bring about a wider range of security applications and supply the opportunity for significant cost sharing with other user groups.
Low Cost Ways to Keep Software Current.
ERIC Educational Resources Information Center
Schultheis, Robert A.
1992-01-01
Discusses strategies for providing students with current computer software technology including acquiring previous versions of software, obtaining demonstration software, using student versions, getting examination software, buying from mail order firms, buying few copies, exploring site licenses, acquiring shareware or freeware, and applying for…
Textbook Software versus Professional Software: Which Is Better for Instructional Purposes?
ERIC Educational Resources Information Center
Snell, Meggan; Yatsenko, Olga
2002-01-01
Compares textbook software with professional packages such as Peachtree for teaching accounting, in terms of cost, availability, ease of teaching and learning, and applicability. Makes suggestions for choosing accounting software. (SK)
Users guide for STHARVEST: software to estimate the cost of harvesting small timber.
Roger D. Fight; Xiaoshan Zhang; Bruce R. Hartsough
2003-01-01
The STHARVEST computer application is Windows-based, public-domain software used to estimate costs for harvesting small-diameter stands or the small-diameter component of a mixed-sized stand. The equipment production rates were developed from existing studies. Equipment operating cost rates were based on November 1998 prices for new equipment and wage rates for the...
Dennis R. Becker; Debra Larson; Eini C. Lowell; Robert B. Rummer
2008-01-01
The HCR (Harvest Cost-Revenue) Estimator is engineering and financial analysis software used to evaluate stand-level financial thresholds for harvesting small-diameter ponderosa pine (Pinus ponderosa Dougl. ex Laws.) in the Southwest United States. The Windows-based program helps contractors and planners to identify costs associated with tree...
DOT National Transportation Integrated Search
1976-09-01
Software used for the reduction and analysis of the multipath prober, modem evaluation (voice, digital data, and ranging), and antenna evaluation data acquired during the ATS-6 field test program is described. Multipath algorithms include reformattin...
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1985-01-01
The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.
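For readers unfamiliar with the fixed-interface method mentioned above, the sketch below shows the core reduction step (often called Craig-Bampton) on a toy stiffness/mass pair: static constraint modes plus a truncated set of clamped-interface normal modes form the reduction basis. This is the generic textbook formulation, assuming DOFs ordered interior-first; it is not the paper's software.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, n_interior, n_modes):
    """Fixed-interface reduction of one substructure whose DOFs are ordered
    [interior, boundary]. Keeps all boundary DOFs plus n_modes
    fixed-interface normal modes."""
    ni = n_interior
    Kii, Kib = K[:ni, :ni], K[:ni, ni:]
    Mii = M[:ni, :ni]
    psi = -np.linalg.solve(Kii, Kib)      # static constraint modes
    w2, phi = eigh(Kii, Mii)              # modes with the interface clamped
    phi = phi[:, :n_modes]                # truncate to the lowest n_modes
    nb = K.shape[0] - ni
    T = np.block([[phi, psi],
                  [np.zeros((nb, n_modes)), np.eye(nb)]])
    return T.T @ K @ T, T.T @ M @ T, T    # reduced K, reduced M, basis

# 3-DOF spring-mass chain: 2 interior DOFs, 1 boundary DOF, k = m = 1
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
M = np.eye(3)
Kr, Mr, T = craig_bampton(K, M, n_interior=2, n_modes=1)
print(Kr.shape)   # (2, 2): one modal DOF plus one boundary DOF
```

Because the boundary DOFs are retained physically, reduced substructures can be assembled with each other or with unreduced ones, which is what makes multilevel substructuring practical.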
Vulnerability in Determining the Cost of Information System Project to Avoid Loses
NASA Astrophysics Data System (ADS)
Haryono, Kholid; Ikhsani, Zulfa Amalia
2018-03-01
Context: This study discusses the priority of the costs required in software development projects. Objectives: To show the costing models, the variables involved, and how practitioners assess and decide the priority of each variable. To strengthen the information, the risk of ignoring each variable was also confirmed. Method: Two approaches were used. First, a systematic literature review to find the models and variables used to decide the cost of software development. Second, confirming and collecting judgments from software developers about the importance and risk of each variable. Result: About 54 variables appearing in the 10 models discussed were obtained. The variables were categorized into 15 groups based on similarity of meaning, and each group becomes a variable. Confirmation with practitioners on importance and risk showed that two variables are considered very important and high risk if ignored: duration and effort. Conclusion: The relationship between the variables found in the literature and the practitioners' confirmation helps software business actors consider project cost variables.
Arenas, Meritxell; Sabater, Sebastià; Sintas, Andreu; Arguís, Monica; Hernández, Víctor; Árquez, Miguel; López, Iolanda; Rovirosa, Àngeles; Puig, Doménec
2017-06-01
Skin cancer is the most common tumor in the population, and several therapeutic modalities exist. Brachytherapy is one of the techniques used, and for some patients it requires building customized moulds. Currently, these moulds are made by hand using rudimentary techniques. We present a new procedure based on 3D printing, together with an analysis of the clinical workflow. Moulds can be made either by hand or by automated 3D printing. For handmade moulds, an alginate negative of the patient is created and, from that, a gypsum cast; the customized mould is then made by hand from the patient's negative template. The new process is based on 3D printing: the first step is a 3D scan of the patient's surface, after which 3D modelling software is used to obtain an accurate anatomical reconstruction of the treatment area. We present the clinical workflow using 3D scanning and printing technology, comparing its costs with the usual custom handmade mould protocol. The time spent on the new process is 6.25 hours, in contrast to 9.5 hours for the conventional process, a 34% reduction in the time required to create a mould for brachytherapy treatment. The labor cost of the conventional process is 211.5 versus 152.5 hours, a reduction of 59 hours. There is also a 49.5% reduction in financial costs, mostly due to eliminating the computed tomography (CT) scan of the gypsum and the mould. 3D scanning and printing offers financial benefits and reduces the clinical workload. As the present project demonstrates, applying 3D printing technologies reduces the costs and the time spent in the clinical workflow of brachytherapy treatment. Overall, 3D printing is a promising technique for brachytherapy that might be well received in the community.
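A quick check of the headline figures (a trivial sketch; the labor-cost units are left exactly as reported in the abstract):

```python
conventional_h, printed_h = 9.5, 6.25          # total process time, hours
saving = (conventional_h - printed_h) / conventional_h
print(f"time saved: {saving:.0%}")             # ~34%, as reported

labor_conventional, labor_new = 211.5, 152.5   # units as given in the abstract
print(f"labor reduction: {labor_conventional - labor_new}")   # 59.0
```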
Investigation of Wind Turbine Rotor Concepts for Offshore Wind Farms
NASA Astrophysics Data System (ADS)
Ceyhan, Özlem; Grasso, Francesco
2014-06-01
Current plans for offshore wind energy development call for further reductions in the cost of energy. In order to contribute to this goal, several wind turbine rotor concepts have been investigated. Assuming that future offshore wind turbines will operate only in offshore wind farms, the rotor concepts are evaluated not only for their stand-alone performance and their potential for reducing loads, but also for their performance in an offshore wind farm. To do so, the 10 MW reference wind turbine designed in the Innwind.EU project is chosen as the baseline. Several rotor parameters have been modified and their influence investigated for offshore wind turbine design purposes. This investigation is carried out as a conceptual parametric study. All concepts are evaluated numerically with the BOT (Blade Optimisation Tool) software at wind turbine level and with the Farmflow software at wind farm level for two wind farm layouts. Finally, all concepts are compared with each other in terms of their advantages and disadvantages.
The HEAO experience - design through operations
NASA Technical Reports Server (NTRS)
Hoffman, D. P.
1983-01-01
The design process and performance of the NASA High Energy Astronomy Observatories (HEAO-1, 2, and 3) are surveyed from the initiation of the program in 1968 through the end of HEAO-3 operation in May 1981, with a focus on the attitude control and determination subsystem (ACDS). The science objectives, original and revised overall design concepts, final design for each spacecraft, and details of the ACDS designs are discussed, and the stages of the ACDS design process, including redefinition to achieve a 50 percent cost reduction, detailed design of common and mission-unique hardware and software, unit qualification, subsystem integration, and observatory-level testing, are described. Overall and ACDS performance is evaluated for each mission and found to meet or exceed design requirements despite some difficulties arising from errors in star tracker/ACDS interface coordination and from gyroscope failures. These difficulties were resolved by using the flexibility of the software design. The implications of the HEAO experience for the design process of future spacecraft are suggested.
PDEMOD: Software for control/structures optimization
NASA Technical Reports Server (NTRS)
Taylor, Lawrence W., Jr.; Zimmerman, David
1991-01-01
Because of the possibility of adverse interaction between the control system and the structural dynamics of large, flexible spacecraft, great care must be taken to ensure stability and system performance. Because of the high cost of insertion of mass into low earth orbit, it is prudent to optimize the roles of structure and control systems simultaneously. Because of the difficulty and the computational burden in modeling and analyzing the control structure system dynamics, the total problem is often split and treated iteratively. It would aid design if the control structure system dynamics could be represented in a single system of equations. With the use of the software PDEMOD (Partial Differential Equation Model), it is now possible to optimize structure and control systems simultaneously. The distributed parameter modeling approach enables embedding the control system dynamics into the same equations as the structural dynamics model. By doing this, the current difficulties involved in model order reduction are avoided. The NASA Mini-MAST truss is used as an example for studying integrated control structure design.
Instrumentation: Software-Driven Instrumentation: The New Wave.
ERIC Educational Resources Information Center
Salit, M. L.; Parsons, M. L.
1985-01-01
Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…
Pointo - a Low Cost Solution to Point Cloud Processing
NASA Astrophysics Data System (ADS)
Houshiar, H.; Winkler, S.
2017-11-01
With advances in technology, access to data, especially 3D point cloud data, becomes more and more an everyday task. 3D point clouds are usually captured with very expensive tools such as 3D laser scanners, or with very time-consuming methods such as photogrammetry. Most of the available software for 3D point cloud processing is designed for experts and specialists in this field and usually comes as large packages containing a variety of methods and tools. The result is software that is expensive to acquire and difficult to use, the difficulty being caused by the complicated user interfaces required to accommodate long feature lists. These complex packages aim to provide a powerful tool for a specific group of specialists, but most of their features are not required by the majority of upcoming average users of point clouds. In addition to their complexity and high cost, they generally rely on expensive, modern hardware and are each compatible with only one specific operating system. Many point cloud customers are not point cloud processing experts and are not willing to bear the high acquisition costs of such software and hardware. In this paper we introduce a solution for low-cost point cloud processing designed to accommodate the needs of the average point cloud user. To reduce cost and complexity, our approach focuses on one functionality at a time, in contrast with most available software and tools that aim to solve as many problems as possible at the same time. This simple, user-oriented design improves the user experience and lets us optimize our methods to create efficient software. We introduce the Pointo family as a series of connected programs providing easy-to-use tools with a simple design for different point cloud processing requirements. PointoVIEWER and PointoCAD are introduced as the first components of the Pointo family, providing fast and efficient visualization with the ability to add annotation and documentation to the point clouds.
Cost Estimating Cases: Educational Tools for Cost Analysts
1993-09-01
only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that... case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost estimating relationships (CERs) embedded in the software.
[Cost analysis of intraoperative neurophysiological monitoring (IOM)].
Kombos, T; Suess, O; Brock, M
2002-01-01
A number of studies demonstrate that a significant reduction of postoperative neurological deficits can be achieved by applying intraoperative neurophysiological monitoring (IOM) methods. A cost analysis of IOM is imperative considering the strained financial situation in the public health services. The calculation model presented here comprises two cost components: material and personnel. The material costs comprise consumer goods and depreciation of capital goods. The computation base was 200 IOM cases per year. Consumer goods were calculated for each IOM procedure respectively. The following constellation served as a basis for calculating personnel costs: (a) a medical technician (salary level BAT Vc) for one hour per case; (b) a resident (BAT IIa) for the entire duration of the measurement; and (c) a senior resident (BAT Ia) only for supervision. An IOM device consisting of an 8-channel preamplifier, an electrical and acoustic stimulator and special software costs 66,467 euros on average. With an annual depreciation of 20%, the costs are 13,293 euros per year, which amounts to 66.46 euros per case for the capital goods. For reusable materials, a sum of 0.75 euros per case was calculated. Disposable materials were calculated for each procedure respectively. Total costs of 228.02 euros per case were calculated for surgery on the peripheral nervous system. They amount to 196.40 euros per case for spinal interventions and to 347.63 euros per case for more complex spinal operations. Operations in the cerebellopontine angle and brain stem cost 376.63 euros and 397.33 euros per case respectively. IOM costs amount to 328.03 euros per case for surgical management of an intracranial aneurysm and to 537.15 euros per case for functional interventions. Expenses run up to 833.63 euros per case for operations near the motor cortex and to 117.65 euros per case for intraoperative speech monitoring. Costs for inpatient medical rehabilitation have increased considerably in recent years. In view of the financial situation, it is necessary to reduce postoperative morbidity and the costs it involves. IOM leads to a reduction of morbidity. The costs for IOM calculated here justify its routine application in view of the legal and socioeconomic consequences of surgery-related neurological deficits.
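The capital-cost-per-case figure follows directly from the numbers in the abstract (a trivial check; straight-line depreciation is assumed):

```python
device_cost_eur = 66_467
annual_depreciation = device_cost_eur * 0.20   # 20% per year -> 13,293.4 EUR
cases_per_year = 200
capital_per_case = annual_depreciation / cases_per_year
print(f"{capital_per_case:.2f} EUR per case")  # 66.47; the abstract rounds to 66.46
```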
Inclusion of LCCA in Alaska flexible pavement design software manual.
DOT National Transportation Integrated Search
2012-10-01
Life cycle cost analysis is a key part for selecting materials and techniques that optimize the service life of a pavement in terms of cost and performance. While the Alaska : Flexible Pavement Design software has been in use since 2004, there is no ...
The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments
NASA Technical Reports Server (NTRS)
Budney, T. J.; Stone, R. W.
1982-01-01
Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.
CellProfiler and KNIME: open source tools for high content screening.
Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc
2013-01-01
High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. One of the limitations in the establishment of HCS in academia is flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining that provide software solutions which increase flexibility and keep costs low.
The STARLINK software collection
NASA Astrophysics Data System (ADS)
Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.
1993-12-01
A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.
Software Piracy, Ethics, and the Academician.
ERIC Educational Resources Information Center
Bassler, Richard A.
The numerous software programs available for easy, low-cost copying raise ethical questions. The problem can be examined from the viewpoints of software users, teachers, authors, vendors, and distributors. Software users might hesitate to purchase or use software which prevents the making of back-up copies for program protection. Teachers in…
Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems
NASA Technical Reports Server (NTRS)
Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.
2011-01-01
The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.
Software Defined Radios - Architectures, Systems and Functions
NASA Technical Reports Server (NTRS)
Sims, William H.
2017-01-01
Software Defined Radio is an industry term describing a method of utilizing a minimum amount of Radio Frequency (RF)/analog electronics before digitization takes place. Upon digitization all other functions are performed in software/firmware. There are as many different types of SDRs as there are data systems. Software Defined Radio (SDR) technology has been proven in the commercial sector since the early 90's. Today's rapid advancement in mobile telephone reliability and power management capabilities exemplifies the effectiveness of the SDR technology for the modern communications market. In contrast the foundations of transponder technology presently qualified for satellite applications were developed during the early space program of the 1960's. SDR technology offers potential to revolutionize satellite transponder technology by increasing science data through-put capability by at least an order of magnitude. While the SDR is adaptive in nature and is "One-size-fits-all" by design, conventional transponders are built to a specific platform and must be redesigned for every new bus. The SDR uses a minimum amount of analog/Radio Frequency components to up/down-convert the RF signal to/from a digital format. Once analog data is digitized, all processing is performed using hardware logic. Typical SDR processes include; filtering, modulation, up/down converting and demodulation. This presentation will show how the emerging SDR market has leveraged the existing commercial sector to provide a path to a radiation tolerant SDR transponder. These innovations will reduce the cost of transceivers, a decrease in power requirements and a commensurate reduction in volume. A second pay-off is the increased flexibility of the SDR by allowing the same hardware to implement multiple transponder types by altering hardware logic - no change of analog hardware is required - all of which can be ultimately accomplished in orbit. This in turn would provide high capability and low cost transponder to programs of all sizes.
Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piette, Mary A.; Schetrit, Oren; Kiliccote, Sila
During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion of residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs; it does not cover the full costs of owning and operating DR automation systems, and future analysis of those costs should include the costs borne by the building or facility manager as well as by the utility or third-party program manager.
ERIC Educational Resources Information Center
Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.
2014-01-01
As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…
Software cost/resource modeling: Software quality tradeoff measurement
NASA Technical Reports Server (NTRS)
Lawler, R. W.
1980-01-01
A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.
Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?
NASA Technical Reports Server (NTRS)
Lum, Karen; Hihn, Jairus; Menzies, Tim
2006-01-01
While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and by including far more effort multipliers than the data supports. Building optimal models requires that a wider range of models be considered while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem that is a leading cause of cost model brittleness or instability.
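To make the calibration-and-validation point concrete, here is a minimal sketch of the standard regression-based approach the paper critiques: a two-parameter effort model fitted to local data in log space, with leave-one-out error as a simple guard against the large-variance problem. The data points and the model form effort = a * KSLOC^b are illustrative assumptions, not the paper's data or its full method.

```python
import numpy as np

def calibrate(ksloc, effort):
    """Fit effort = a * KSLOC**b by ordinary least squares in log space."""
    X = np.column_stack([np.ones_like(ksloc), np.log(ksloc)])
    coef, *_ = np.linalg.lstsq(X, np.log(effort), rcond=None)
    return np.exp(coef[0]), coef[1]            # a, b

def loo_errors(ksloc, effort):
    """Leave-one-out magnitude of relative error: refit without each point
    and predict it, exposing how brittle the tuned model really is."""
    errs = []
    for i in range(len(ksloc)):
        mask = np.arange(len(ksloc)) != i
        a, b = calibrate(ksloc[mask], effort[mask])
        errs.append(abs(a * ksloc[i] ** b - effort[i]) / effort[i])
    return np.array(errs)

# invented local history: size in KSLOC, effort in person-months
ksloc = np.array([12.0, 33.0, 7.5, 120.0, 48.0, 21.0])
effort = np.array([50.0, 130.0, 28.0, 560.0, 210.0, 95.0])
a, b = calibrate(ksloc, effort)
print(a, b, loo_errors(ksloc, effort).mean())
```

With only a handful of high-variance records, the leave-one-out errors are typically far worse than the in-sample fit suggests, which is exactly the brittleness the paper documents.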
NASA Astrophysics Data System (ADS)
Kyriacou, S.; Kontoleontos, E.; Weissenberger, S.; Mangani, L.; Casartelli, E.; Skouteropoulou, I.; Gattringer, M.; Gehrer, A.; Buchmayr, M.
2014-03-01
An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
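The core MAEA idea, using a cheap surrogate to screen offspring so that only the most promising ones reach the expensive CFD evaluation, can be sketched in a few lines. Everything here is a stand-in: a quadratic toy objective instead of the Navier-Stokes solver, a k-nearest-neighbour predictor instead of EASY's metamodels, and no PCA step.

```python
import numpy as np

def expensive_eval(x):
    """Stand-in for one CFD run (minimization objective)."""
    return ((x - 0.3) ** 2).sum()

def surrogate(archive_X, archive_y, x, k=3):
    """Inverse-distance-weighted k-NN prediction from past exact runs."""
    d = np.linalg.norm(archive_X - x, axis=1) + 1e-12
    idx = np.argsort(d)[:k]
    w = 1.0 / d[idx]
    return (w * archive_y[idx]).sum() / w.sum()

rng = np.random.default_rng(1)
dim, pop_size, n_exact = 4, 20, 5
X = rng.random((pop_size, dim))
y = np.array([expensive_eval(x) for x in X])
for _ in range(10):
    # mutate random parents, then let the surrogate pre-screen the offspring
    children = np.clip(X[rng.integers(0, pop_size, pop_size)]
                       + 0.1 * rng.normal(size=(pop_size, dim)), 0, 1)
    guesses = np.array([surrogate(X, y, c) for c in children])
    best = children[np.argsort(guesses)[:n_exact]]   # only these get "CFD"
    X = np.vstack([X, best])
    y = np.append(y, [expensive_eval(b) for b in best])
print(y.min())
```

The saving comes from the ratio n_exact/pop_size: most candidates are judged by the metamodel alone, so the number of exact solver calls per generation drops sharply.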
Mental Models of Software Forecasting
NASA Technical Reports Server (NTRS)
Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.
1993-01-01
The majority of software engineers resist the use of the currently available cost models. One problem is that the mathematical and statistical models that are currently available do not correspond with the mental models of the software engineers. In an earlier JPL funded study (Hihn and Habib-agahi, 1991) it was found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs hide any analogy in the regression equations. In addition, the currently available models depend upon information which is not available during early planning, when the most important forecasts must be made.
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, Charlie; Crook, Jerry
1997-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state of the art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.
Projecting manpower to attain quality
NASA Technical Reports Server (NTRS)
Rone, K. Y.
1983-01-01
In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of the manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes a necessity to project the development manpower in a way that attains that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an ongoing software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data to allow the model to be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.
Fault Tolerant Considerations and Methods for Guidance and Control Systems
1987-07-01
multifunction devices such as microprocessors with software. In striving toward the economic goal, however, a cost is incurred in a different coin, i.e. ...therefore been developed which reduces the software risk to acceptable proportions. Several of the techniques thus developed incur no significant cost ...complex that their design and implementation need computerized tools in order to be cost-effective (in a broad sense, including the capability of
Optimisation of MSW collection routes for minimum fuel consumption using 3D GIS modelling.
Tavares, G; Zsigraiova, Z; Semiao, V; Carvalho, M G
2009-03-01
Collection of municipal solid waste (MSW) may account for more than 70% of the total waste management budget, most of which is for fuel costs. It is therefore crucial to optimise the routing network used for waste collection and transportation. This paper proposes the use of geographical information systems (GIS) 3D route modelling software for waste collection and transportation, which adds one more degree of freedom to the system and allows driving routes to be optimised for minimum fuel consumption. The model takes into account the effects of road inclination and vehicle weight. It is applied to two different cases: routing waste collection vehicles in the city of Praia, the capital of Cape Verde, and routing the transport of waste from different municipalities of Santiago Island to an incineration plant. For the Praia city region, the 3D model that minimised fuel consumption yielded cost savings of 8% as compared with an approach that simply calculated the shortest 3D route. Remarkably, this was true despite the fact that the GIS-recommended fuel reduction route was actually 1.8% longer than the shortest possible travel distance. For the Santiago Island case, the difference was even more significant: a 12% fuel reduction for a similar total travel distance. These figures indicate the importance of considering both the relief of the terrain and fuel consumption in selecting a suitable cost function to optimise vehicle routing.
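The authors' GIS cost function is not reproduced in the abstract, so the Python sketch below only illustrates the core idea under stated assumptions: edges of a road graph carry a hypothetical fuel model that charges for distance and for climbing under the vehicle's mass, and Dijkstra's algorithm then returns the cheapest route, which need not be the shortest one:

    # Sketch only (not the authors' GIS model). Fuel coefficients are
    # illustrative assumptions.
    import heapq

    def fuel_cost(length_m, rise_m, mass_kg,
                  base_l_per_m=4e-4, climb_l_per_j=3.3e-8):
        """Fuel (litres): rolling term plus potential-energy term for climbs."""
        climb = max(rise_m, 0.0) * mass_kg * 9.81 * climb_l_per_j
        return length_m * base_l_per_m + climb

    # graph[node] = [(neighbour, length_m, rise_m), ...]
    graph = {
        "depot": [("a", 900, 5), ("b", 700, 80)],
        "a": [("plant", 800, -10)],
        "b": [("plant", 600, -85)],
        "plant": [],
    }

    def min_fuel_route(graph, src, dst, mass_kg):
        pq, seen = [(0.0, src, [src])], set()
        while pq:
            fuel, node, path = heapq.heappop(pq)
            if node == dst:
                return fuel, path
            if node in seen:
                continue
            seen.add(node)
            for nbr, length, rise in graph[node]:
                heapq.heappush(pq, (fuel + fuel_cost(length, rise, mass_kg),
                                    nbr, path + [nbr]))

    print(min_fuel_route(graph, "depot", "plant", mass_kg=12000))

In this toy graph the route via "a" is 400 m longer than the route via "b" but burns less fuel because it avoids a steep climb, mirroring the paper's finding for Praia.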
Numerical study of canister filters with alternatives filter cap configurations
NASA Astrophysics Data System (ADS)
Mohammed, A. N.; Daud, A. R.; Abdullah, K.; Seri, S. M.; Razali, M. A.; Hushim, M. F.; Khalid, A.
2017-09-01
Air filtration systems and filters play an important role in delivering good-quality air to turbomachinery such as gas turbines. The filtration system and filter improve the quality of the air and protect gas turbine parts from contaminants that could cause damage. During separation of contaminants from the air, a pressure drop cannot be avoided, but it can be minimized, which helps to reduce the intake losses of the engine [1]. This study focuses on the configuration of the filter cap in order to obtain the minimum pressure drop along the filter. The configurations used are the basic filter geometry provided by Salutary Avenue Manufacturing Sdn. Bhd. and two modified canister filter caps designed from the basic filter model. The filter geometries were generated in SOLIDWORKS, and Computational Fluid Dynamics (CFD) software was used to analyse and simulate the flow through the filter. The inlet velocities studied are 0.032 m/s, 0.063 m/s, 0.094 m/s and 0.126 m/s. The total pressure drops produced by the basic filter, modified filter 1, and modified filter 2 are 292.3 Pa, 251.11 Pa, and 274.7 Pa, respectively. The pressure drop reduction for modified filter 1 is 41.19 Pa (14.1% lower than the basic filter); for modified filter 2 it is 17.6 Pa (6.02% lower). The pressure drop computed for the basic filter differs slightly from that of the Salutary Avenue filter owing to limited data and experimental details. CFD simulation is far more practical than building prototypes and conducting experiments, reducing the overall time and cost of this study.
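A quick check in Python reproduces the reported reductions from the stated totals:

    # Percent reductions relative to the basic filter's 292.3 Pa total drop.
    basic, mod1, mod2 = 292.3, 251.11, 274.7
    for name, p in (("modified 1", mod1), ("modified 2", mod2)):
        drop = basic - p
        print(f"{name}: {drop:.2f} Pa, {100 * drop / basic:.1f}% lower")
    # modified 1: 41.19 Pa, 14.1% lower; modified 2: 17.60 Pa, 6.0% lower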
Code of Federal Regulations, 2012 CFR
2012-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2010 CFR
2010-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2013 CFR
2013-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Code of Federal Regulations, 2014 CFR
2014-07-01
... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
Payload software technology: Software technology development plan
NASA Technical Reports Server (NTRS)
1977-01-01
Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
Requirements: Towards an understanding on why software projects fail
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky and could also be life-threatening. Projects that undermine requirements engineering suffer, or are likely to suffer, from failures, challenges and other attendant risks. The estimated cost of project failures and overruns is huge. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment: they affect the company's image, goodwill, and revenue drive and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering is discussed and its role in software project success is elaborated. The place of the software requirements process in relation to software project failure is explored and examined. Project success and failure factors are also discussed, with emphasis placed on requirements factors, as they play a major role in software projects' challenges, successes and failures. The paper relies on secondary data and empirical statistics to explore and examine factors responsible for the successes, challenges and failures of software projects in large, medium and small-scale software companies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert
2005-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and the ways in which the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA are described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Evaluating strategies for reducing scattered radiation in fixed-imaging hybrid operating suites.
Miller, Claire; Kendrick, Daniel; Shevitz, Andrew; Kim, Ann; Baele, Henry; Jordan, David; Kashyap, Vikram S
2018-04-01
High-resolution fixed C-arm fluoroscopic systems allow high-quality endovascular imaging but come at the cost of greater scatter radiation generation and increased occupational exposure for surgeons. The purpose of this study was to evaluate the efficacy of two methods of reducing scattered radiation exposure. There were 164 endovascular cases analyzed in three phases. In phase 1 (P1), baseline radiation exposure was calculated. In phase 2 (P2), staff used real-time radiation dose monitoring (dosimetry badges [RaySafe; Unfors, Hopkinton, Mass]). In phase 3 (P3), a software imaging algorithm was installed that reduced radiation (EcoDose software; Philips Healthcare, Best, The Netherlands). A total of 72 cases in P1, 34 cases in P2, and 58 cases in P3 were analyzed. Total mean dose-area product decreased across each phase, with statistical significance achieved for P1 vs P3 (mean ± standard error of the mean, 186,173 ± 16,754 mGy·cm² vs 121,536 ± 11,971 mGy·cm²; P = .002) and P2 vs P3 (171,921 ± 26,276 mGy·cm² vs 121,536 ± 11,971 mGy·cm²; P = .04), whereas total mean fluoroscopy time did not significantly differ across any phase. The radiation exposure to the primary operator did not change significantly from P1 to P2 but fell significantly in P3 (0.08 ± 0.02 mSv vs 0.03 ± 0.01 mSv; P = .02). The addition of dose reduction software had the most impact on endovascular aneurysm repair, with reductions in median room dose (P = .03) and primary operator exposure (P2 vs P3; 0.19 ± 0.04 mSv vs 0.03 ± 0.02 mSv; P < .01). Dose reduction software may be an effective technique to lower radiation exposure. Implementation of system-based strategies to reduce radiation is needed to reduce lifetime occupational radiation exposure for endovascular staff and to improve patient safety.
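For scale, the phase-to-phase reductions implied by the reported dose-area-product means are easily checked (Python):

    # Mean dose-area products reported for the three phases.
    p1, p2, p3 = 186_173, 171_921, 121_536
    print(f"P1 -> P3: {100 * (p1 - p3) / p1:.0f}% reduction")  # ~35%
    print(f"P2 -> P3: {100 * (p2 - p3) / p2:.0f}% reduction")  # ~29%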
Three-dimensional planning in craniomaxillofacial surgery
Rubio-Palau, Josep; Prieto-Gundin, Alejandra; Cazalla, Asteria Albert; Serrano, Miguel Bejarano; Fructuoso, Gemma Garcia; Ferrandis, Francisco Parri; Baró, Alejandro Rivera
2016-01-01
Introduction: Three-dimensional (3D) planning in oral and maxillofacial surgery has become a standard in the planning of a variety of procedures such as dental implant placement and orthognathic surgery. By using custom-made cutting and positioning guides, the virtual surgery is exported to the operating room, increasing precision and improving results. Materials and Methods: We present our experience in the treatment of craniofacial deformities with 3D planning. Software to plan the different procedures has been selected for each case, depending on the procedure (Nobel Clinician, Kodak 3DS, Simplant O&O, Dolphin 3D, Timeus, Mimics and 3-Matic). The treatment protocol is described step by step, from virtual planning, design, and printing of the cutting and positioning guides to patients' outcomes. Conclusions: 3D planning reduces surgical time and allows possible difficulties and complications to be predicted. On the other hand, it increases preoperative planning time and requires a learning curve. The only drawback is the cost of the procedure. At present, the additional preoperative work is justified by the reduction in surgical time and more predictable results; in the future, the cost and time investment will be reduced. 3D planning is here to stay. It is already a fact in craniofacial surgery, and the investment is completely justified by the risk reduction and precise results. PMID:28299272
Towards high efficiency heliostat fields
NASA Astrophysics Data System (ADS)
Arbes, Florian; Wöhrbach, Markus; Gebreiter, Daniel; Weinrebe, Gerhard
2017-06-01
CSP power plants have great potential to contribute substantially to the world energy supply. To set this free, cost reductions are required for future projects. Heliostat field layout optimization offers a great opportunity to improve field efficiency. Field efficiency primarily depends on the positions of the heliostats around the tower, commonly known as the heliostat field layout. Heliostat shape also influences efficiency. Improvements to optical efficiency result in electricity cost reduction without adding any extra technical complexity. Owing to computational challenges, heliostat fields are often arranged in patterns. The mathematical models of the radial staggered or spiral patterns are based on two parameters and thus lead to uniform patterns. The optical efficiency of a heliostat field does not change uniformly with distance to the tower, and it differs between the northern and southern parts of the field. A fixed pattern is therefore suboptimal in many parts of the field, especially in large-scale heliostat fields. In this paper, two methods are described that allow field density to be adapted to these non-uniform field efficiencies. A new software package for large-scale heliostat field evaluation is presented; it allows fast optimization of several parameters of the pattern modification routines. It was used to design a heliostat field with 23,000 heliostats, currently planned for a site in South Africa.
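The paper's density-modification routines are not given in the abstract; the Python sketch below only illustrates the kind of two-parameter spiral pattern it refers to (the coefficients a and b are assumptions, not the authors' values):

    # Phyllotaxis-style spiral layout r_k = a * k**b with golden-angle
    # azimuth steps; a and b are illustrative.
    import math

    GOLDEN_ANGLE = math.pi * (3 - math.sqrt(5))  # ~2.3999 rad

    def spiral_field(n, a=4.0, b=0.6, tower=(0.0, 0.0)):
        """Return (x, y) heliostat positions around the tower."""
        pts = []
        for k in range(1, n + 1):
            r, theta = a * k**b, k * GOLDEN_ANGLE
            pts.append((tower[0] + r * math.cos(theta),
                        tower[1] + r * math.sin(theta)))
        return pts

    layout = spiral_field(23000)  # field size taken from the paper
    print(len(layout), layout[:2])

A fixed (a, b) produces exactly the uniform density the authors criticize; their methods, in effect, let the local density vary with the non-uniform field efficiency.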
48 CFR 970.5215-4 - Cost reduction.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Cost reduction. 970.5215-4... and Operating Contracts 970.5215-4 Cost reduction. As prescribed in 970.1504-5(c), insert the following clause: Cost Reduction (AUG 2009) (a) General. It is the Department of Energy's (DOE's) intent to...
NASA Technical Reports Server (NTRS)
McNeill, Justin
1995-01-01
The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger-than-anticipated design work on software to be ported.
Distributed augmented reality with 3-D lung dynamics--a planning tool concept.
Hamza-Lup, Felix G; Santhanam, Anand P; Imielińska, Celina; Meeks, Sanford L; Rolland, Jannick P
2007-01-01
Augmented reality (AR) systems add visual information to the world by using advanced display techniques. The advances in miniaturization and reduced hardware costs make some of these systems feasible for applications in a wide set of fields. We present a potential component of the cyber infrastructure for the operating room of the future: a distributed AR-based software-hardware system that allows real-time visualization of three-dimensional (3-D) lung dynamics superimposed directly on the patient's body. Several emergency events (e.g., closed and tension pneumothorax) and surgical procedures related to lung (e.g., lung transplantation, lung volume reduction surgery, surgical treatment of lung infections, lung cancer surgery) could benefit from the proposed prototype.
Simulation of trading strategies in the electricity market
NASA Astrophysics Data System (ADS)
Charkiewicz, Kamil; Nowak, Robert
2011-10-01
The main objective of the energy market's existence is reduction of the total cost of production, transport and distribution of energy, and thus of the prices paid by terminal consumers. The energy market comprises several submarkets with differing operational rules; two important segments, the Futures Contract Market and the Next Day Market, are analyzed in the presented approach. A computer system was developed to simulate the Polish energy market. This system uses a multi-agent approach, where each agent is a separate shared library with a defined interface. The software was used to compare strategies for players in the energy market, where the strategies use auto-regression, k-nearest neighbours, neural networks, and a mixed algorithm to predict the next price.
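As an illustration of one of the named strategies, here is a minimal k-nearest-neighbours price forecast in Python (the window length, k, and the price series are invented):

    # Predict the next price from the k most similar past windows.
    def knn_forecast(prices, window=3, k=2):
        query = prices[-window:]
        candidates = []
        for i in range(len(prices) - window):
            past = prices[i:i + window]
            d = sum((p - q) ** 2 for p, q in zip(past, query))
            candidates.append((d, prices[i + window]))  # price that followed
        candidates.sort()
        return sum(nxt for _, nxt in candidates[:k]) / k

    daily_avg = [182, 190, 175, 168, 181, 195, 177, 170, 183]
    print(knn_forecast(daily_avg))  # averages the successors of the 2 nearest windows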
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594
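The abstract does not reproduce STDCM's equations, so the fragment below is only a generic phase-based sketch of residual-defect estimation, with an assumed injected-defect count and per-phase detection efficiencies:

    # Generic defect-removal sketch, not STDCM itself: defects injected,
    # then partially removed by each test phase's detection efficiency.
    injected = 120                      # assumed defects at code complete
    efficiencies = [0.55, 0.40, 0.30]   # unit, integration, system test

    remaining = injected
    for eff in efficiencies:
        remaining *= (1 - eff)
    print(f"estimated residual defects: {remaining:.1f}")  # ~22.7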
NOSC Program Managers Handbook. Revision 1
1988-02-01
cost. The effects of application of life-cycle cost analysis through the planning and RDT&E phases of a program, and the "design to cost" concept on...is the plan for assuring the quality of the design, design documentation, and fabricated/assembled hardware and associated computer software. 13.5.3.2...listings and printouts, which document the requirements, design, or details of computer software; explain the capabilities and limitations of the
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
[The planning of resource support of secondary medical care in hospital].
Kungurov, N V; Zil'berberg, N V
2010-01-01
The Ural Institute of Dermatovenerology and Immunopathology developed and implemented software for personalized, comprehensive recording of medical services and pharmaceuticals. The Institute also presents software including a listing of medical services; a software module for calculating the financial costs of implementing the full standards of secondary medical care for chronic dermatopathy; a reference book of standards of direct specific costs of laboratory and physiotherapy services; and a reference book of pharmaceuticals, testing systems and consumables. The unified information system of management recording is an effective means of substantiating the costs of implementing standards of medical care, including high-tech care, taking into account the results of comprehensive accounting of the medical services provided.
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques: specifically, the cost of implementing and applying them as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques, and the reasons why quantitative assessments are not possible are documented.
Cyborg beast: a low-cost 3d-printed prosthetic hand for children with upper-limb differences.
Zuniga, Jorge; Katsavelis, Dimitrios; Peck, Jean; Stollberg, John; Petrykowski, Marc; Carson, Adam; Fernandez, Cristina
2015-01-20
There is an increasing number of children with traumatic and congenital hand amputations or reductions. Children's prosthetic needs are complex due to their small size, constant growth, and psychosocial development. Families' financial resources play a crucial role in the prescription of prostheses for their children, especially when private insurance and public funding are insufficient. Electric-powered (i.e., myoelectric) and body-powered (i.e., mechanical) devices have been developed to accommodate children's needs, but the cost of maintenance and replacement represents an obstacle for many families. Due to the complexity and high cost of these prosthetic hands, they are not accessible to children from low-income, uninsured families or to children from developing countries. Advancements in computer-aided design (CAD) programs, additive manufacturing, and image editing software offer the possibility of designing, printing, and fitting prosthetic hands devices at a distance and at very low cost. The purpose of this preliminary investigation was to describe a low-cost three-dimensional (3D)-printed prosthetic hand for children with upper-limb reductions and to propose a prosthesis fitting methodology that can be performed at a distance. No significant mean differences were found between the anthropometric and range of motion measurements taken directly from the upper limbs of subjects versus those extracted from photographs. The Bland and Altman plots show no major bias and narrow limits of agreements for lengths and widths and small bias and wider limits of agreements for the range of motion measurements. The main finding of the survey was that our prosthetic device may have a significant potential to positively impact quality of life and daily usage, and can be incorporated in several activities at home and in school. This investigation describes a low-cost 3D-printed prosthetic hand for children and proposes a distance fitting procedure. The Cyborg Beast prosthetic hand and the proposed distance-fitting procedures may represent a possible low-cost alternative for children in developing countries and those who have limited access to health care providers. Further studies should examine the functionality, validity, durability, benefits, and rejection rate of this type of low-cost 3D-printed prosthetic device.
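The Bland-Altman analysis mentioned above reduces to a small computation: the bias is the mean of the paired differences and the 95% limits of agreement are bias ± 1.96 SD. A Python sketch with made-up paired measurements:

    # Bland-Altman bias and limits of agreement; data are hypothetical.
    import statistics as st

    direct = [21.3, 18.9, 24.1, 19.8, 22.5]   # e.g. hand lengths, cm
    photo  = [21.0, 19.2, 24.4, 19.5, 22.8]   # same lengths from photographs

    diffs = [d - p for d, p in zip(direct, photo)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    print(f"bias {bias:.2f}, limits of agreement "
          f"[{bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}]")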
Open-source meteor detection software for low-cost single-board computers
NASA Astrophysics Data System (ADS)
Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.
2016-01-01
This work aims to overcome the current price threshold of meteor stations which can sometimes deter meteor enthusiasts from owning one. In recent years small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.
NASA Technical Reports Server (NTRS)
Basili, V. R.
1981-01-01
Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics is discussed in terms of criteria achievements, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.
Software IV and V Research Priorities and Applied Program Accomplishments Within NASA
NASA Technical Reports Server (NTRS)
Blazy, Louis J.
2000-01-01
The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high-performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance and early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.
Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh
2016-09-15
Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It includes a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software, and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm gives users control at key steps while maintaining automation in tracking, without the need to install external plugins. We validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety, zebrafish exhibited reduced distance travelled, increased thigmotaxis, and more freezing events. Furthermore, we propose a method to represent both the spatial and the temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as EthoVision XT. We have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software.
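ZebraTrack itself is implemented as an ImageJ algorithm; the Python sketch below (toy frames, assumed threshold) only illustrates the underlying threshold-and-centroid tracking from which endpoints such as distance travelled are derived:

    # Threshold each frame, track the animal as the centroid of dark
    # pixels, and sum per-frame displacements. Frames are made-up arrays.
    import numpy as np

    def centroid(frame, thresh=50):
        """Centroid (row, col) of pixels darker than `thresh`."""
        rows, cols = np.nonzero(frame < thresh)
        return rows.mean(), cols.mean()

    def distance_travelled(frames, px_to_cm=0.1):
        pts = [centroid(f) for f in frames]
        steps = [np.hypot(r2 - r1, c2 - c1)
                 for (r1, c1), (r2, c2) in zip(pts, pts[1:])]
        return sum(steps) * px_to_cm

    f1 = np.full((5, 5), 255); f1[1, 1] = 0   # fish at (1, 1)
    f2 = np.full((5, 5), 255); f2[3, 4] = 0   # fish at (3, 4)
    print(distance_travelled([f1, f2]))        # ~0.36 cm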
The impact of organizational structure on flight software cost risk
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lum, Karen; Monson, Erik
2004-01-01
This paper summarizes the final results of a follow-up study that updated the estimated software effort growth for projects still under development and evaluated organizational roles versus observed cost risk for the missions included in the original study, expanding the data set to thirteen missions.
Computer Software for Life Cycle Cost.
1987-04-01
...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually...
The Elusive Cost of Library Software
ERIC Educational Resources Information Center
Breeding, Marshall
2009-01-01
Software pricing is not a straightforward issue, since each procurement involves a special business arrangement between a library and its chosen vendor. The author thinks that it is reasonable to scale the cost of a product to such factors as the size of the library, the complexity of the installation, the number of simultaneous users, or the…
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, C.; Crook, J.
1998-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, reuse software, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.
Dimensional Error in Rapid Prototyping with Open Source Software and Low-cost 3D-printer.
Rendón-Medina, Marco A; Andrade-Delgado, Laura; Telich-Tarriba, Jose E; Fuente-Del-Campo, Antonio; Altamirano-Arcos, Carlos A
2018-01-01
Rapid prototyping models (RPMs) have been extensively used in craniofacial and maxillofacial surgery, especially in areas such as orthognathic surgery, posttraumatic or oncological reconstructions, and implantology. Economic limitations are greater in developing countries such as Mexico, where resources dedicated to health care are limited, restricting the use of RPMs to a few selected centers. This article aims to determine the dimensional error of a low-cost fused deposition modeling (FDM) 3D printer (Tronxy P802MA, Shenzhen, Tronxy Technology Co) used with open-source software. An ordinary dry human mandible was scanned with a computed tomography device. The data were processed with open-source software to build a rapid prototype with a fused deposition machine. Linear measurements were performed to find the mean absolute and relative differences. The mean absolute and relative differences were 0.65 mm and 1.96%, respectively (P = 0.96). Low-cost FDM machines and open-source software are excellent options for manufacturing RPMs, with the benefits of low cost and a relative error similar to that of other, more expensive technologies.
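The reported error metrics are straightforward to reproduce; a Python sketch with hypothetical paired measurements:

    # Mean absolute and mean relative difference between reference and
    # printed-model measurements. Values are invented examples.
    scan    = [45.2, 62.8, 30.1, 88.4]   # mm, reference mandible
    printed = [44.7, 63.9, 29.5, 89.6]   # mm, 3D-printed replica

    abs_diffs = [abs(a - b) for a, b in zip(scan, printed)]
    rel_diffs = [d / a * 100 for d, a in zip(abs_diffs, scan)]
    print(f"mean absolute: {sum(abs_diffs) / len(abs_diffs):.2f} mm")
    print(f"mean relative: {sum(rel_diffs) / len(rel_diffs):.2f} %")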
Software and the future of programming languages.
Aho, Alfred V
2004-02-27
Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.
Managing configuration software of ground software applications with glueware
NASA Technical Reports Server (NTRS)
Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.
2003-01-01
This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.
SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1994-01-01
The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment, prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a file for later use. The SOFTCOST program is written in Microsoft BASIC for interactive execution and has been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. SOFTCOST was originally developed for the Zilog Z80 system running under CP/M in 1981. It was converted to run on the IBM PC XT/AT in 1986. SOFTCOST is a copyrighted work with all copyright vested in NASA.
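SOFTCOST itself compensates for nearly fifty factors; the fragment below is a far smaller, purely illustrative parametric model (COCOMO-style constants and a size-uncertainty assumption, not SOFTCOST's algorithms) showing how such a tool can report both a mean and a variance:

    # Illustrative parametric effort model with a crude variance term.
    def effort_estimate(ksloc, factors, a=3.0, b=1.12, size_cv=0.30):
        adj = 1.0
        for f in factors:          # e.g. experience, tools, complexity
            adj *= f
        mean = a * ksloc**b * adj
        # first-order propagation: effort CV ~ b * size CV (assumed)
        var = (mean * b * size_cv) ** 2
        return mean, var

    mean, var = effort_estimate(50, factors=[1.15, 0.9, 1.1])
    print(f"effort ~ {mean:.0f} person-months, sigma ~ {var ** 0.5:.0f}")

A planner would then bias the mean upward by some multiple of sigma to reach the higher confidence levels the abstract describes.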
NASA Technical Reports Server (NTRS)
Bregman, Jesse; Harker, David; Dunham, E.; Rank, David; Temi, Pasquale
1997-01-01
Ames Research Center and UCSC have been working on the development of a mid-IR camera for the KAO in order to search for extragalactic supernovae. The development of the camera and its associated data reduction software has been successfully completed. Spectral imaging of the Orion Bar at 6.2 and 7.8 microns demonstrates the derotation and data reduction software that was developed.
ERIC Educational Resources Information Center
Stone, Antonia
1982-01-01
Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…
Automating spectral measurements
NASA Astrophysics Data System (ADS)
Goldstein, Fred T.
2008-09-01
This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
IRAF: Lessons for Project Longevity
NASA Astrophysics Data System (ADS)
Fitzpatrick, M.
2012-09-01
Although sometimes derided as a product of the 80's (or more generously, as a legacy system), the fact that IRAF remains a productive work environment for many astronomers today is a testament to one of its core design principles, portability. This idea has meaning beyond a survey of platforms in use at the peak of a project's active development; for true longevity, a project must be able to weather completely unimagined OS, hardware, data, staffing and political environments. A lack of attention to the broader issues of portability, or the true lifespan of a software system (e.g. archival science may extend for years beyond a given mission, upgraded or similar instruments may be developed that require the same reduction/analysis techniques, etc) might require costly new software development instead of simple code re-use. Additionally, one under-appreciated benefit to having a long history in the community is the trust that users have established in the science results produced by a particular system. However a software system evolves architecturally, preserving this trust (and by implication, the applications themselves) is the key to continued success. In this paper, we will discuss how the system architecture has allowed IRAF to navigate the many changes in computing since it was first released. It is hoped that the lessons learned can be adopted by software systems being built today so that they too can survive long enough to one day earn the distinction of being called a legacy system.
Using Selection Pressure as an Asset to Develop Reusable, Adaptable Software Systems
NASA Astrophysics Data System (ADS)
Berrick, S. W.; Lynnes, C.
2007-12-01
The Goddard Earth Sciences Data and Information Services Center (GES DISC) at NASA has over the years developed and honed a number of reusable architectural components for supporting large-scale data centers with a large customer base. These include a processing system (S4PM) and an archive system (S4PA) based upon a workflow engine called the Simple, Scalable, Script-based Science Processor (S4P); an online data visualization and analysis system (Giovanni); and the radically simple and fast data search tool, Mirador. These subsystems are currently reused internally in a variety of combinations to implement customized data management on behalf of instrument science teams and other science investigators. Some of these subsystems (S4P and S4PM) have also been reused by other data centers for operational science processing. Our experience has been that development and utilization of robust, interoperable, and reusable software systems can actually flourish in environments defined by heterogeneous commodity hardware systems, the emphasis on value-added customer service, and continual cost reduction pressures. The repeated internal reuse that is fostered by such an environment encourages and even forces changes to the software that make it more reusable and adaptable. Allowing and even encouraging such selective pressures to software development has been a key factor in the success of S4P and S4PM, which are now available to the open source community under the NASA Open Source Agreement.
Integrated testing and verification system for research flight software design document
NASA Technical Reports Server (NTRS)
Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.
1979-01-01
The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
Proactive Security Testing and Fuzzing
NASA Astrophysics Data System (ADS)
Takanen, Ari
Software is bound to have security-critical flaws, and no amount of testing or code auditing can ensure that software is flawless. But software security testing requirements have improved radically during the past years, largely due to criticism from security-conscious consumers and enterprise customers. Whereas in the past security flaws were taken for granted (and patches were quietly and humbly installed), they are now probably one of the most common reasons why people switch vendors or software providers. The maintenance costs of security updates often add up to become one of the biggest cost items for large enterprise users. Fortunately, test automation techniques have also improved. Techniques like model-based testing (MBT) enable efficient generation of security tests that reach good confidence levels in discovering zero-day mistakes in software. This technique is called fuzzing.
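A minimal random-mutation fuzzer conveys the idea (Python; the parse function is a toy stand-in for a real target):

    # Flip random bytes of a valid input and feed it to the code under test.
    import random

    def parse(data: bytes):
        if data[:2] == b"OK" and len(data) > 64:   # toy target
            raise ValueError("boom")               # simulated defect

    def fuzz(seed: bytes, trials=10_000):
        rng = random.Random(1)
        for _ in range(trials):
            buf = bytearray(seed * rng.randint(1, 40))
            for _ in range(rng.randint(1, 8)):     # mutate a few bytes
                buf[rng.randrange(len(buf))] = rng.randrange(256)
            try:
                parse(bytes(buf))
            except ValueError as exc:
                return bytes(buf), exc             # crashing input found
        return None, None

    crash, err = fuzz(b"OK!")
    print(err, crash[:8] if crash else None)

Model-based approaches replace the blind mutation step with inputs generated from a model of the protocol or file format, reaching deeper program states with fewer trials.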
Cost Estimation of Software Development and the Implications for the Program Manager
1992-06-01
Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome
NASA Astrophysics Data System (ADS)
Leidig, Mathias; Teeuw, Richard M.; Gibson, Andrew D.
2016-08-01
The article presents a time series (2009-2013) analysis for a new version of the "Digital Divide" concept that developed in the 1990s. Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. The Data Poverty Index (DPI) provides an open-source means of annually evaluating global access to data and information. The DPI can be used to monitor aspects of data and information availability at global and national levels, with potential application at local (district) levels. Access to data and information is a major factor in disaster risk reduction, increased resilience to disaster and improved adaptation to climate change. In that context, the DPI could be a useful tool for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction (2015-2030). The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. Unlike many other indices, the DPI is underpinned by datasets that are consistently provided annually for almost all the countries of the world and can be downloaded without restriction or cost.
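The DPI's exact indicators and weighting are not given in this abstract, so the fragment below is only a generic composite-index sketch (indicator names and values are invented) of the normalise-and-average pattern such indices commonly use:

    # Normalise each indicator to [0, 1] against fixed bounds, then
    # average into one comparable annual score per country.
    indicators = {  # country -> (internet users %, mobile subs /100, open-data score)
        "A": (85, 120, 0.9),
        "B": (30, 60, 0.3),
    }
    mins = (0, 0, 0.0)
    maxs = (100, 150, 1.0)

    def composite_score(vals):
        norm = [(v - lo) / (hi - lo) for v, lo, hi in zip(vals, mins, maxs)]
        return sum(norm) / len(norm)

    for country, vals in indicators.items():
        print(country, round(composite_score(vals), 2))  # A: 0.85, B: 0.33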
Artificial intelligence approaches to software engineering
NASA Technical Reports Server (NTRS)
Johannes, James D.; Macdonald, James R.
1988-01-01
Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues. Software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle by changing current practices and methods: these should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported, and checked by automated procedures that provide error detection, produce the documentation, and ultimately support the actual design of complex programs.
ERIC Educational Resources Information Center
Paquet, Katherine G.
2013-01-01
Cloud computing may provide cost benefits for organizations by eliminating the overhead costs of software, hardware, and maintenance (e.g., license renewals, upgrading software, servers and their physical storage space, administration along with funding a large IT department). In addition to the promised savings, the organization may require…
Reducing Lifecycle Sustainment Costs
2015-05-01
Excerpted topics: O&S needs specific to government (depots, software centers, VAMOSC/ERP interfaces); implications of ERP systems when funding is not allocated for implementation; technology refresh, which often requires non-recurring engineering investment (Working Capital Funds); VAMOSC systems and Cost and Software Data Reports (CSDRs); and contractor logistics support contracts, including subcontractor reporting.
NASA Astrophysics Data System (ADS)
Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.
1997-12-01
Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems and systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner. This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
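The abstract does not disclose CADET's internals; as a hedged sketch of the kind of cost-uncertainty estimation it describes, a Monte Carlo roll-up over triangular subsystem cost distributions (all subsystem names and dollar figures below are invented):

```python
# Illustrative Monte Carlo cost-risk sketch (not the actual CADET model):
# each subsystem cost is drawn from a triangular distribution whose spread
# could be widened for lower design heritage; system-level cost percentiles
# then serve as a risk measure.
import random

# (low, mode, high) cost in $M; hypothetical subsystems and values
subsystems = {
    "payload":  (40.0, 55.0, 90.0),
    "bus":      (25.0, 30.0, 45.0),
    "software": (10.0, 14.0, 25.0),
}

def one_trial() -> float:
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in subsystems.values())

trials = sorted(one_trial() for _ in range(10_000))
p50 = trials[len(trials) // 2]
p80 = trials[int(len(trials) * 0.8)]
print(f"median cost ~{p50:.0f} $M, 80th percentile ~{p80:.0f} $M")
```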
NASA Technical Reports Server (NTRS)
Fordyce, Jess
1996-01-01
Work carried out to re-engineer the mission analysis segment of JPL's mission planning ground system architecture is reported on. The aim is to transform the existing software tools, originally developed for specific missions on different support environments, into an integrated, general purpose, multi-mission tool set. The issues considered are: the development of a partnership between software developers and users; the definition of key mission analysis functions; the development of a consensus-based architecture; the move towards evolutionary change instead of revolutionary replacement; software reusability; and the minimization of future maintenance costs. The current status and aims of new developments are discussed and specific examples of cost savings and improved productivity are presented.
Advanced Structural Optimization Under Consideration of Cost Tracking
NASA Astrophysics Data System (ADS)
Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.
2014-06-01
In order to improve the design process of launcher configurations in the early development phase, the software Multidisciplinary Optimization (MDO) was developed. The tool combines different efficient software tools, such as Optimal Design Investigations (ODIN) for structural optimization and the Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking, as well as future improvements concerning cost optimization, is indicated.
Software for Photometric and Astrometric Reduction of Video Meteors
NASA Astrophysics Data System (ADS)
Atreya, Prakash; Christou, Apostolos
2007-12-01
SPARVM is a software package for the photometric and astrometric reduction of video meteors, being developed at Armagh Observatory. It is written in Interactive Data Language (IDL) and is designed to run primarily under the Linux platform. The basic features of the software will be the derivation of light curves and the estimation of angular velocity and radiant position for single-station data; for double-station data, the calculation of 3D coordinates of meteors, velocity, and brightness, and the estimation of the meteoroid's orbit including uncertainties. Currently, the software supports extraction of time and date from video frames, estimation of the position of cameras (azimuth, altitude), finding stellar sources in video frames, and transformation of coordinates from video frames to the horizontal coordinate system (azimuth, altitude) and the equatorial coordinate system (RA, Dec).
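For the coordinate-transformation step mentioned above, a minimal sketch using the standard spherical-astronomy formulas (generic textbook math rendered in Python; not SPARVM's IDL source):

```python
# Horizontal (Az, Alt) to equatorial (RA, Dec) conversion of the kind the
# pipeline performs; standard formulas, azimuth measured from north through
# east. Illustrative only, not SPARVM code.
import math

def altaz_to_radec(alt_deg, az_deg, lat_deg, lst_hours):
    alt, az, lat = (math.radians(x) for x in (alt_deg, az_deg, lat_deg))
    sin_dec = math.sin(alt) * math.sin(lat) + math.cos(alt) * math.cos(lat) * math.cos(az)
    dec = math.asin(sin_dec)
    cos_ha = (math.sin(alt) - math.sin(lat) * sin_dec) / (math.cos(lat) * math.cos(dec))
    ha = math.degrees(math.acos(max(-1.0, min(1.0, cos_ha)))) / 15.0  # hours
    if math.sin(az) > 0:           # object in the eastern half of the sky
        ha = 24.0 - ha
    ra = (lst_hours - ha) % 24.0
    return ra, math.degrees(dec)

# Meteor due south from ~Armagh latitude: RA equals LST, Dec = lat - (90 - alt)
print(altaz_to_radec(45.0, 180.0, 54.35, 6.0))
```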
ERIC Educational Resources Information Center
Wulfson, Stephen
1988-01-01
Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6)…
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction and producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern, object-oriented technologies. Processing is controlled by a harness, which manages a set of processing modules that make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability, and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
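A minimal sketch of the harness-and-modules pattern described, with hypothetical module names and configuration keys (not OPERA's actual API):

```python
# Harness/module pattern of the kind described: each module is a callable
# parametrized by a shared configuration; the harness runs them in order.
# Module names and config keys are invented for illustration.
from typing import Callable

Config = dict
Module = Callable[[Config, dict], dict]

def calibrate(config: Config, data: dict) -> dict:
    data["calibrated"] = True                        # stand-in for 2-D image calibration
    return data

def extract_spectrum(config: Config, data: dict) -> dict:
    data["spectrum"] = [0.0] * config["n_orders"]    # stand-in for optimal extraction
    return data

class Harness:
    def __init__(self, config: Config, modules: list[Module]):
        self.config, self.modules = config, modules

    def run(self, data: dict) -> dict:
        for module in self.modules:
            data = module(self.config, data)
        return data

result = Harness({"n_orders": 40}, [calibrate, extract_spectrum]).run({"raw": "image"})
print(sorted(result.keys()))
```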
When to Renew Software Licences at HPC Centres? A Mathematical Analysis
NASA Astrophysics Data System (ADS)
Baolai, Ge; MacIsaac, Allan B.
2010-11-01
In this paper we study a common problem faced by many high performance computing (HPC) centres: when and how to renew commercial software licences. Software vendors often sell perpetual licences along with forward update and support contracts at an additional, annual cost. Every year or so, software support personnel and the budget units of HPC centres are required to decide whether or not to renew such support, and usually such decisions are made intuitively. The total cost of a continuing support contract can, however, be substantial. One might therefore want a rational answer to the question of whether and when the option for a renewal should be exercised. In an attempt to study this problem within a market framework, we present the mathematical problem derived for the day-to-day operation of a hypothetical HPC centre that charges for the use of software packages. In the mathematical model, we assume that the uncertainty comes from the demand, the number of users using the packages, as well as the price. Further, we assume the availability of up-to-date software versions may also affect the demand. We develop a renewal strategy that aims to maximize the expected profit from the use of the software under consideration. The derived problem involves a decision tree, which constitutes a numerical procedure that can be processed in parallel.
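A toy expected-value comparison of the renew-versus-lapse decision (the fee, per-use charge, and demand scenarios below are invented; the paper's model is a stochastic decision tree, which this sketch flattens to a single stage):

```python
# One-stage expected-profit comparison; illustrative numbers only.
renewal_cost = 20_000.0                       # assumed annual support/update fee
charge_per_use = 5.0                          # assumed per-job charge to users

# (probability, expected jobs per year) with renewed vs. lapsed support;
# lapsed support is assumed to depress demand as versions go stale.
scenarios_renew = [(0.3, 6000), (0.5, 4500), (0.2, 3000)]
scenarios_lapse = [(0.3, 4000), (0.5, 3000), (0.2, 1500)]

def expected_revenue(scenarios):
    return sum(p * jobs * charge_per_use for p, jobs in scenarios)

profit_renew = expected_revenue(scenarios_renew) - renewal_cost
profit_lapse = expected_revenue(scenarios_lapse)
print("renew" if profit_renew > profit_lapse else "let lapse",
      f"(renew: {profit_renew:.0f}, lapse: {profit_lapse:.0f})")
```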
Cost Effective Development of Usable Systems: Gaps between HCI and Software Architecture Design
NASA Astrophysics Data System (ADS)
Folmer, Eelke; Bosch, Jan
A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software. Practice, however, shows that product quality (which includes usability among others) is not as high as it could be. Studies of software projects (Pressman, 2001) reveal that organizations spend a relatively large amount of money and effort on fixing usability problems during late-stage development. Some of these problems could have been detected and fixed much earlier. This avoidable rework leads to high costs, and because different tradeoffs have to be made during development, for example between cost and quality, it leads to systems with less than optimal usability. This problem has been around for a couple of decades, especially after software engineering (SE) and human-computer interaction (HCI) became disciplines of their own. While both disciplines developed, several gaps appeared which are now receiving increased attention in the research literature. Major gaps of understanding have been identified, both between suggested practice and how software is actually developed in industry, and between the best practices of each of the fields (Carrol et al., 1994; Bass et al., 2001; Folmer and Bosch, 2002). In addition, the fields are separated by gaps of differing terminology, concepts, education, and methods.
Vognild, Lars K; Burkow, Tatjana M; Luque, Luis Fernandez
2009-01-01
In this paper we present an approach to building personal health services, supporting follow-up, physical exercise, health education, and psychosocial support for the chronically ill, based on free open source software and low-cost computers, mobile devices, and consumer health and fitness devices. We argue that this will lower the cost of the systems, which is important given the increasing number of people with chronic diseases and limited healthcare budgets.
Low-cost data analysis systems for processing multispectral scanner data
NASA Technical Reports Server (NTRS)
Whitely, S. L.
1976-01-01
The basic hardware and software requirements are described for four low cost analysis systems for computer generated land use maps. The data analysis systems consist of an image display system, a small digital computer, and an output recording device. Software is described together with some of the display and recording devices, and typical costs are cited. Computer requirements are given, and two approaches are described for converting black-white film and electrostatic printer output to inexpensive color output products. Examples of output products are shown.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This document consists of 24 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory. Each software review lists source, cost, ability level, subject, topic, medium of transfer, required hardware, required software,…
Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach
ERIC Educational Resources Information Center
Stevenson, Glenn A.
2012-01-01
For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
Estimating Software Effort Hours for Major Defense Acquisition Programs
ERIC Educational Resources Information Center
Wallshein, Corinne C.
2010-01-01
Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
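As a hedged illustration of spec-to-ladder translation (the rule format and the textual ladder rendering below are invented; the KSC tool's specification language is not reproduced here):

```python
# Toy spec-to-ladder generator: each rule maps input contacts to an output
# coil; series contacts express AND, parallel rungs express OR.
# Hypothetical spec format and output, for illustration only.
rules = [
    {"output": "OPEN_VENT_VALVE", "all_of": ["TANK_PRESS_HIGH", "AUTO_MODE"]},
    {"output": "ABORT_ALARM",     "any_of": ["SENSOR_FAULT", "COMM_LOSS"]},
]

def to_ladder(rule: dict) -> str:
    if "all_of" in rule:                       # series contacts == logical AND
        contacts = "".join(f"--[ {c} ]" for c in rule["all_of"])
        return f"|{contacts}--( {rule['output']} )--|"
    branches = [f"--[ {c} ]" for c in rule["any_of"]]   # parallel rungs == OR
    return "\n".join(f"|{b}--( {rule['output']} )--|" for b in branches)

for rule in rules:
    print(to_ladder(rule))
```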
Trends in computer hardware and software.
Frankenfeld, F M
1993-04-01
Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.
Cost-Reduction Roadmap Outlines Two Pathways to Meet DOE Residential Solar Cost Target for 2030
NREL News Release: Installing photovoltaics at the time of roof replacement or as part of
NASA Technical Reports Server (NTRS)
Sims, Herb; Varnavas, Kosta; Eberly, Eric
2013-01-01
Software Defined Radio (SDR) technology has been proven in the commercial sector since the early 1990s. Today's rapid advancement in mobile telephone reliability and power management capabilities exemplifies the effectiveness of SDR technology for the modern communications market. In contrast, presently qualified satellite transponder applications were developed during the early 1960s space program. Programmable Ultra Lightweight System Adaptable Radio (PULSAR, NASA-MSFC SDR) technology revolutionizes satellite transponder technology by increasing data through-put capability by at least an order of magnitude. PULSAR leverages existing Marshall Space Flight Center SDR designs and commercially enhanced capabilities to provide a path to a radiation-tolerant SDR transponder. These innovations will (1) reduce the cost of NASA Low Earth Orbit (LEO) and Deep Space transponders, (2) decrease power requirements, and (3) commensurately reduce volume. PULSAR also increases the flexibility to implement multiple transponder types by utilizing the same hardware with altered logic, with no analog hardware change required, all of which can be accomplished in orbit. This provides high-capability, low-cost transponders to programs of all sizes. The final project outcome would be the introduction of a Technology Readiness Level (TRL) 7 low-cost CubeSat to SmallSat telemetry system into the NASA portfolio.
48 CFR 252.251-7000 - Ordering from Government supply sources.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Enterprise Software Agreements, the Contractor shall follow the terms of the applicable schedule or agreement... Enterprise Software Agreement contractor). (2) The following statement: Any price reductions negotiated as part of an Enterprise Software Agreement issued under a Federal Supply Schedule contract shall control...
Evaluation and selection of security products for authentication of computer software
NASA Astrophysics Data System (ADS)
Roenigk, Mark W.
2000-04-01
Software piracy is estimated to cost software companies over eleven billion dollars per year in lost revenue worldwide. Over fifty-three percent of all intellectual property in the form of software is pirated on a global basis. Software piracy has a dramatic effect on the employment figures for the information industry as well. In the US alone, over 130,000 jobs are lost annually as a result of software piracy.
Data systems and computer science: Software Engineering Program
NASA Technical Reports Server (NTRS)
Zygielbaum, Arthur I.
1991-01-01
An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.
Emerging information management technologies and the future of disease management.
Nobel, Jeremy J; Norman, Gordon K
2003-01-01
Disease management (DM) has become a widely accepted way to support care delivery in the chronically ill patient population. Patients enrolled in these programs have been shown to have better health, fewer complications and comorbidities, and lower health care costs. The development of advanced information management technologies is further enhancing the role DM plays in optimizing outcomes and cost-effectiveness in clinical care. These emerging information management technologies (EIMT) include advances in software, hardware, and networking, all of which share common impact attributes in their ability to improve cost-effectiveness of care, quality of care, and access to care. Specific examples include interactive websites with the ability to engage patients in the self-care management process, the embedding of biometric devices (digital scales, modem-enabled glucose meters in the home, blood pressure monitoring, etc.), workflow and care coordination programs that add intelligence via guideline-directed alerts and reminders to the delivery process, registries that include a summary of personal health data that can be used as a reference point for improved clinical decisions, and the systematic collection of aggregated, de-identified clinical, administrative, and cost data into comprehensive data sets to which predictive modeling analytic tools can be applied. By way of case example, we also present data from a controlled clinical trial utilizing EIMT in the form of home-based weight measurement using a digital scale and linkage to a care coordination center for the management of severe congestive heart failure. Outcome results on 85,515 patient-months of an aggregate commercial and Medicare continuously enrolled population demonstrated an average reduction of care utilization (hospitalization) of 57% and a reduction in related delivery cost (per member per year payments) of 55%. We conclude that EIMT have already begun to offer significant and quantifiable benefits to DM and are likely to become heavily embedded in care management strategies in the future.
What Librarians Still Don't Know about Free Software
ERIC Educational Resources Information Center
Chudnov, Daniel
2009-01-01
Free software isn't about cost, and it isn't about hype, it isn't about taking business away from vendors. It's about four kinds of freedom--the freedom to use the software for any purpose, the freedom to study how the software works, the freedom to modify the software to adapt it to one's needs, and the freedom to copy and share copies of the…
NASA Technical Reports Server (NTRS)
Christenson, D.; Gordon, M.; Kistler, R.; Kriegler, F.; Lampert, S.; Marshall, R.; Mclaughlin, R.
1977-01-01
A third-generation, fast, low-cost, multispectral recognition system (MIDAS) able to keep pace with the large quantity and high rates of data acquisition from large regions with present and projected sensors is described. The program can process a complete ERTS frame in forty seconds and provide a color map of sixteen constituent categories in a few minutes. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in the overall program are described. The system contains a midi-computer to control the various high-speed processing elements in the data path, a preprocessor to condition data, and a classifier which implements an all-digital prototype multivariate Gaussian maximum likelihood or Bayesian decision algorithm. Sufficient software was developed to perform signature extraction, control the preprocessor, compute classifier coefficients, control the classifier operation, operate the color display and printer, and diagnose operation.
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2014 CFR
2014-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
10 CFR 961.11 - Text of the contract.
Code of Federal Regulations, 2013 CFR
2013-01-01
... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...
Influence of model order reduction methods on dynamical-optical simulations
NASA Astrophysics Data System (ADS)
Störkle, Johannes; Eberhard, Peter
2017-04-01
In this work, the influence of model order reduction (MOR) methods on optical aberrations is analyzed within a dynamical-optical simulation of a high-precision optomechanical system. To this end, an integrated modeling process and new methods are introduced for the computation and investigation of the overall dynamical-optical behavior. For instance, this optical system can be a telescope optic or a lithographic objective. In order to derive a simplified mechanical model for transient time simulations with low computational cost, the method of elastic multibody systems in combination with MOR methods can be used. For this, software tools and interfaces are defined and created. Furthermore, mechanical and optical simulation models are derived and implemented. With these, on the one hand, the mechanical sensitivity can be investigated for arbitrary external excitations and, on the other hand, the related optical behavior can be predicted. In order to clarify these methods, academic examples are chosen and the influences of the MOR methods and simulation strategies are analyzed. Finally, the systems are investigated with respect to the mechanical-optical frequency responses, and in conclusion, some recommendations for the application of reduction methods are given.
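A minimal modal-truncation sketch, one of the simplest methods in the MOR family the paper compares (random matrices stand in for a real structural model):

```python
# Modal truncation: project a large second-order structural model onto its
# r lowest-frequency eigenmodes. Illustrative only; the paper evaluates
# several MOR methods, not just this one.
import numpy as np

n, r = 50, 4                                  # full and reduced model orders
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
K = A @ A.T + n * np.eye(n)                   # SPD stand-in for a stiffness matrix
M = np.eye(n)                                 # unit mass matrix for simplicity

# eigenmodes of the undamped system K v = w^2 M v (M = I here)
w2, V = np.linalg.eigh(K)
T = V[:, :r]                                  # projection onto r lowest-frequency modes

K_red = T.T @ K @ T                           # r x r reduced stiffness
M_red = T.T @ M @ T                           # r x r reduced mass
print("retained natural frequencies:", np.sqrt(np.diag(K_red)))
```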
Dexter, Franklin; Abouleish, Amr E; Epstein, Richard H; Whitten, Charles W; Lubarsky, David A
2003-10-01
Potential benefits to reducing turnover times are both quantitative (e.g., complete more cases and reduce staffing costs) and qualitative (e.g., improve professional satisfaction). Analyses have shown the quantitative arguments to be unsound except for reducing staffing costs. We describe a methodology by which each surgical suite can use its own numbers to calculate its individual potential reduction in staffing costs from reducing its turnover times. Calculations estimate optimal allocated operating room (OR) time (based on maximizing OR efficiency) before and after reducing the maximum and average turnover times. At four academic tertiary hospitals, reductions in average turnover times of 3 to 9 min would result in 0.8% to 1.8% reductions in staffing cost. Reductions in average turnover times of 10 to 19 min would result in 2.5% to 4.0% reductions in staffing costs. These reductions in staffing cost are achieved predominantly by reducing allocated OR time, not by reducing the hours that staff work late. Heads of anesthesiology groups often serve on OR committees that are fixated on turnover times. Rather than having to argue based on scientific studies, this methodology provides the ability to show the specific quantitative effects (small decreases in staffing costs and allocated OR time) of reducing turnover time using a surgical suite's own data. Many anesthesiologists work at hospitals where surgeons and/or operating room (OR) committees focus repeatedly on turnover time reduction. We developed a methodology by which the reductions in staffing cost as a result of turnover time reduction can be calculated for each facility using its own data. Staffing cost reductions are generally very small and would be achieved predominantly by reducing allocated OR time to the surgeons.
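A back-of-envelope version of the kind of arithmetic involved (hypothetical numbers; the paper's method works through optimal allocated OR time, which this simple share-of-staffed-time calculation only approximates):

```python
# Rough upper bound on staffed time recovered by shorter turnovers;
# all inputs are invented for illustration.
cases_per_day = 5                # cases per operating room per day
turnover_reduction = 3.0         # minutes saved per turnover
staffed_minutes = 10 * 60.0      # 10-hour staffed day

turnovers = cases_per_day - 1
saved = turnovers * turnover_reduction
print(f"staffed time saved: {saved:.0f} min/day "
      f"= {100 * saved / staffed_minutes:.1f}% of staffed time")
```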
Implementing Software Safety in the NASA Environment
NASA Technical Reports Server (NTRS)
Wetherholt, Martha S.; Radley, Charles F.
1994-01-01
Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost-effective means to assure safe software within a safe system.
Voice recognition software for clinical use.
Korn, K
1998-11-01
The current generation of voice recognition products truly offers the promise of voice recognition systems that are financially and operationally acceptable for use in a health care facility. Although the initial capital outlay for the purchase of such equipment may be substantial, the long-term benefit is felt to outweigh the expense. The ability to utilize computer equipment for educational purposes and information management alone helps to rationalize the cost. In addition, it is important to remember that the Internet has become a substantial source of information, which provides another functional use for this equipment. Although one can readily see the implication for such a program in clinical practice, other uses for the program should not be overlooked. Uses far beyond the writing of clinic notes and correspondence can be easily envisioned. Utilization of voice recognition software offers clinical practices the ability to produce quality printed records in a timely and cost-effective manner. After learning procedures for the selected product and appropriately formatting word processing software and printers, printed progress notes can be produced in less time than with traditional dictation and transcription methods. Although certain procedures and practices may need to be altered, or may preclude optimal utilization of this type of system, many advantages are apparent. It is recommended that facilities consider utilization of Voice Recognition products such as Dragon Systems Naturally Speaking Software, or at least consider a trial of this method with one of the limited-feature products, if current dictation practices are unsatisfactory or excessively costly. Free downloadable trial software or single user software can provide a reduced-cost method for trial evaluation of such products if a major commitment is not desired. A list of voice recognition software manufacturer web sites may be accessed through the following: http://www.dragonsys.com/ http://www.software.ibm/com/is/voicetype/ http://www.lhs.com/
NASA Technical Reports Server (NTRS)
Feather, M. S.
2001-01-01
Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Arthur, James D.; Stapko, Ruth K.; Davani, Darush
1999-01-01
The Software Assurance Technology Center (SATC) at NASA Goddard Space Flight Center has been investigating how projects can determine when sufficient testing has been completed. For most projects, schedules are underestimated, and the last phase of the software development, testing, must be decreased. Two questions are frequently asked: "To what extent is the software error-free?" and "How much time and effort is required to detect and remove the remaining errors?" Clearly, neither question can be answered with absolute certainty. Nonetheless, the ability to answer these questions with some acceptable level of confidence is highly desirable. First, knowing the extent to which a product is error-free, we can judge when it is time to terminate testing. Second, if errors are judged to be present, we can perform a cost/benefit trade-off analysis to estimate when the software will be ready for use and at what cost. This paper explains the efforts of the SATC to help projects determine what is sufficient testing and when is the most cost-effective time to stop testing.
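One standard way to answer the two questions quantitatively is a software reliability growth model; the sketch below fits a Goel-Okumoto curve by crude grid search (the defect counts are invented, and this is not necessarily the SATC's own method):

```python
# Fit m(t) = a * (1 - exp(-b t)) to cumulative defect counts, then report
# the estimated remaining defects (a minus defects found so far).
import math

weeks = list(range(1, 11))                   # test weeks (invented data)
cumulative_found = [12, 22, 30, 37, 42, 46, 49, 51, 53, 54]

def sse(a, b):
    # sum of squared errors of the model curve against the data
    return sum((a * (1 - math.exp(-b * t)) - m) ** 2
               for t, m in zip(weeks, cumulative_found))

a_grid = range(55, 90)                       # candidate total-defect counts
b_grid = [i / 100 for i in range(5, 60)]     # candidate detection rates
a, b = min(((a, b) for a in a_grid for b in b_grid), key=lambda ab: sse(*ab))
print(f"estimated total defects: {a}; estimated remaining: {a - cumulative_found[-1]}")
```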
Cost-Reduction Roadmap for Residential Solar Photovoltaics (PV), 2017-2030
This report supports the Solar Energy Technologies Office (SETO) residential 2030 photovoltaics (PV) cost target of $0.05 per kilowatt-hour by identifying
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1991-01-01
The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for the application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development environment which automates the software engineering process. Progress under SAGA is described.
Refurbishment of a Victorian terraced house for energy efficiency
NASA Astrophysics Data System (ADS)
Dimitriou, Angeliki
The impacts of global warming are now obvious. The international community has committed itself to reduce CO2 emissions, the main contributor to the greenhouse effect, at both international and national levels. In the Kyoto Protocol, signed in 1997, countries committed to reduce their greenhouse gas emissions below their 1990 levels by the period 2008-2012. The UK specifically should reduce those emissions by 12.5%. For that reason, the UK has introduced a package of policies which promote not only the use of renewable energy resources but, most importantly, the reduction in energy use through energy efficiency. Refurbishment of existing houses has contributed and will contribute to the reduction of energy consumption. A Victorian mid-terraced house was studied in this report, and different refurbishment measures were tested using two software programmes: TAS and SAP. The targets were to achieve certain levels of thermal comfort, to comply with the Building Regulations for building thermal elements, and to achieve a high SAP rating. Then the cost of each measure was calculated and its CO2 emissions were compared. Heat losses were mainly through the walls and roof. Roof and, mainly, wall refurbishment measures reduce the heating loads the most. Ground floor insulation does not contribute to the reduction of the heating loads; on the contrary, it has a detrimental effect in summer, when the cooling effect coming from the ground is reduced. Window replacement achieves a very good performance in summer, resulting in the reduction of overheating. Wall and roof insulation increase the SAP rating the most among the building elements, but boiler replacement and upgrading of heating controls increase it more. According to the SAP rating, annual CO2 emissions are reduced the most by boiler replacement and then by wall and roof insulation. The results given by the two software programmes, concerning which measure leads more to energy efficiency, are the same. Finally, if the measures which lead to the best energy performance are combined, the house could cut its energy bills by half and achieve a 70% reduction in CO2 emissions.
Front-End/Gateway Software: Availability and Usefulness.
ERIC Educational Resources Information Center
Kesselman, Martin
1985-01-01
Reviews features of front-end software packages (interface between user and online system)--database selection, search strategy development, saving and downloading, hardware and software requirements, training and documentation, online systems and database accession, and costs--and discusses gateway services (user searches through intermediary…
The Economics of Educational Software Portability.
ERIC Educational Resources Information Center
Oliveira, Joao Batista Araujo e
1990-01-01
Discusses economic issues that affect the portability of educational software. Topics discussed include economic reasons for portability, including cost effectiveness; the nature and behavior of educational computer software markets; the role of producers, buyers, and consumers; potential effects of government policies; computer piracy; and…
Ontology for Semantic Data Integration in the Domain of IT Benchmarking.
Pfaff, Matthias; Neubig, Stefan; Krcmar, Helmut
2018-01-01
A domain-specific ontology for IT benchmarking has been developed to bridge the gap between a systematic characterization of IT services and their data-based valuation. Since information is generally collected during a benchmark exercise using questionnaires on a broad range of topics, such as employee costs, software licensing costs, and quantities of hardware, it is commonly stored as natural language text; thus, this information is stored in an intrinsically unstructured form. Although these data form the basis for identifying potentials for IT cost reductions, neither a uniform description of any measured parameters nor the relationship between such parameters exists. Hence, this work proposes an ontology for the domain of IT benchmarking, available at https://w3id.org/bmontology. The design of this ontology is based on requirements mainly elicited from a domain analysis, which considers analyzing documents and interviews with representatives from Small- and Medium-Sized Enterprises and Information and Communications Technology companies over the last eight years. The development of the ontology and its main concepts is described in detail (i.e., the conceptualization of benchmarking events, questionnaires, IT services, indicators and their values) together with its alignment with the DOLCE-UltraLite foundational ontology.
NASA Astrophysics Data System (ADS)
Rabieh, Masood; Soukhakian, Mohammad Ali; Mosleh Shirazi, Ali Naghi
2016-06-01
Selecting the best suppliers is crucial for a company's success. Since competition is a determining factor nowadays, reducing cost and increasing quality of products are two key criteria for appropriate supplier selection. In this study, the inventories of the agglomeration plant of Isfahan Steel Company were first categorized through VED and ABC methods. Then models to supply two important kinds of raw materials (inventories) were developed, considering the following items: (1) the optimal consumption composite of the materials, (2) the total cost of logistics, (3) each supplier's terms and conditions, (4) the buyer's limitations, and (5) the consumption behavior of the buyers. Among the diverse models developed and tested, using the company's actual data from three previous years, two new innovative models of the mixed-integer non-linear programming type were found to be most suitable. The results of solving the two models with LINGO software (based on the company's data in this particular case) were equal. Comparing the results of the new models to the actual performance of the company revealed 10.9 and 7.1 % reductions in total procurement costs of the company in two consecutive years.
GIS technology transfer for use in private sector consulting
NASA Astrophysics Data System (ADS)
Gibas, Dawn R.; Davis, Roger J.
1996-03-01
Summit Envirosolutions, Inc. (Summit) is an EOCAP '93 company working in partnership with NASA's Commercial Remote Sensing Program to integrate the use of Geographic Information Systems (GIS) and Remote Sensing (RS) technology into our environmental consulting business. The EOCAP program has allowed us to obtain the hardware and software necessary for this technology that would have been difficult for a small company, such as Summit, to purchase outright. We are integrating GIS/RS into our consulting business in several areas including wellhead protection and environmental assessments. The major emphasis in the EOCAP project is to develop a system, termed RealFlow(SM). The goals of RealFlow(SM) are to reduce client costs associated with environmental compliance (in particular preparation of EPA-mandated Wellhead Protection Plans), more accurately characterize aquifer parameters, provide a scientifically sound basis for delineating Wellhead Protection Areas, and readily assess changes in well field operations and potential impacts of environmental stresses. RealFlow(SM) utilizes real-time telemetric data, digital imagery, GIS, Global Positioning System (GPS), and field data to characterize a study area at a lower cost. In addition, we are applying this technology in other service areas and showing a reduction in the overall costs for large projects.
Design and Implementation of a Mechanical Control System for the Scanning Microwave Limb Sounder
NASA Technical Reports Server (NTRS)
Bowden, William
2011-01-01
The Scanning Microwave Limb Sounder (SMLS) will use technological improvements in low noise mixers to provide precise data on the Earth's atmospheric composition with high spatial resolution. This project focuses on the design and implementation of a real-time control system needed for airborne engineering tests of the SMLS. The system must coordinate the actuation of optical components using four motors with encoder readback, while collecting synchronized telemetric data from a GPS receiver and 3-axis gyrometric system. A graphical user interface for testing the control system was also designed using Python. Although the system could have been implemented with an FPGA-based setup, we chose to use a low-cost processor development kit manufactured by XMOS. The XMOS architecture allows parallel execution of multiple tasks on separate threads, making it ideal for this application, and is easily programmed using XC (a subset of C). The necessary communication interfaces were implemented in software, including Ethernet, with significant cost and time reduction compared to an FPGA-based approach. For these reasons, the XMOS technology is an attractive, cost-effective alternative to FPGA-based technologies for this design and similar rapid prototyping projects.
[Stressor and stress reduction strategies for computer software engineers].
Asakura, Takashi
2002-07-01
First, in this article we discuss 10 significant occupational stressors for computer software engineers, based on the review of the scientific literature on their stress and mental health. The stressors include 1) quantitative work overload, 2) time pressure, 3) qualitative work load, 4) speed and diffusion of technological innovation, and technological divergence, 5) low discretional power, 6) underdeveloped career pattern, 7) low earnings/reward from jobs, 8) difficulties in managing a project team for software development and establishing support system, 9) difficulties in customer relations, and 10) personality characteristics. In addition, we delineate their working and organizational conditions that cause such occupational stressors in order to find strategies to reduce those stressors in their workplaces. Finally, we suggest three stressor and stress reduction strategies for software engineers.
Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests
NASA Technical Reports Server (NTRS)
Douglas, Freddie; Bourgeois, Edit Kaminsky
2005-01-01
The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests; the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
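A minimal Takagi-Sugeno fuzzy-inference sketch in the ANFIS spirit described (the membership functions and rule costs below are invented and untrained; the real HACEM learns these parameters with neural networks):

```python
# Zeroth-order Takagi-Sugeno inference: rule firing strengths weight fixed
# consequent costs. Inputs: number of tests and total duration (invented).
import math

def gauss(x, center, sigma):
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

# Rules: (membership for n_tests, membership for duration, cost in $K)
rules = [
    (lambda n: gauss(n, 5, 3),  lambda d: gauss(d, 100, 80),  350.0),   # small campaign
    (lambda n: gauss(n, 15, 5), lambda d: gauss(d, 400, 150), 900.0),   # medium
    (lambda n: gauss(n, 30, 8), lambda d: gauss(d, 900, 300), 2200.0),  # large
]

def estimate_cost(n_tests, duration_s):
    firing = [mn(n_tests) * md(duration_s) for mn, md, _ in rules]
    return sum(f * c for f, (_, _, c) in zip(firing, rules)) / sum(firing)

print(f"estimated cost: ${estimate_cost(12, 300):.0f}K")
```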
Minimizing communication cost among distributed controllers in software defined networks
NASA Astrophysics Data System (ADS)
Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed
2016-08-01
Software Defined Networking (SDN) is a new paradigm to increase the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate more network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism which minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-communication cost among network domains. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
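A sketch of the swap-gain idea on a toy topology (a Kernighan-Lin-style gain computation; the paper's exact gain formula and partitioning variant may differ):

```python
# Swap gain for moving switches between controller domains: the gain is the
# drop in inter-domain traffic if two switches exchange domains.
# Topology, weights, and assignment are invented for illustration.
import itertools

weight = {("a", "b"): 4, ("a", "c"): 1, ("b", "d"): 2, ("c", "d"): 5}
domain = {"a": 0, "b": 1, "c": 0, "d": 1}      # switch -> controller domain

def inter_domain_cost(assign):
    # total flow crossing controller-domain boundaries
    return sum(wt for (u, v), wt in weight.items() if assign[u] != assign[v])

def swap_gain(u, v):
    # cost reduction if switches u and v exchange domains
    trial = dict(domain)
    trial[u], trial[v] = trial[v], trial[u]
    return inter_domain_cost(domain) - inter_domain_cost(trial)

for u, v in itertools.combinations(domain, 2):
    if domain[u] != domain[v]:
        print(f"swap {u}<->{v}: gain {swap_gain(u, v)}")
```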
Computational Simulations and the Scientific Method
NASA Technical Reports Server (NTRS)
Kleb, Bil; Wood, Bill
2005-01-01
As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.
NASA Astrophysics Data System (ADS)
Naguib, Hussein; Bol, Igor I.; Lora, J.; Chowdhry, R.
1994-09-01
This paper presents a case study on the implementation of activity-based costing (ABC) to calculate the cost per wafer and to drive cost reduction efforts for a new IC product line. The cost reduction activities were conducted through the efforts of 11 cross-functional teams, which included members of the finance, purchasing, technology development, process engineering, equipment engineering, production control, and facility groups. The activities of these cross-functional teams were coordinated by a cost council. It will be shown that these activities have resulted in a 57% reduction in the wafer manufacturing cost of the new product line. Factors that contributed to the successful implementation of an ABC management system are discussed.
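A minimal activity-based-costing sketch of a cost-per-wafer roll-up (the activities, cost pools, and driver volumes are invented; the paper's actual cost structure is not given in the abstract):

```python
# ABC roll-up: each activity's cost pool is divided by its driver volume to
# get a rate, and each wafer is charged for the driver units it consumes.
activities = {
    # activity: (cost pool $/month, driver volume/month, driver units/wafer)
    "lithography": (400_000.0, 20_000, 24),   # driver: mask exposures
    "etch":        (150_000.0, 10_000,  8),   # driver: chamber-minutes
    "test":        (120_000.0, 60_000, 50),   # driver: test-seconds
}

def cost_per_wafer():
    total = 0.0
    for name, (pool, volume, per_wafer) in activities.items():
        rate = pool / volume                   # $ per driver unit
        total += rate * per_wafer
        print(f"{name:12s} rate {rate:7.2f} -> {rate * per_wafer:8.2f} $/wafer")
    return total

print(f"total: {cost_per_wafer():.2f} $/wafer")
```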
CAI System Costs: Present and Future.
ERIC Educational Resources Information Center
Pressman, Israel; Rosenbloom, Bruce
1984-01-01
Discusses costs related to providing computer assisted instruction (CAI), considering hardware, software, user training, maintenance, and installation. Provides an example of the total cost of CAI broken down into these categories, giving an adjusted yearly cost. Projects future trends and costs of CAI as well as cost savings possibilities. (JM)
Davis, Erika N; Chung, Kevin C; Kotsis, Sandra V; Lau, Frank H; Vijan, Sandeep
2006-04-01
Open reduction and internal fixation and cast immobilization are both acceptable treatment options for nondisplaced waist fractures of the scaphoid. The authors conducted a cost/utility analysis to weigh open reduction and internal fixation against cast immobilization in the treatment of acute nondisplaced mid-waist scaphoid fractures. The authors used a decision-analytic model to calculate the outcomes and costs of open reduction and internal fixation and cast immobilization, assuming the societal perspective. Utilities were assessed from 50 randomly selected medical students using the time trade-off method. Outcome probabilities taken from the literature were factored into the calculation of quality-adjusted life-years associated with each treatment. The authors estimated medical costs using Medicare reimbursement rates, and costs of lost productivity were estimated by average wages obtained from the U.S. Bureau of Labor Statistics. Open reduction and internal fixation offers greater quality-adjusted life-years compared with casting, with an increase ranging from 0.21 quality-adjusted life-years for the 25- to 34-year age group to 0.04 quality-adjusted life-years for the ≥65-year age group. Open reduction and internal fixation is less costly than casting ($7,940 versus $13,851 per patient) because of a longer period of lost productivity with casting. Open reduction and internal fixation is therefore the dominant strategy. When considering only direct costs, the incremental cost/utility ratio for open reduction and internal fixation ranges from $5438 per quality-adjusted life-year for the 25- to 34-year age group to $11,420 for the 55- to 64-year age group, and $29,850 for the ≥65-year age group. Compared with casting, open reduction and internal fixation is cost saving from the societal perspective ($5911 less per patient). When considering only direct costs, open reduction and internal fixation is cost-effective relative to other widely accepted interventions.
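The headline societal-perspective comparison can be reproduced directly from the figures quoted in the abstract:

```python
# Societal-perspective comparison using the abstract's own numbers
# (QALY gain shown for the 25- to 34-year age group).
cost_orif, cost_cast = 7940.0, 13851.0     # $ per patient, incl. lost productivity
qaly_gain_orif = 0.21                      # extra QALYs vs. casting, age 25-34

incremental_cost = cost_orif - cost_cast   # negative: ORIF saves money
print(f"ORIF saves ${-incremental_cost:.0f} per patient "
      f"and adds {qaly_gain_orif} QALYs -> dominant strategy")
```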
48 CFR 52.215-10 - Price Reduction for Defective Certified Cost or Pricing Data.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Price Reduction for... Text of Provisions and Clauses 52.215-10 Price Reduction for Defective Certified Cost or Pricing Data. As prescribed in 15.408(b), insert the following clause: Price Reduction for Defective Certified Cost...
1985 Winners of the Cost Reduction Incentive Awards. Tenth Anniversary.
ERIC Educational Resources Information Center
National Association of College and University Business Officers, Washington, DC.
Fifty-two cost reduction efforts on college and university campuses are described, as part of the Cost Reduction Incentive Awards Program sponsored by the National Association of College and University Business Officers and the United States Steel Foundation. The incentive program is designed to stimulate cost-effective ideas and awareness of the…
PV O&M Cost Model and Cost Reduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Andy
This is a presentation on the PV O&M cost model and cost reduction for the annual Photovoltaic Reliability Workshop (2017), covering the estimation of PV O&M costs, polynomial expansion, and the implementation of Net Present Value (NPV) and a reserve account in cost models.
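A minimal NPV-and-reserve sketch of the kind of calculation the presentation covers (the discount rate, O&M stream, and replacement cost below are invented, not NREL's model):

```python
# NPV of an O&M cost stream, plus a level reserve deposit sized so that its
# discounted value covers the discounted expected costs. Illustrative only.
discount_rate = 0.06
annual_om = [15.0] * 25                    # $/kW-year base O&M over 25 years
annual_om[12] += 120.0                     # assumed inverter replacement, year 13

def npv(cashflows, rate):
    return sum(cf / (1 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

npv_cost = npv(annual_om, discount_rate)
annuity_factor = sum(1 / (1 + discount_rate) ** (t + 1) for t in range(25))
print(f"NPV of O&M: {npv_cost:.1f} $/kW; level reserve deposit: "
      f"{npv_cost / annuity_factor:.2f} $/kW-yr")
```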
Elementary Keyboarding Software Product Reports.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This report provides detailed product descriptions of 45 software programs designed to teach or improve the keyboarding skills of elementary school students that were identified by the MicroSIFT (Microcomputer Information and Software for Teachers) staff. The descriptions include program titles, producer names, costs, grade levels, hardware,…
45 CFR 307.5 - Mandatory computerized support enforcement systems.
Code of Federal Regulations, 2013 CFR
2013-10-01
... hardware, operational system software, and electronic linkages with the separate components of an... plans to use and how they will interface with the base system; (3) Provide documentation that the... and for operating costs including hardware, operational software and applications software of a...
10 CFR 603.550 - Acceptability of intellectual property.
Code of Federal Regulations, 2011 CFR
2011-01-01
... property (e.g., copyrighted material, including software) as cost sharing because: (1) It is difficult to... the contribution. For example, a for-profit firm may offer the use of commercially available software... the software would not be a reasonable basis for valuing its use. ...
Software engineering with application-specific languages
NASA Technical Reports Server (NTRS)
Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.
1993-01-01
Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
Towards real-time risk mitigation for NPP in Switzerland: the potential role of EEW and OEF.
NASA Astrophysics Data System (ADS)
Cauzzi, Carlo; Wiemer, Stefan; Behr, Yannik; Clinton, John; Renault, Philippe; Le Guenan, Thomas; Douglas, John; Woessner, Jochen; Biro, Yesim; Caprio, Marta; Cua, Georgia
2014-05-01
Spurred by the research activities being carried out within the EC-funded project REAKT (Strategies and Tools for Real Time Earthquake Risk Reduction, FP7, contract no. 282862, 2011-2014, www.reaktproject.eu), we present herein the key elements for understanding the potential benefits of routinely using Earthquake Early Warning and Operational Earthquake Forecasting methods to mitigate the seismic risk at NPPs in Switzerland. The advantages of using the aforementioned real-time risk reduction tools are critically discussed based on the limitations of current scientific knowledge and technology, as well as on the costs associated with both system maintenance and machine- or human-triggered actions following an alert. Basic inputs to this discussion are, amongst others: a) the performance of the Swiss seismic network (http://www.seismo.ethz.ch/monitor, where SeisComP3 is used as earthquake monitoring software) and the selected EEW algorithm (the Virtual Seismologist, VS, http://www.seiscomp3.org/doc/seattle/2013.200/apps/vs.html), in terms of correct detections, false alerts, and missed events; b) the reliability of time-dependent hazard scenarios for the region of interest; c) a careful assessment of the frequency of occurrence of critical warnings based on the local and regional seismicity; d) the identification of the mitigation actions and their benefits and costs for the stakeholders.
ERIC Educational Resources Information Center
Pressman, Israel; Rosenbloom, Bruce
1984-01-01
Describes and evaluates costs of hardware, software, training, and maintenance for computer assisted instruction (CAI) as they relate to total system cost. An example of an educational system provides an illustration of CAI cost analysis. Future developments, cost effectiveness, affordability, and applications in public and private environments…
SpecTracer: A Python-Based Interactive Solution for Echelle Spectra Reduction
NASA Astrophysics Data System (ADS)
Romero Matamala, Oscar Fernando; Petit, Véronique; Caballero-Nieves, Saida Maria
2018-01-01
SpecTracer is a newly developed interactive solution to reduce cross-dispersed echelle spectra. The use of widgets saves the user the steep learning curves of currently available reduction software. SpecTracer uses well-established image processing techniques based on IRAF to successfully extract the stellar spectra. Comparisons with other reduction software, like IRAF, show comparable results, with the added advantages of ease of use, platform independence, and portability. This tool can obtain meaningful scientific data and can also serve as a training tool, especially for undergraduates doing research, in the procedure for spectroscopic analysis.
NASA Technical Reports Server (NTRS)
Freeman, William T.; Ilcewicz, L. B.; Swanson, G. D.; Gutowski, T.
1992-01-01
A conceptual and preliminary designers' cost prediction model has been initiated. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. The approach, goals, plans, and progress are presented for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
Echelle Data Reduction Cookbook
NASA Astrophysics Data System (ADS)
Clayton, Martin
This document is the first version of the Starlink Echelle Data Reduction Cookbook. It contains scripts and procedures developed by regular or heavy users of the existing software packages. These scripts are generally of two types: templates which readers may be able to modify to suit their particular needs, and utilities which carry out a particular common task and can probably be used 'off-the-shelf'. In the nature of this subject the recipes given are quite strongly tied to the software packages, rather than being science-data led. The major part of this document is divided into two sections dealing with scripts to be used with IRAF and with Starlink software (SUN/1).
ORBS: A reduction software for SITELLE and SpiOMM data
NASA Astrophysics Data System (ADS)
Martin, Thomas
2014-09-01
ORBS merges, corrects, transforms and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is a fully automatic data reduction software for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube which samples a 12 arc-minute field of view into 4 million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed to be used in a suite of software tools for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS).
Hubble Systems Optimize Hospital Schedules
NASA Technical Reports Server (NTRS)
2009-01-01
Don Rosenthal, a former Ames Research Center computer scientist who helped design the Hubble Space Telescope's scheduling software, co-founded Allocade Inc. of Menlo Park, California, in 2004. Allocade's OnCue software helps hospitals reclaim unused capacity and optimize constantly changing schedules for imaging procedures. After starting to use the software, one medical center soon reported noticeable improvements in efficiency, including a 12 percent increase in procedure volume, 35 percent reduction in staff overtime, and significant reductions in backlog and technician phone time. Allocade now offers versions for outpatient and inpatient magnetic resonance imaging (MRI), ultrasound, interventional radiology, nuclear medicine, Positron Emission Tomography (PET), radiography, radiography-fluoroscopy, and mammography.
[Cost analysis for navigation in knee endoprosthetics].
Cerha, O; Kirschner, S; Günther, K-P; Lützner, J
2009-12-01
Total knee arthroplasty (TKA) is one of the most frequent procedures in orthopaedic surgery. The outcome depends on a range of factors including alignment of the leg and the positioning of the implant in addition to patient-associated factors. Computer-assisted navigation systems can improve the restoration of a neutral leg alignment. This procedure has been established especially in Europe and North America. The additional expenses are not reimbursed in the German DRG system (Diagnosis Related Groups). In the present study a cost analysis of computer-assisted TKA compared to the conventional technique was performed. The acquisition expenses of various navigation systems (5 and 10 year depreciation), annual costs for maintenance and software updates as well as the accompanying costs per operation (consumables, additional operating time) were considered. The additional operating time was determined on the basis of a meta-analysis according to the current literature. Situations with 25, 50, 100, 200 and 500 computer-assisted TKAs per year were simulated. The amount of the incremental costs of the computer-assisted TKA depends mainly on the annual volume and the additional operating time. A relevant decrease of the incremental costs was detected between 50 and 100 procedures per year. In a model with 100 computer-assisted TKAs per year an additional operating time of 14 mins and a 10 year depreciation of the investment costs, the incremental expenses amount to
ERIC Educational Resources Information Center
Computer Symbolic, Inc., Washington, DC.
A pseudo assembly language, PAL, was developed and specified for use as the lowest level in a general, multilevel programming system for the realization of cost-effective, hardware-independent Naval software. The language was developed as part of the system called FIRMS (Fast Iterative Recursive Macro System) and is sufficiently general to allow…
Streamlining Software Aspects of Certification: Technical Team Report on the First Industry Workshop
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Holloway, C. Michael; Knight, John C.; Leveson, Nancy G.; Yang, Jeffrey C.; Dorsey, Cheryl A.; McCormick, G. Frank
1998-01-01
To address concerns about time and expense associated with software aspects of certification, the Federal Aviation Administration (FAA) began the Streamlining Software Aspects of Certification (SSAC) program. As part of this program, a Technical Team was established to determine whether the cost and time associated with certifying aircraft can be reduced while maintaining or improving safety, with the intent of impacting the FAA's Flight 2000 program. The Technical Team conducted a workshop to gain a better understanding of the major concerns in industry about software cost and schedule. Over 120 people attended the workshop, including representatives from the FAA, commercial transport and general aviation aircraft manufacturers and suppliers, and procurers and developers of non-airborne systems; more than 200 issues about software aspects of certification were recorded. This paper provides an overview of the SSAC program, motivation for the workshop, details of the workshop activities and outcomes, and recommendations for follow-on work.
Software Certification for Temporal Properties With Affordable Tool Qualification
NASA Technical Reports Server (NTRS)
Xia, Songtao; DiVito, Benedetto L.
2005-01-01
It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.
Cloud flexibility using DIRAC interware
NASA Astrophysics Data System (ADS)
Fernandez Albor, Víctor; Seco Miguelez, Marcos; Fernandez Pena, Tomas; Mendez Muñoz, Victor; Saborido Silva, Juan Jose; Graciani Diaz, Ricardo
2014-06-01
Communities in different locations run their computing jobs on dedicated infrastructures without the need to worry about software, hardware or even the site where their programs are going to be executed. Nevertheless, this usually implies that they are restricted to certain types or versions of an Operating System, because either their software needs a specific version of a system library or a specific platform is required by the collaboration to which they belong. In this scenario, if a data center wants to serve software to incompatible communities, it has to split its physical resources among those communities. This splitting will inevitably lead to an underuse of resources, because the data centers are bound to have periods where one or more of their subclusters are idle. It is in this situation that Cloud Computing provides the flexibility and reduction in computational cost that data centers are searching for. This paper describes a set of realistic tests that we ran on one such implementation. The tests comprise software from three different HEP communities (Auger, LHCb and QCD phenomenologists) and the Parsec Benchmark Suite running on one or more of three Linux flavors (SL5, Ubuntu 10.04 and Fedora 13). The implemented infrastructure has, at the cloud level, CloudStack, which manages the virtual machines (VM) and the hosts on which they run, and, at the user level, the DIRAC framework along with a VM extension that submits, monitors and keeps track of the user jobs and also requests CloudStack to start or stop the necessary VMs. In this infrastructure, the community software is distributed via CernVM-FS, which has been proven to be a reliable and scalable software distribution system. With the resulting infrastructure, users are allowed to send their jobs transparently to the data center. The main purpose of this system is the creation of a flexible, multiplatform cluster with a scalable method of software distribution for several VOs. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user.
RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk
NASA Astrophysics Data System (ADS)
van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina
2015-04-01
Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system was developed with the aim to analyse the effect of risk reduction planning alternatives on reducing risk now and in the future, and to support decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning of risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS was developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular; the various parts of the system are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is the risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness/spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is the temporal scenario component, which allows users to define future scenarios in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios itself but uses input maps for the effect of the scenarios on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps but also in other forms (risk curves, tables, graphs).
NASA Astrophysics Data System (ADS)
Al-Khawaldeh, Ihsan Naji
Inventory management is a vital tool for any organization seeking to remain competitive and reduce operating costs. In aviation its importance is even more evident, because spares are constantly on the move. Beyond the usual concerns of inventory management (quantity, quality, price, lead time, etc.), aviation inventory must also address which items to stock, where to store them, and how and when to circulate them. On the other hand, safety, a crucial factor in aviation, makes it all the more necessary that inventory management be an integral part of both aviation maintenance management and the quality assurance program. Just-in-Time (JIT) inventory systems that work well in reducing waste and increasing profit elsewhere may not work well in aviation, whether civil or military. Hence there is a need for an adaptive management system that provides cost reduction along with high readiness. The inventory management system (IMS) in aviation, and especially in military aviation, typically follows a mix of inventory management methods: a combination of fixed-order quantity (Q-model) and fixed-time-period reordering (P-model) to cope with the dynamics of aviation maintenance needs (see the sketch below). A unique feature of aviation inventory is that a shortage of trivial spares such as nuts and bolts may at some point become critical, grounding a complete fleet, especially when flight safety is at issue. Different platforms, operating locations, aging, and many other factors drive the need for an adaptive inventory system. The study uses Access software for simple programming, employing it as an inventory management system that helps define the usage rates of spares and consumables and, on the other hand, may give insight into material deficiencies that can lead to engineering design improvements and modifications. The main aim of this IMS is the reduction of both the likelihood of aircraft-on-ground (AOG) events and inventory costs, through effective usage of needed items based on maintenance requirements. The key to success lies in the perseverance to use the software and to develop its capabilities continuously, through a qualified workforce.
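A minimal sketch of the Q-model/P-model mix described above; the thresholds, names, and combination rule are illustrative assumptions, not the thesis's Access implementation.

    # Hybrid reorder check: a continuous reorder-point trigger (Q-model) plus
    # a periodic review (P-model) that tops stock up to a target level.
    # All parameters are hypothetical.
    def q_model_order(on_hand, reorder_point, order_qty):
        """Fixed-order quantity: order a fixed lot when stock hits the trigger."""
        return order_qty if on_hand <= reorder_point else 0

    def p_model_order(on_hand, target_level):
        """Fixed-time-period: at each review, order up to the target level."""
        return max(0, target_level - on_hand)

    # A safety-critical spare might use the larger of the two suggestions:
    on_hand = 7
    order = max(q_model_order(on_hand, reorder_point=10, order_qty=25),
                p_model_order(on_hand, target_level=30))
    print(order)  # 25 here: the Q-model lot exceeds the P-model top-up of 23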
Executive Guide to Software Maintenance. Reports on Computer Science and Technology.
ERIC Educational Resources Information Center
Osborne, Wilma M.
This guide is designed for federal executives and managers who have a responsibility for the planning and management of software projects and for federal staff members who are affected by, or involved in, making software changes, and who need to be aware of steps that can reduce both the difficulty and cost of software maintenance. Organized in a…
ERIC Educational Resources Information Center
Lafferty, Mark T.
2010-01-01
The number of project failures and those projects completed over cost and over schedule has been a significant issue for software project managers. Among the many reasons for failure, inaccuracy in software estimation--the basis for project bidding, budgeting, planning, and probability estimates--has been identified as a root cause of a high…
NASA/Navy lift/cruise fan cost reduction studies
NASA Technical Reports Server (NTRS)
1977-01-01
Cost reduction studies were performed for the LCF459 turbotip fan for application with the YJ97-GE-100 gas generator in a multimission V/STOL research and technology aircraft. A 20 percent cost reduction of the research configuration based on the original preliminary design was achieved. The trade studies performed and the results in the area of cost reduction and weight are covered. A fan configuration is defined for continuation of the program through the detailed design phase.
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
Software engineering technology transfer: Understanding the process
NASA Technical Reports Server (NTRS)
Zelkowitz, Marvin V.
1993-01-01
Technology transfer is of crucial concern to both government and industry today. In this report, the mechanisms developed by NASA to transfer technology are explored and the actual mechanisms used to transfer software development technologies are investigated. Time, cost, and effectiveness of software engineering technology transfer is reported.
Quantitative software models for the estimation of cost, size, and defects
NASA Technical Reports Server (NTRS)
Hihn, J.; Bright, L.; Decker, B.; Lum, K.; Mikulski, C.; Powell, J.
2002-01-01
The presentation will provide a brief overview of the SQI measurement program as well as describe each of these models and how they are currently being used in supporting JPL project, task and software managers to estimate and plan future software systems and subsystems.
Altmann, Uwe; Thielemann, Désirée; Zimmermann, Anna; Steffanowski, Andrés; Bruckmeier, Ellen; Pfaffinger, Irmgard; Fembacher, Andrea; Strauß, Bernhard
2018-01-01
Background: In view of a shortage of health care resources, monetary aspects of psychotherapy become increasingly relevant. The present study examined the pre-post reduction of impairment and direct health care costs depending on therapy termination (regularly terminated, dropout with an unproblematic reason, and dropout with a quality-relevant reason) and the association of symptom and cost reduction. Methods: In a naturalistic longitudinal study, we examined a disorder-heterogeneous sample of N = 584 outpatients who were treated with cognitive-behavioral, psychodynamic, or psychoanalytic therapy. Depression, anxiety, stress, and somatization were assessed with the Patient Health Questionnaire (PHQ). Annual amounts of inpatient costs, outpatient costs, medication costs, days of hospitalization, work disability days, utilization of psychotherapy, and utilization of pharmacotherapy 1 year before therapy and 1 year after therapy were provided by health care insurances. Symptom and cost reduction were analyzed using t-tests. Associations between symptom and cost reduction were examined using partial correlations and hierarchical linear models. Results: Patients who terminated therapy regularly showed the largest symptom reduction (d = 0.981–1.22). Patients who dropped out for an unproblematic reason and patients who terminated early for a quality-relevant reason showed significant but small symptom reductions (e.g., depression: d = 0.429 vs. d = 0.366). For patients with a regular end and those dropping out for a quality-relevant reason, we observed a significant reduction of work disability days (in % of pre-test value: 56.3 vs. 42.9%) and hospitalization days (52.8 vs. 35.0%). Annual inpatient costs decreased in the group with a regular therapy end (31.5%). Furthermore, reduction of symptoms on the one side and reduction of work disability days and psychotherapy utilization on the other were significantly correlated (r = 0.091–0.135). Conclusion: Health care costs and symptoms were reduced in each of the three groups. The average symptom and cost reduction of patients with a quality-relevant dropout suggests that not every dropout should be seen as therapy failure. PMID:29867697
Multiresolution molecular mechanics: Implementation and efficiency
NASA Astrophysics Data System (ADS)
Biyikli, Emre; To, Albert C.
2017-01-01
Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3-8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.
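Read literally, the proportionality claim in the last paragraph can be summarized as (our notation, not the authors'):

\[
S \;\approx\; \frac{N_{\text{full}}}{N_{\text{MMM}}}
\]

where S is the speed-up over full atomistic MD, and N_full and N_MMM are the numbers of atoms visited in force computation by the full model and by MMM, respectively.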
ERIC Educational Resources Information Center
Classroom Computer Learning, 1988
1988-01-01
Provides reviews of three software packages including "MusicShapes,""For Comment," and "Colortrope," which were developed for music, writing, and science, respectively. Includes information on grade levels, publishers, hardware needed, and cost. (TW)
ERIC Educational Resources Information Center
Managan, William H.
1999-01-01
Describes a facilities-management software program that helps managers better document and understand maintenance backlogs, improvements, and future cyclic renewal needs. Major software components are examined including a software tool that filters, groups, and ranks projects to help determine funding requests. (GR)
10 CFR 436.15 - Formatting cost data.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Procedures for Life Cycle Cost Analyses § 436.15 Formatting cost data. In establishing cost data under §§ 436.16 and 436.17 and measuring cost effectiveness by the modes of analysis described by § 436.19 through... software referenced in the Life Cycle Cost Manual for the Federal Energy Management Program. ...
DOT National Transportation Integrated Search
2004-05-01
This manual provides basic instruction for using RealCost, software that was developed by the Federal Highway Administration (FHWA) to support the application of life-cycle cost analysis (LCCA) in the pavement project-level decisionmaking process. Th...
Deitelzweig, Steve; Amin, Alpesh; Jing, Yonghua; Makenbaeva, Dinara; Wiederkehr, Daniel; Lin, Jay; Graham, John
2012-01-01
The randomized clinical trials RE-LY, ROCKET-AF, and ARISTOTLE demonstrate that the novel oral anticoagulants (NOACs) are effective options for stroke prevention among non-valvular atrial fibrillation (AF) patients. This study aimed to evaluate the medical cost reductions associated with the use of individual NOACs instead of warfarin from the US payer perspective. Rates for efficacy and safety clinical events for warfarin were estimated as the weighted averages from the RE-LY, ROCKET-AF and ARISTOTLE trials, and event rates for NOACs were determined by applying trial hazard ratios or relative risk ratios to those weighted averages. Incremental medical costs to a US health payer of an AF patient experiencing a clinical event during the 1 year following the event were obtained from published literature and inflation-adjusted to 2010 cost levels. Medical costs, excluding drug costs, were evaluated and compared for each NOAC vs warfarin. Sensitivity analyses were conducted to determine the influence of variations in clinical event rates and incremental costs on the medical cost reduction. In a patient year, the medical cost reduction associated with NOAC usage instead of warfarin was estimated to be -$179, -$89, and -$485 for dabigatran, rivaroxaban, and apixaban, respectively. When clinical event rates and costs were allowed to vary simultaneously, through a Monte Carlo simulation, the 95% confidence interval of the annual medical cost difference ranged between -$424 and +$71 for dabigatran, -$301 and +$135 for rivaroxaban, and -$741 and -$252 for apixaban, with a negative number indicating a cost reduction. Of the 10,000 Monte Carlo iterations, 92.6%, 79.8%, and 100.0% were associated with a medical cost reduction >$0 for dabigatran, rivaroxaban, and apixaban, respectively. Usage of the NOACs dabigatran, rivaroxaban, and apixaban may be associated with lower medical costs (excluding drug costs) relative to warfarin, with apixaban having the most substantial medical cost reduction.
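A minimal sketch of the kind of Monte Carlo exercise described above; the event rates, hazard ratio, costs, and distributions below are placeholders of our own, not the study's inputs.

    # Simulate the per-patient annual medical cost difference (NOAC - warfarin)
    # by jointly varying an event rate, a hazard ratio, and a per-event cost.
    import random

    def simulate_cost_difference(n_iter=10_000, seed=42):
        rng = random.Random(seed)
        diffs = []
        for _ in range(n_iter):
            warfarin_rate = rng.normalvariate(0.035, 0.005)  # events/patient-year
            hazard_ratio = rng.normalvariate(0.80, 0.06)     # NOAC vs warfarin
            event_cost = rng.normalvariate(40_000, 5_000)    # $ per event
            diffs.append((hazard_ratio - 1.0) * warfarin_rate * event_cost)
        diffs.sort()  # 2.5th and 97.5th percentiles bound a 95% interval
        return diffs[int(0.025 * n_iter)], diffs[int(0.975 * n_iter)]

    lo, hi = simulate_cost_difference()
    print(f"95% interval for annual cost difference: {lo:.0f} to {hi:.0f}")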
NASA Technical Reports Server (NTRS)
2015-01-01
Topics covered include: 3D Endoscope to Boost Safety, Cut Cost of Surgery; Audio App Brings a Better Night's Sleep Liquid Cooling Technology Increases Exercise Efficiency; Algae-Derived Dietary Ingredients Nourish Animals; Space Grant Research Launches Rehabilitation Chair; Vision Trainer Teaches Focusing Techniques at Home; Aircraft Geared Architecture Reduces Fuel Cost and Noise; Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs; Flight Controller Software Protects Lightweight Flexible Aircraft; Cabin Pressure Monitors Notify Pilots to Save Lives; Ionospheric Mapping Software Ensures Accuracy of Pilots' GPS; Water Mapping Technology Rebuilds Lives in Arid Regions; Shock Absorbers Save Structures and Lives during Earthquakes; Software Facilitates Sharing of Water Quality Data Worldwide; Underwater Adhesives Retrofit Pipelines with Advanced Sensors; Laser Imaging Video Camera Sees through Fire, Fog, Smoke; 3D Lasers Increase Efficiency, Safety of Moving Machines; Air Revitalization System Enables Excursions to the Stratosphere; Magnetic Fluids Deliver Better Speaker Sound Quality; Bioreactor Yields Extracts for Skin Cream; Private Astronaut Training Prepares Commercial Crews of Tomorrow; Activity Monitors Help Users Get Optimum Sun Exposure; LEDs Illuminate Bulbs for Better Sleep, Wake Cycles; Charged Particles Kill Pathogens and Round Up Dust; Balance Devices Train Golfers for a Consistent Swing; Landsat Imagery Enables Global Studies of Surface Trends; Ruggedized Spectrometers Are Built for Tough Jobs; Gas Conversion Systems Reclaim Fuel for Industry; Remote Sensing Technologies Mitigate Drought; Satellite Data Inform Forecasts of Crop Growth; Probes Measure Gases for Environmental Research; Cloud Computing Technologies Facilitate Earth Research; Software Cuts Homebuilding Costs, Increases Energy Efficiency; Portable Planetariums Teach Science; Schedule Analysis Software Saves Time for Project Planners; Sound Modeling Simplifies Vehicle Noise Management; Custom 3D Printers Revolutionize Space Supply Chain; Improved Calibration Shows Images' True Colors; Micromachined Parts Advance Medicine, Astrophysics, and More; Metalworking Techniques Unlock a Unique Alloy; Low-Cost Sensors Deliver Nanometer-Accurate Measurements; Electrical Monitoring Devices Save on Time and Cost; Dry Lubricant Smooths the Way for Space Travel, Industry; and Compact Vapor Chamber Cools Critical Components.
Scientific Software: How to Find What You Need and Get What You Pay for.
ERIC Educational Resources Information Center
Gabaldon, Diana J.
1984-01-01
Provides examples of software for the sciences, including: packages for pathology/toxicology laboratories (costing over $15,000), DNA sequencing, and data acquisition/analysis; general-purpose software for scientific uses; and "custom" packages, including a program to maintain a listing of "Escherichia coli" strains and a…
NASA Tech Briefs, November/December 1986, Special Edition
NASA Technical Reports Server (NTRS)
1986-01-01
Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.
Future of Software Engineering Standards
NASA Technical Reports Server (NTRS)
Poon, Peter T.
1997-01-01
In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.
Cartographic applications software
,
1992-01-01
The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.
Campus-Based Practices for Promoting Student Success: Software Solutions. Research Brief
ERIC Educational Resources Information Center
Horn, Aaron S.; Reinert, Leah; Reis, Michael
2015-01-01
Colleges and universities are increasingly adopting various software solutions to raise degree completion rates and lower costs (Ferguson, 2012; Vendituoli, 2014; Yanosky, 2014). Student success software, also known as Integrated Planning and Advising Services (IPAS), appears to be in high demand among both students and faculty (Dahlstrom &…
Rajan, Prashant V; Qudsi, Rameez A; Dyer, George S M; Losina, Elena
2018-02-07
There is no consensus on the optimal fixation method for patients who require a surgical procedure for distal radial fractures. We used cost-effectiveness analyses to determine which of 3 modalities offers the best value: closed reduction and percutaneous pinning, open reduction and internal fixation, or external fixation. We developed a Markov model that projected short-term and long-term health benefits and costs in patients undergoing a surgical procedure for a distal radial fracture. Simulations began at a patient age of 50 years and were run over the patient's lifetime. The analysis was conducted from health-care payer and societal perspectives. We estimated transition probabilities and quality-of-life values from the literature and determined costs from Medicare reimbursement schedules in 2016 U.S. dollars. Suboptimal postoperative outcomes were determined by rates of reduction loss (4% for closed reduction and percutaneous pinning, 1% for open reduction and internal fixation, and 11% for external fixation) and rates of orthopaedic complications. Procedural costs were $7,638 for closed reduction and percutaneous pinning, $10,170 for open reduction and internal fixation, and $9,886 for external fixation. Outputs were total costs and quality-adjusted life-years (QALYs), discounted at 3% per year. We considered willingness-to-pay thresholds of $50,000 and $100,000 per QALY. We conducted deterministic and probabilistic sensitivity analyses to evaluate the impact of data uncertainty. From the health-care payer perspective, closed reduction and percutaneous pinning dominated both open reduction and internal fixation and external fixation (i.e., produced greater QALYs at lower cost). From the societal perspective, the incremental cost-effectiveness ratio for closed reduction and percutaneous pinning compared with open reduction and internal fixation was $21,058 per QALY, and external fixation was dominated. In probabilistic sensitivity analysis, open reduction and internal fixation was cost-effective roughly 50% of the time, compared with roughly 45% for closed reduction and percutaneous pinning. When considering data uncertainty, there is only a 5% to 10% difference in the frequency of probability combinations that find open reduction and internal fixation to be more cost-effective. The current degree of uncertainty in the data makes it difficult to distinguish either strategy as more cost-effective overall, and thus the choice may be left to shared decision-making between surgeon and patient. Economic Level III. See Instructions for Authors for a complete description of levels of evidence.
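To make the Markov-model mechanics concrete, here is a toy cohort simulation with 3% annual discounting; the states, transition probability, utilities, and costs are invented for illustration, and the study's actual model is considerably richer.

    # Toy two-state Markov cohort ("well" vs "complication") accumulating
    # discounted QALYs and costs at 3% per year. All inputs are invented.
    def run_cohort(p_complication, annual_cost_well, annual_cost_comp,
                   utility_well=0.95, utility_comp=0.70,
                   years=30, discount=0.03):
        well, comp = 1.0, 0.0          # cohort fractions in each state
        qalys = costs = 0.0
        for year in range(years):
            d = 1.0 / (1.0 + discount) ** year
            qalys += d * (well * utility_well + comp * utility_comp)
            costs += d * (well * annual_cost_well + comp * annual_cost_comp)
            well, comp = well * (1 - p_complication), comp + well * p_complication
        return qalys, costs

    q_a, c_a = run_cohort(0.01, 200.0, 1200.0)   # hypothetical strategy A
    q_b, c_b = run_cohort(0.11, 180.0, 1200.0)   # hypothetical strategy B
    icer = (c_a - c_b) / (q_a - q_b)  # $ per QALY gained; negative => A dominant
    print(round(icer))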
How Virtual Technology Can Impact Total Ownership Costs on a USN Vessel
2012-03-01
[Fragments recovered from the source: a cost-comparison table of alternative solutions with columns for labor, hardware, software, transport, power and cooling, and virtualization costs in $M (after Lam, 2010); a statement that contractors will be held accountable for ensuring that the energy efficiency targets of new equipment are as advertised; a section heading "2. Total Cost of Ownership"; and an observation that idle machines were automatically placed into standby by the VMware software, reducing energy consumption by 230 watts even with 12 virtual desktops online.]
Defense Facility Condition: Revised Guidance Needed to Improve Oversight of Assessments and Ratings
2016-06-01
are to implement the standardized process in part by assessing the condition of buildings, pavement, and rail using the same set of software tools...facility to current standards; costs for labor, equipment, materials, and currency exchange rates overseas; costs for project planning and design...example, the services are to assess the condition of buildings, pavement, and rail using Sustainment Management System software tools developed by the
A Comparison of Four Software Programs for Implementing Decision Analytic Cost-Effectiveness Models.
Hollman, Chase; Paulden, Mike; Pechlivanoglou, Petros; McCabe, Christopher
2017-08-01
The volume and technical complexity of both academic and commercial research using decision analytic modelling has increased rapidly over the last two decades. The range of software programs used for their implementation has also increased, but it remains true that a small number of programs account for the vast majority of cost-effectiveness modelling work. We report a comparison of four software programs: TreeAge Pro, Microsoft Excel, R and MATLAB. Our focus is on software commonly used for building Markov models and decision trees to conduct cohort simulations, given their predominance in the published literature around cost-effectiveness modelling. Our comparison uses three qualitative criteria as proposed by Eddy et al.: "transparency and validation", "learning curve" and "capability". In addition, we introduce the quantitative criterion of processing speed. We also consider the cost of each program to academic users and commercial users. We rank the programs based on each of these criteria. We find that, whilst Microsoft Excel and TreeAge Pro are good programs for educational purposes and for producing the types of analyses typically required by health technology assessment agencies, the efficiency and transparency advantages of programming languages such as MATLAB and R become increasingly valuable when more complex analyses are required.
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
Deriving the Cost of Software Maintenance for Software Intensive Systems
2011-08-29
more of software maintenance). Figure 4. SEER-SEM Maintenance Effort by Year Report (Reifer, Allen, Fersch, Hitchings, Judy, & Rosa, 2010)...understand the linear relationship between two variables. The formula for the simple Pearson product-moment correlation is represented in Equation 5...standardization is required across the software maintenance community in order to ensure that the data being recorded can be employed beyond the agency or
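The Pearson product-moment correlation that the excerpt's Equation 5 refers to is standard; since the equation itself is elided in the excerpt, its usual form (our reconstruction) is:

\[
r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}
         {\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}}
\]

where x and y are the two variables (for example, a size measure and maintenance effort) and the bars denote sample means.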
Manager's handbook for software development, revision 1
NASA Technical Reports Server (NTRS)
1990-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences of the Software Engineering Laboratory (SEL) with flight dynamics software development. The management aspects of the following subjects are described: organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying.
Proceedings of the Center for National Software Studies Workshop on Trustworthy Software
2004-05-10
just the development cost) to achieve a sustained level of software trustworthiness. • Reforming the procurement process. We could reform the...failure or breach of security. Some examples include software used in safety systems of nuclear power plants, transportation systems, medical devices...issue in many vital systems, including those found in transportation, telecommunications, utilities, health care, and financial services. Any lack of
A second generation experiment in fault-tolerant software
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
The primary goal was to determine whether the application of fault tolerance to software increases its reliability if the cost of production is the same as for an equivalent non-fault-tolerant version derived from the same requirements specification. Software development protocols are discussed. The feasibility of adapting the technique of N-fold Modular Redundancy with majority voting to software design fault tolerance was studied.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.
Software engineering project management - A state-of-the-art report
NASA Technical Reports Server (NTRS)
Thayer, R. H.; Lehman, J. H.
1977-01-01
The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.
Cost-Effectiveness and Cost-Reduction in United States Colleges and Universities.
ERIC Educational Resources Information Center
Miller, Richard I.; Miller, Peggy M.
1991-01-01
The relationship in college administration between cost effectiveness/cost reduction and planning, management, and evaluation is explored, and approaches to cost accounting and financial ratio analysis are discussed. It is concluded that it is important to emphasize institutional mission and people rather than cost containment and productivity.…
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
Generation and reduction of the data for the Ulysses gravitational wave experiment
NASA Technical Reports Server (NTRS)
Agresti, R.; Bonifazi, P.; Iess, L.; Trager, G. B.
1987-01-01
A procedure for the generation and reduction of the radiometric data known as REGRES is described. The software is implemented on a HP-1000F computer and was tested on REGRES data relative to the Voyager I spacecraft. The REGRES data are a current output of NASA's Orbit Determination Program. The software package was developed in view of the data analysis of the gravitational wave experiment planned for the European spacecraft Ulysses.
Software manual for operating particle displacement tracking data acquisition and reduction system
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
1991-01-01
The software manual is presented. The necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique are described. The new PDT system is an all electronic technique employing a CCD video camera and a large memory buffer frame-grabber board to record low velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single exposure images are time coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.
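A bare-bones sketch of the displacement-tracking arithmetic the manual describes; the centroid data, timing, and scale below are invented, and the actual PDT time-coding scheme is more involved.

    # Given centroids of the same particle in consecutive time-coded exposures,
    # a 2-D velocity vector is displacement over the inter-exposure interval.
    # All values are invented placeholders.
    def velocity(p0, p1, dt, scale):
        """p0, p1: (x, y) centroids in pixels; dt in s; scale in cm/pixel."""
        return ((p1[0] - p0[0]) * scale / dt,
                (p1[1] - p0[1]) * scale / dt)

    vx, vy = velocity((120.0, 85.0), (124.5, 83.0), dt=0.033, scale=0.01)
    print(f"v = ({vx:.2f}, {vy:.2f}) cm/s")  # well under the 20 cm/s regime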
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.
Evaluation of levee setbacks for flood-loss reduction, Middle Mississippi River, USA
NASA Astrophysics Data System (ADS)
Dierauer, Jennifer; Pinter, Nicholas; Remo, Jonathan W. F.
2012-07-01
One-dimensional hydraulic modeling and flood-loss modeling were used to test the effectiveness of levee setbacks for flood-loss reduction along the Middle Mississippi River (MMR). Four levee scenarios were assessed: (1) the present-day levee configuration, (2) a 1000 m levee setback, (3) a 1500 m levee setback, and (4) an optimized setback configuration. Flood losses were estimated using FEMA's Hazus-MH (Hazards US Multi-Hazard) loss-estimation software on a structure-by-structure basis for a range of floods from the 2- to the 500-year events. These flood-loss estimates were combined with a levee-reliability model to calculate probability-weighted damage estimates. In the simplest case, the levee setback scenarios tested here reduced flood losses compared to current conditions for large, infrequent flooding events but increased flood losses for smaller, more frequent flood events. These increases occurred because levee protection was removed for some of the existing structures. When combined with buyouts of unprotected structures, levee setbacks reduced flood losses for all recurrence intervals. The "optimized" levee setback scenario, involving a levee configuration manually planned to protect existing high-value infrastructure, reduced damages with or without buyouts. This research shows that levee setbacks in combination with buyouts are an economically viable approach for flood-risk reduction along the study reach and likely elsewhere where levees are widely employed for flood control. Designing a levee setback around existing high-value infrastructure can maximize the benefit of the setback while simultaneously minimizing the costs. The optimized levee setback scenario analyzed here produced payback periods (costs divided by benefits) of less than 12 years. With many aging levees failing current inspections across the US, and flood losses spiraling up over time, levee setbacks are a viable solution for reducing flood exposure and flood levels.
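Probability-weighted damage estimates of the kind described are typically an expected-annual-damage integration of loss against annual exceedance probability; a schematic version follows, with return periods and losses invented for illustration.

    # Expected annual damage (EAD) by trapezoidal integration of loss versus
    # annual exceedance probability. Return periods and losses are invented.
    def expected_annual_damage(return_periods, losses):
        """return_periods in years (ascending); losses in $ for each event."""
        probs = [1.0 / rp for rp in return_periods]   # exceedance probabilities
        ead = 0.0
        for i in range(len(probs) - 1):
            ead += 0.5 * (losses[i] + losses[i + 1]) * (probs[i] - probs[i + 1])
        return ead

    rps = [2, 10, 50, 100, 500]
    losses = [0.0, 5e6, 40e6, 90e6, 300e6]
    print(f"EAD = ${expected_annual_damage(rps, losses):,.0f} per year")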
NASA Technical Reports Server (NTRS)
Shell, Elaine M.; Lue, Yvonne; Chu, Martha I.
1999-01-01
Flight software (FSW) is a mission critical element of spacecraft functionality and performance. When ground operations personnel interface to a spacecraft, they are dealing almost entirely with onboard software. This software, even more than ground/flight communications systems, is expected to perform perfectly at all times during all phases of on-orbit mission life. Due to the fact that FSW can be reconfigured and reprogrammed to accommodate new spacecraft conditions, the on-orbit FSW maintenance team is usually significantly responsible for the long-term success of a science mission. Failure of FSW can result in very expensive operations work-around costs and lost science opportunities. There are three basic approaches to staffing on-orbit software maintenance, namely: (1) using the original developers, (2) using mission operations personnel, or (3) assembling a Center of Excellence for multi-spacecraft on-orbit FSW support. This paper explains a National Aeronautics and Space Administration, Goddard Space Flight Center (NASA/GSFC) experience related to the roles of on-orbit FSW maintenance personnel. It identifies the advantages and disadvantages of each of the three approaches to staffing the FSW roles, and demonstrates how a cost efficient on-orbit FSW Maintenance Center of Excellence can be established and maintained with significant return on the investment.
NASA Astrophysics Data System (ADS)
Kumlander, Deniss
The globalization of companies' operations and competition between software vendors demand improved quality of delivered software at decreased overall cost. At the same time, these trends introduce many problems into the software development process, since they produce distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity, in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.
NASA Technical Reports Server (NTRS)
Simmons, D. B.; Marchbanks, M. P., Jr.; Quick, M. J.
1982-01-01
The results of an effort to thoroughly and objectively analyze the statistical and historical information gathered during the development of the Shuttle Orbiter Primary Flight Software are given. The particular areas of interest include the cost of the software, its reliability, its requirements, and how those requirements changed during development of the system. Data related to the current version of the software system produced some interesting results. Suggestions are made for saving additional data that would allow further investigation.
Microcomputer software to facilitate costing in pathology laboratories.
Stilwell, J A; Woodford, F P
1987-01-01
A software program is described which enables laboratory managers to calculate, for their laboratory over a 12 month period, the cost of each test or investigation and the components of that cost. These comprise the costs of direct labour, consumables, equipment maintenance and depreciation; allocated costs of intermediate operations--for example, specimen procurement, reception, and data processing; and apportioned indirect costs such as senior staff time, as well as external overheads such as telephone charges, rent, and rates. Total annual expenditure on each type of test is also calculated. The principles on which the program is based are discussed. Considered in particular are the problems of apportioning indirect costs (which are considerable in clinical laboratory work) over different test costs, and the merits of different ways of estimating the amount or fraction of staff members' time spent on each kind of test. The computer program is Crown copyright but is available under licence from one of us (JAS). PMID:3654982
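As a rough illustration of the cost structure described above, the sketch below sums direct cost components per test and apportions a pooled indirect cost in proportion to each test's share of direct labour. The test names and all figures are hypothetical, and labour share is only one of several apportionment bases the paper weighs.

tests = {
    # test: (annual volume, direct labour, consumables, equipment upkeep + depreciation)
    "glucose":   (120000, 0.30, 0.15, 0.05),
    "urea":      ( 90000, 0.35, 0.20, 0.05),
    "blood_gas": ( 15000, 1.50, 0.80, 0.40),
}
indirect_costs = 250000.0  # senior staff time, telephone, rent, rates, etc. (hypothetical)

# pool of direct labour cost, used here as the apportionment base
total_labour = sum(v * lab for v, lab, _, _ in tests.values())

for name, (volume, labour, consumables, equipment) in tests.items():
    direct = labour + consumables + equipment
    # each test absorbs indirect cost in proportion to its labour share
    apportioned = indirect_costs * (volume * labour / total_labour) / volume
    unit_cost = direct + apportioned
    print(f"{name:10s} unit cost {unit_cost:5.2f}  annual {unit_cost * volume:12,.0f}")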
Enhancing crystalline silicon solar cell efficiency with SixGe1-x layers
NASA Astrophysics Data System (ADS)
Ali, Adnan; Cheow, S. L.; Azhari, A. W.; Sopian, K.; Zaidi, Saleem H.
Crystalline silicon (c-Si) solar cells represent a cost-effective, environmentally friendly, and proven renewable energy resource. Industrial manufacturing of c-Si solar cells has now matured in terms of efficiency and cost. Continued cost-effective efficiency enhancement requires a transition towards thinner wafers in the near term and thin films in the long term. Successful implementation of either alternative must address the intrinsic optical absorption limitation of Si. Bandgap engineering through integration with SixGe1-x layers offers an attractive, inexpensive option. With the help of PC1D software, the role of SixGe1-x layers in conventional c-Si solar cells has been intensively investigated in both wafer and thin-film configurations by varying Ge concentration, thickness, and placement. In the wafer configuration, an increase in Ge concentration leads to enhanced absorption through bandgap narrowing, with an efficiency enhancement of 8% for Ge concentrations of less than 20%. At higher Ge concentrations, despite enhanced optical absorption, efficiency is reduced due to substantial lowering of the open-circuit voltage. In thin-film solar cell configurations 5-25 μm thick, efficiency gains in excess of 30% are achievable. Therefore, SixGe1-x-based thin-film solar cells with an order of magnitude reduction in costly Si material are ideally suited in terms of both high efficiency and cost. Recent research has demonstrated significant improvement in epitaxially grown SixGe1-x layers on nanostructured Si substrates, enhancing the potential of this approach for the next generation of c-Si based photovoltaics.
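For intuition about why adding Ge enhances absorption, an empirical fit for the bandgap of relaxed Si1-xGex alloys can be evaluated as below. The coefficients are an assumption for illustration only (they are not taken from the paper or from PC1D) and apply only roughly for x below about 0.85.

def sige_bandgap_ev(x):
    # assumed empirical fit for relaxed Si(1-x)Ge(x); illustrative coefficients only
    if not 0.0 <= x < 0.85:
        raise ValueError("fit assumed valid only for 0 <= x < 0.85")
    return 1.12 - 0.41 * x + 0.008 * x ** 2

for x in (0.0, 0.1, 0.2, 0.3):
    print(f"Ge fraction {x:.1f}: Eg ~ {sige_bandgap_ev(x):.3f} eV")

The narrowing gap with increasing Ge fraction extends absorption to longer wavelengths, consistent with the trade-off against open-circuit voltage noted above.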
Open-Source 3D-Printable Optics Equipment
Zhang, Chenlong; Anzalone, Nicholas C.; Faria, Rodrigo P.; Pearce, Joshua M.
2013-01-01
Just as the power of the open-source design paradigm has driven down the cost of software to the point that it is accessible to most people, the rise of open-source hardware is poised to drive down the cost of doing experimental science to expand access to everyone. To assist in this aim, this paper introduces a library of open-source 3-D-printable optics components. This library operates as a flexible, low-cost public-domain tool set for developing both research and teaching optics hardware. First, the use of parametric open-source designs using an open-source computer aided design package is described to customize the optics hardware for any application. Second, details are provided on the use of open-source 3-D printers (additive layer manufacturing) to fabricate the primary mechanical components, which are then combined to construct complex optics-related devices. Third, the use of an open-source electronics prototyping platform is illustrated as control for optical experimental apparatuses. This study demonstrates an open-source optical library, which significantly reduces the costs associated with much optical equipment, while also enabling relatively easily adapted customizable designs. The cost reductions in general are over 97%, with some components representing only 1% of the current commercial investment for optical products of similar function. The results of this study make it clear that this method of scientific hardware development enables a much broader audience to participate in optical experimentation, both as research and teaching platforms, than previous proprietary methods. PMID:23544104
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ardani, K.; Seif, D.; Margolis, R.
2013-08-01
The objective of this analysis is to roadmap the cost reductions and innovations necessary to achieve the U.S. Department of Energy (DOE) SunShot Initiative's total soft-cost targets by 2020. The roadmap focuses on advances in four soft-cost areas: (1) customer acquisition; (2) permitting, inspection, and interconnection (PII); (3) installation labor; and (4) financing. Financing cost reductions are in terms of the weighted average cost of capital (WACC) for financing PV system installations, with real-percent targets of 3.0% (residential) and 3.4% (commercial).
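Since the financing targets above are stated as WACC values, the following minimal sketch shows the standard after-tax WACC formula; the capital-structure and rate inputs are hypothetical examples, not SunShot figures.

def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    # standard after-tax weighted average cost of capital
    total = equity + debt
    return (equity / total) * cost_of_equity + (debt / total) * cost_of_debt * (1 - tax_rate)

# example: 40% equity at 10%, 60% debt at 5%, 30% tax rate (all hypothetical)
print(f"WACC = {wacc(0.4, 0.6, 0.10, 0.05, 0.30):.3%}")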
The DEEP-South: Scheduling and Data Reduction Software System
NASA Astrophysics Data System (ADS)
Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team
2015-08-01
The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective of the DEEP-South is physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction, and analysis of huge amounts of data with minimal human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using a Database Management System (DBMS). The LDR is designed to detect moving objects in CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on analyses made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.
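The three-subsystem flow described above can be pictured with the following hypothetical Python skeleton; the class and method names are invented for illustration and do not reflect the actual DEEP-South SDS interfaces.

class DSS:
    """Hypothetical scheduler: prioritizes targets and plans a night's observations."""
    def __init__(self, targets):
        self.targets = targets  # list of dicts with at least a 'priority' key
    def schedule_night(self):
        return sorted(self.targets, key=lambda t: t["priority"], reverse=True)

class LDR:
    """Hypothetical local reduction: flag frames containing moving objects."""
    def detect_moving_objects(self, images):
        return [img for img in images if img.get("moving_source")]

class MDR:
    """Hypothetical main reduction: turn photometry points into a lightcurve."""
    def build_lightcurve(self, detections):
        return sorted((d["time"], d["mag"]) for d in detections)

dss = DSS([{"name": "2015 AB", "priority": 2}, {"name": "1998 XY", "priority": 5}])
plan = dss.schedule_night()  # observe, then feed new frames through LDR and MDR,
print([t["name"] for t in plan])  # and reschedule follow-up at other stations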
Nandy, Krishanu; Collinson, David W.; Scheftic, Charlie M.; Brinson, L. Catherine
2018-01-01
The cost of specialized scientific equipment can be high and, with limited funding resources, researchers and students are often unable to access or purchase the ideal equipment for their projects. In the fields of materials science and mechanical engineering, fundamental equipment such as tensile testing devices can cost tens to hundreds of thousands of dollars. While a research lab often has access to a large-scale testing machine suitable for conventional samples, loading devices for meso- and micro-scale samples for in-situ testing with the myriad of microscopy tools are often hard to source and cost prohibitive. Open-source software has allowed for great strides in the reduction of costs associated with software development, and open-source hardware and additive manufacturing have the potential to similarly reduce the costs of scientific equipment and increase the accessibility of scientific research. To investigate the feasibility of open-source hardware, a micro-tensile tester was designed with a freely accessible computer-aided design package and manufactured with a desktop 3D-printer and off-the-shelf components. To our knowledge this is one of the first demonstrations of a tensile tester with additively manufactured components for scientific research. The capabilities of the tensile tester were demonstrated by investigating the mechanical properties of Graphene Oxide (GO) paper and thin films. A 3D printed tensile tester was successfully used in conjunction with an atomic force microscope to provide one of the first quantitative measurements of GO thin film buckling under compression. The tensile tester was also used in conjunction with an atomic force microscope to observe the change in surface topology of a GO paper in response to increasing tensile strain. No significant change in surface topology was observed, in contrast to prior hypotheses from the literature. Based on this result obtained with the new open-source tensile stage, we propose an alternative hypothesis we term 'superlamellae consolidation' to explain the initial deformation of GO paper. The additively manufactured tensile tester represents cost savings of >99% compared to commercial solutions in its class and offers simple customization. However, continued development is needed for the tensile tester presented here to approach the technical specifications achievable with commercial solutions. PMID:29813103
Gachango, F G; Pedersen, S M; Kjaergaard, C
2015-12-01
Constructed wetlands have been proposed as cost-effective and more targeted technologies for reducing nitrogen and phosphorus water pollution in drainage losses from agricultural fields in Denmark. Using two pig farms and one dairy farm situated in a pumped lowland catchment as case studies, this paper explores the feasibility of implementing surface flow constructed wetlands (SFCW) based on their cost effectiveness. A sensitivity analysis is conducted by varying the cost elements of the wetlands in order to establish the most cost-effective scenario, and a comparison with existing nutrient-reduction measures is carried out. The analyses show that the cost effectiveness of the SFCW is higher in drainage catchments with higher nutrient loads. The range of the cost-effectiveness ratio for nitrogen reduction differs distinctly from that of the catch-crop measure. The study concludes that SFCW could be the optimal nutrient-reduction measure in drainage catchments characterized by higher nutrient loads.
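A minimal sketch of the cost-effectiveness ratio underlying such comparisons: annualized cost (capital recovery plus operation and maintenance) divided by nitrogen removed. All figures are hypothetical; they merely illustrate why higher-load catchments come out more cost-effective.

def annualized_cost(capital, lifetime_yr, rate, annual_om):
    # capital recovery factor converts up-front capital to an equivalent annual cost
    crf = rate * (1 + rate) ** lifetime_yr / ((1 + rate) ** lifetime_yr - 1)
    return capital * crf + annual_om

def cer(capital, lifetime_yr, rate, annual_om, kg_n_removed_per_yr):
    # cost-effectiveness ratio: cost per kg nitrogen removed per year
    return annualized_cost(capital, lifetime_yr, rate, annual_om) / kg_n_removed_per_yr

print(f"low-load catchment:  {cer(50000, 20, 0.04, 1000, 500):.2f} per kg N")
print(f"high-load catchment: {cer(50000, 20, 0.04, 1000, 2000):.2f} per kg N")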
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bill Stanley; Patrick Gonzalez; Sandra Brown
2005-10-01
The Nature Conservancy is participating in a Cooperative Agreement with the Department of Energy (DOE) National Energy Technology Laboratory (NETL) to explore the compatibility of carbon sequestration in terrestrial ecosystems and the conservation of biodiversity. The title of the research project is ''Application and Development of Appropriate Tools and Technologies for Cost-Effective Carbon Sequestration''. The objectives of the project are to: (1) improve carbon offset estimates produced in both the planning and implementation phases of projects; (2) build valid and standardized approaches to estimate project carbon benefits at a reasonable cost; and (3) lay the groundwork for implementing cost-effective projects, providing new testing ground for biodiversity protection and restoration projects that store additional atmospheric carbon. This Technical Progress Report discusses preliminary results of the six specific tasks that The Nature Conservancy is undertaking to answer research needs while facilitating the development of real projects with measurable greenhouse gas reductions. The research described in this report occurred between April 1st, 2005 and June 30th, 2005. The specific tasks discussed include: Task 1: carbon inventory advancements; Task 2: emerging technologies for remote sensing of terrestrial carbon; Task 3: baseline method development; Task 4: third-party technical advisory panel meetings; Task 5: new project feasibility studies; and Task 6: development of new project software screening tool.
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
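The paper's conclusion, that dedicated resources win for base-line load while cloud wins for bursts, reduces to a utilization break-even. The sketch below illustrates the arithmetic with hypothetical prices, not CMS's actual figures.

def dedicated_cost_per_core_hour(capex, lifetime_yr, annual_opex, cores, utilization):
    # amortize purchase plus operations over the core-hours actually used
    hours_used = lifetime_yr * 8760 * utilization
    return (capex + annual_opex * lifetime_yr) / (cores * hours_used)

on_demand_price = 0.10  # $/core-hour, hypothetical cloud price

for util in (0.3, 0.6, 0.9):
    dedicated = dedicated_cost_per_core_hour(500000, 4, 50000, 1000, util)
    cheaper = "dedicated" if dedicated < on_demand_price else "cloud"
    print(f"utilization {util:.0%}: dedicated ${dedicated:.3f}/core-hr -> {cheaper} wins")

At high sustained utilization the amortized dedicated cost drops below the on-demand price, which is the base-line versus bursting logic stated above.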
The efficiency and budgeting of public hospitals: case study of iran.
Yusefzadeh, Hasan; Ghaderi, Hossein; Bagherzade, Rafat; Barouni, Mohsen
2013-05-01
Hospitals are the most costly and important components of any health care system, so it is important to know their economic values, pay attention to their efficiency, and consider the factors affecting them. The aim of this study was to assess the technical, scale, and economic efficiency of hospitals in the West Azerbaijan province of Iran, for which Data Envelopment Analysis (DEA) was used to propose a model for operational budgeting. This was a descriptive-analytical study conducted in 2009 with three inputs and two outputs. DEAP 2.1 software was used for data analysis. Slack and radial movements and surpluses of inputs were calculated for the selected hospitals. Finally, a model was proposed for performance-based budgeting of hospitals and health sectors using the DEA technique. The average scores of technical efficiency, pure technical efficiency (managerial efficiency) and scale efficiency of the hospitals were 0.584, 0.782 and 0.771, respectively. In other words, the capacity for efficiency improvement in the hospitals, without any increase in costs and with the same amount of inputs, was about 41.5%. Only four hospitals had the maximum level of technical efficiency. Moreover, surplus production factors were evident in these hospitals. Reduction of surplus production factors through comprehensive planning based on the results of the Data Envelopment Analysis can play a major role in cost reduction for hospitals and health sectors. In hospitals with a technical efficiency score of less than one, the original and projected values of inputs differed, resulting in a surplus. Hence, these hospitals should reduce their input values to achieve maximum efficiency and optimal performance. The results of this method can be applied by hospitals as a benchmark for making decisions about resource allocation, linking budgets to performance results, and controlling and improving hospital performance.
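For readers unfamiliar with how such technical-efficiency scores are produced, the sketch below solves the input-oriented, constant-returns-to-scale (CCR) DEA linear program with SciPy. The hospital inputs and outputs are hypothetical, and the study itself used DEAP rather than this code.

import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    # X: (m inputs x n hospitals), Y: (s outputs x n hospitals), o: hospital index
    # variables z = [theta, lambda_1 .. lambda_n]; minimize theta
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[:, [o]], X]           # sum_j lam_j * x_ij <= theta * x_io
    A_out = np.c_[np.zeros((s, 1)), -Y]   # sum_j lam_j * y_rj >= y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# hypothetical data: 2 inputs (beds, staff) and 2 outputs (admissions, visits)
X = np.array([[100.0, 150.0, 120.0, 90.0],
              [300.0, 500.0, 420.0, 260.0]])
Y = np.array([[5000.0, 6000.0, 5200.0, 4800.0],
              [8000.0, 7000.0, 9000.0, 8500.0]])
for o in range(X.shape[1]):
    print(f"hospital {o}: technical efficiency = {ccr_efficiency(X, Y, o):.3f}")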
ERIC Educational Resources Information Center
Milshtein, Amy
1998-01-01
Discusses how to avoid costly mistakes when buying facility-management software. Provides answers to questions buyers should ask before committing funds to a particular program. Selected facility-management software companies and product profiles are highlighted. (GR)
A white paper: Operational efficiency. New approaches to future propulsion systems
NASA Technical Reports Server (NTRS)
Rhodes, Russel; Wong, George
1991-01-01
Advanced launch systems for the next generation of space transportation systems (1995 to 2010) must deliver large payloads (125,000 to 500,000 lbs) to low earth orbit (LEO) at one tenth of today's cost, or $300 to $400 per pound of payload. This represents an order of magnitude reduction from the cost of delivering payload to orbit with the Titan unmanned vehicle. To achieve this sizable reduction, both the operations cost and the engine cost must be lower than those of current engine systems. The Advanced Launch System (ALS) is studying advanced engine designs, such as the Space Transportation Main Engine (STME), which has achieved notable cost reductions. Results are presented of a current study in which another level of cost reduction is achieved by designing the propulsion module that uses these advanced engines for enhanced operations efficiency and reduced operations cost.
Preliminary design of the redundant software experiment
NASA Technical Reports Server (NTRS)
Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John
1985-01-01
The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed, randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized and the cost of the fault tolerant configurations can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole, because they will provide useful insight into the nature and cause of hard-to-find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-14
... Collection; Comments Requested: COPS Comparative Assessment of Cost Reduction by Agencies Survey ACTION: 30...; comments requested. (2) Title of the Form/Collection: COPS Comparative Assessment of Cost Reduction by... will be asked complete the COPS Comparative Assessment of Cost Reduction Survey. The survey will be...
Energy Cost Reduction for Automotive Service Facilities.
ERIC Educational Resources Information Center
Federal Energy Administration, Washington, DC.
This handbook on energy cost reduction for automotive service facilities consists of four sections. The importance and economic benefits of energy conservation are discussed in the first section. In the second section six energy cost reduction measures are discussed: relamping interior areas; relamping and reducing interior lighting; setting back…
Best Practices for Reduction of Uncertainty in CFD Results
NASA Technical Reports Server (NTRS)
Mendenhall, Michael R.; Childs, Robert E.; Morrison, Joseph H.
2003-01-01
This paper describes a proposed best-practices system that will present expert knowledge in the use of CFD. The best-practices system will include specific guidelines to assist the user in problem definition, input preparation, grid generation, code selection, parameter specification, and results interpretation. The goal of the system is to assist all CFD users in obtaining high quality CFD solutions with reduced uncertainty and at lower cost for a wide range of flow problems. The best-practices system will be implemented as a software product which includes an expert system made up of knowledge databases of expert information with specific guidelines for individual codes and algorithms. The process of acquiring expert knowledge is discussed, and help from the CFD community is solicited. Benefits and challenges associated with this project are examined.
Vectorized program architectures for supercomputer-aided circuit design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzoli, V.; Ferlito, M.; Neri, A.
1986-01-01
Vector processors (supercomputers) can be effectively employed in MIC or MMIC applications to solve problems of large numerical size such as broad-band nonlinear design or statistical design (yield optimization). In order to fully exploit the capabilities of vector hardware, any program architecture must be structured accordingly. This paper presents a possible approach to the ''semantic'' vectorization of microwave circuit design software. Speed-up factors on the order of 50 can be obtained on a typical vector processor (Cray X-MP) with respect to the most powerful scalar computers (CDC 7600), with cost reductions of more than one order of magnitude. This could broaden the horizon of microwave CAD techniques to include problems that are practically out of the reach of conventional systems.
Wilcox, Meredith L; Mason, Helen; Fouad, Fouad M; Rastam, Samer; al Ali, Radwan; Page, Timothy F; Capewell, Simon; O'Flaherty, Martin; Maziak, Wasim
2015-01-01
This study presents a cost-effectiveness analysis of salt reduction policies to lower coronary heart disease in Syria. Costs and benefits of a health promotion campaign about salt reduction (HP); labeling of salt content on packaged foods (L); reformulation of salt content within packaged foods (R); and combinations of the three were estimated over a 10-year time frame. Policies were deemed cost-effective if their cost-effectiveness ratios were below the region's established threshold of $38,997 purchasing power parity (PPP). Sensitivity analysis was conducted to account for the uncertainty in the reduction of salt intake. HP, L, and R+HP+L were cost-saving using the best estimates. The remaining policies were cost-effective (CERs: R=$5,453 PPP/LYG; R+HP=$2,201 PPP/LYG; R+L=$2,125 PPP/LYG). R+HP+L provided the largest benefit with net savings using the best and maximum estimates, while R+L was cost-effective with the lowest marginal cost using the minimum estimates. This study demonstrated that all policies were cost-saving or cost effective, with the combination of reformulation plus labeling and a comprehensive policy involving all three approaches being the most promising salt reduction strategies to reduce CHD mortality in Syria.
Walden, Anita; Nahm, Meredith; Barnett, M Edwina; Conde, Jose G; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E; Eisenstein, Eric L
2011-01-01
New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs.
Walden, Anita; Nahm, Meredith; Barnett, M. Edwina; Conde, Jose G.; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E.; Eisenstein, Eric L.
2012-01-01
Background New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. Methods We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. Main Outcome Measures The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Results Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Conclusion Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs. PMID:21335692
NASA Technical Reports Server (NTRS)
Freeman, W.; Ilcewicz, L.; Swanson, G.; Gutowski, T.
1992-01-01
The Structures Technology Program Office (STPO) at NASA LaRC has initiated development of a conceptual and preliminary designers' cost prediction model. The model will provide a technically sound method for evaluating the relative cost of different composite structural designs, fabrication processes, and assembly methods that can be compared to equivalent metallic parts or assemblies. The feasibility of developing cost prediction software in a modular form for interfacing with state-of-the-art preliminary design tools and computer aided design programs is being evaluated. The goal of this task is to establish theoretical cost functions that relate geometric design features to summed material cost and labor content in terms of process mechanics and physics. The output of the designers' present analytical tools will be input for the designers' cost prediction model to provide the designer with a database and deterministic cost methodology that allows one to trade and synthesize designs with both cost and weight as objective functions for optimization. This paper presents the team members, approach, goals, plans, and progress to date for development of COSTADE (Cost Optimization Software for Transport Aircraft Design Evaluation).
Software Technology for Adaptable, Reliable Systems (STARS)
1994-03-25
Cost models cited include: Timeline (3), SECOMO (3), SEER (3), GSFC Software Engineering Lab Model (1), SLIM (4), SEER-SEM (1), SPQR (2), PRICE-S (2), internally developed models (3), APMSS (1), SASET (Software Architecture Sizing and Estimating Tool) (2), MicroMan II (2), and LCM (Logistics Cost Model) (2).
Managers Handbook for Software Development
NASA Technical Reports Server (NTRS)
Agresti, W.; Mcgarry, F.; Card, D.; Page, J.; Church, V.; Werking, R.
1984-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses of and experience with flight dynamics software development. The management aspects of organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying are described.
ERIC Educational Resources Information Center
Weaver, Dave, Ed.
This document consists of 30 microcomputer software package evaluations prepared for the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Educational Laboratory (NWREL). The concise, single-sheet resume describing and evaluating each software package includes source, cost, ability level,…
An Ontology for Software Engineering Education
ERIC Educational Resources Information Center
Ling, Thong Chee; Jusoh, Yusmadi Yah; Adbullah, Rusli; Alwi, Nor Hayati
2013-01-01
Software agents communicate using ontologies. It is important to build an ontology for a specific domain such as Software Engineering Education. Building an ontology from scratch is not only hard but also incurs significant time and cost. This study aims to propose an ontology through adaptation of an existing ontology which is originally built based on a…
Integrating Model-Based Verification into Software Design Education
ERIC Educational Resources Information Center
Yilmaz, Levent; Wang, Shuo
2005-01-01
Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; enhanced automated code generation systems tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.
Office-Based Three-Dimensional Printing Workflow for Craniomaxillofacial Fracture Repair.
Elegbede, Adekunle; Diaconu, Silviu C; McNichols, Colton H L; Seu, Michelle; Rasko, Yvonne M; Grant, Michael P; Nam, Arthur J
2018-03-08
Three-dimensional printing of patient-specific models is being used in various aspects of craniomaxillofacial reconstruction. Printing is typically outsourced to off-site vendors, the main disadvantages being increased costs and production time. Office-based 3-dimensional printing has been proposed as a means to reduce costs and delays, but it remains largely underused because of the perception among surgeons that it is futuristic, highly technical, and prohibitively expensive. The goal of this report is to demonstrate the feasibility and ease of incorporating in-office 3-dimensional printing into the standard workflow for facial fracture repair. Patients with complex mandible fractures requiring open repair were identified. Open-source software was used to create virtual 3-dimensional skeletal models of the initial injury pattern and then of the ideally reduced fractures, based on preoperative computed tomography (CT) scan images. The virtual 3-dimensional skeletal models were then printed in our office using a commercially available 3-dimensional printer and bioplastic filament. The 3-dimensional skeletal models were used as templates to bend and shape titanium plates that were subsequently used for intraoperative fixation. Average print time was 6 hours. Excluding the 1-time cost of the 3-dimensional printer of $2500, roughly the cost of a single commercially produced model, the average material cost to print 1 model mandible was $4.30. Postoperative CT imaging demonstrated precise, predicted reduction in all patients. Office-based 3-dimensional printing of skeletal models can be routinely used in the repair of facial fractures in an efficient and cost-effective manner.
Digital radiography: optimization of image quality and dose using multi-frequency software.
Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D
2012-09-01
New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. The aim was to examine whether the use of MFP software reduces the radiation dose without compromising quality in DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% compared to the reference image. The software's impact on image quality was found to be significant for dose (mAs), dynamic range dark region, and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.
ERIC Educational Resources Information Center
McWilliams, Peter
1982-01-01
Describes the unique features, available software, performance capabilities, system options, costs, advantages, disadvantages, and eccentricities of the Osborne 1 microcomputer. A table summarizes specifications, features, and costs. (JL)
System integration test plan for HANDI 2000 business management system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, D.
This document presents the system integration test plan for the Commercial-Off-The-Shelf (COTS) PassPort (PP) and PeopleSoft (PS) software, and the custom software created to work with the COTS products. The PP software is an integrated application for Accounts Payable, Contract Management, Inventory Management, Purchasing, and Material Safety Data Sheets. The PS software is an integrated application for Project Costing, General Ledger, Human Resources/Training, Payroll, and Base Benefits.
Inertial Upper Stage (IUS) software analysis
NASA Technical Reports Server (NTRS)
Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.
1979-01-01
The Inertial Upper Stage (IUS) system, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories, is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness of software verification are described. IUS tasks discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.
Improvement of Computer Software Quality through Software Automated Tools.
1986-08-31
The requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost effective software. Therefore, government and industry
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1993-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low from high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high-risk system components. We present experimental results obtained by classifying Ada components into two classes: likely or not likely to generate faults during system and acceptance test. We also evaluate the accuracy of the model and the insights it provides into the error-making process.
NASA Astrophysics Data System (ADS)
Fomina, E. V.; Kozhukhova, N. I.; Sverguzova, S. V.; Fomin, A. E.
2018-05-01
In this paper, the regression equations method for the design of construction materials was studied. Regression and polynomial equations representing the correlation between the studied parameters were proposed. The logic design and software interface of the regression equations method focused on parameter optimization to provide an energy-saving effect at the design stage of autoclave aerated concrete, considering the replacement of traditionally used quartz sand by a coal mining by-product, argillite. A mathematical model represented by a quadratic polynomial for the design of experiment was obtained using calculated and experimental data. This allowed the estimation of the relationship between the composition and the final properties of the aerated concrete. The response surface, presented graphically in a nomogram, allowed the estimation of concrete properties in response to variation of composition within the x-space. An optimal range of argillite content was obtained, leading to a reduction in raw materials demand, development of the target plastic strength of the aerated concrete, and a reduction in curing time before autoclave treatment. In general, this method allows the design of autoclave aerated concrete with the required performance without additional resource and time costs.
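The second-order response model described above can be fitted by ordinary least squares. Below is a minimal Python sketch for two factors; the factor values and strength responses are hypothetical placeholders, not the paper's data.

import numpy as np

# factors: x1 = argillite replacement fraction, x2 = curing time (both hypothetical)
x1 = np.array([0.0, 0.1, 0.2, 0.3, 0.1, 0.2, 0.3, 0.0, 0.2])
x2 = np.array([2.0, 2.0, 2.0, 2.0, 4.0, 4.0, 4.0, 6.0, 6.0])
y  = np.array([1.1, 1.4, 1.6, 1.3, 1.5, 1.8, 1.5, 1.2, 1.7])  # plastic strength

# design matrix for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))

# the fitted surface can then be scanned over the x-space (as in the nomogram)
# to locate the optimal argillite content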
Using the Microcomputer to Teach about Nuclear Energy.
ERIC Educational Resources Information Center
Saltinski, Ronald
1984-01-01
Examines various types of software useful in teaching about nuclear energy. Includes a list of 11 software resources (including program name, source and cost, system requirements, and brief comments about the program). (JN)
ERIC Educational Resources Information Center
Classroom Computer Learning, 1990
1990-01-01
Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)
The use of hypermedia to increase the productivity of software development teams
NASA Technical Reports Server (NTRS)
Coles, L. Stephen
1991-01-01
Rapid progress in low-cost commercial PC-class multimedia workstation technology will potentially have a dramatic impact on the productivity of distributed work groups of 50-100 software developers. Hypermedia/multimedia involves the seamless integration, in a graphical user interface (GUI), of a wide variety of data structures, including high-resolution graphics, maps, images, voice, and full-motion video. Hypermedia will normally require the manipulation of large dynamic files, for which relational database technology and SQL servers are essential. Basic machine architecture, special-purpose video boards, video equipment, optical memory, software needed for animation, network technology, and the anticipated increase in productivity that will result from the introduction of hypermedia technology are covered. It is suggested that the cost of the hardware and software to support an individual multimedia workstation will be on the order of $10,000.
NASA Astrophysics Data System (ADS)
Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad
2016-12-01
3D printing of the mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon Signed Rank Test and the Hausdorff Distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that from the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise operational cost.
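The Hausdorff-distance comparison used above can be sketched as follows for two surface models treated as point clouds (for example, vertices extracted from the STL files); the random arrays here are placeholders for the actual mandible meshes.

import numpy as np
from scipy.spatial import cKDTree

def hausdorff(points_a, points_b):
    # symmetric Hausdorff distance between two (N, 3) point sets
    d_ab, _ = cKDTree(points_b).query(points_a)  # nearest B-point for each A-point
    d_ba, _ = cKDTree(points_a).query(points_b)  # nearest A-point for each B-point
    return max(d_ab.max(), d_ba.max())

# placeholder clouds standing in for the Mimics and MITK mandible models
mimics_pts = np.random.rand(1000, 3)
mitk_pts = mimics_pts + np.random.normal(scale=0.01, size=(1000, 3))
print(f"Hausdorff distance: {hausdorff(mimics_pts, mitk_pts):.4f}")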
Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien
2018-01-01
In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
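One of the resampling techniques mentioned above, the bootstrap, is easy to sketch for a survey-based total-cost estimate. The facility costs and program size below are hypothetical; a real analysis would also respect the survey's sampling design and weights, and calibration would tighten the interval further.

import numpy as np

rng = np.random.default_rng(0)
facility_costs = rng.gamma(shape=2.0, scale=5000.0, size=40)  # sampled facilities (hypothetical)
n_program_facilities = 600                                    # facilities in the whole program

totals = []
for _ in range(5000):
    # resample facilities with replacement and scale the mean up to the program
    resample = rng.choice(facility_costs, size=facility_costs.size, replace=True)
    totals.append(resample.mean() * n_program_facilities)

lo, hi = np.percentile(totals, [2.5, 97.5])
print(f"total cost: {np.mean(totals):,.0f} (95% CI {lo:,.0f} - {hi:,.0f})")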
Modeling Physical Systems Using Vensim PLE Systems Dynamics Software
NASA Astrophysics Data System (ADS)
Widmark, Stephen
2012-02-01
Many physical systems are described by time-dependent differential equations or systems of such equations. This makes it difficult for students in an introductory physics class to solve many real-world problems, since these students typically have little or no experience with this kind of mathematics. In my high school physics classes, I address this problem by having my students use a variety of software solutions to model physical systems described by differential equations. These include spreadsheets, applets, software my students themselves create, and systems dynamics software. For the latter, cost is often the main issue in choosing a solution for use in a public school, and so I researched no-cost software. I found Sphinx SD, OptiSim, Systems Dynamics, Simile (Trial Edition), and Vensim PLE. In evaluating each of these solutions, I looked for the fewest restrictions in the license for educational use, ease of use by students, power, and versatility. In my opinion, Vensim PLE best fulfills these criteria.
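As a concrete example of the kind of time-dependent problem the article has students model, here is a minimal Euler-method integration of a falling object with quadratic drag, dv/dt = g - (k/m) * v^2; the parameter values are arbitrary illustrations.

g, k, m = 9.8, 0.25, 70.0   # gravity (m/s^2), drag coefficient (kg/m), mass (kg)
v, t, dt = 0.0, 0.0, 0.01   # initial velocity, time, time step (s)

while t < 10.0:
    v += (g - (k / m) * v ** 2) * dt  # Euler step on the velocity
    t += dt

print(f"speed after {t:.1f} s: {v:.1f} m/s (terminal ~ {(g * m / k) ** 0.5:.1f} m/s)")

A systems-dynamics tool such as Vensim PLE expresses the same stock-and-flow structure graphically, which is why it suits students without calculus.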
Cost of Sawing Timber (COST) Module (Version 1.0) for Windows®
A. Jefferson Palmer, Jr.; Janice K. Wiedenbeck; Robert W. Mayer
2005-01-01
The Cost of Sawing Timber (COST) Module calculates the cost of operations per minute and per thousand board feet for a hardwood sawmill. It may be used independently or as a source of cost information for use in sawmill efficiency software such as the SOLVE program. Cost figures are calculated on the basis of information entered by the user. Sawmill managers use these...
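The core arithmetic of such a module can be sketched in a few lines of Python; the cost categories and figures below are hypothetical placeholders, not values from the COST module:

    # Hypothetical annual sawmill cost inputs (USD/year).
    annual_costs = {"labor": 620_000, "equipment": 180_000,
                    "energy": 95_000, "overhead": 140_000}

    operating_minutes_per_year = 240 * 8 * 60   # 240 days at 8 h/day
    production_mbf_per_year = 4_800             # thousand board feet sawn/year

    total = sum(annual_costs.values())
    cost_per_minute = total / operating_minutes_per_year
    cost_per_mbf = total / production_mbf_per_year

    print(f"cost/minute: ${cost_per_minute:.2f}")
    print(f"cost/MBF:    ${cost_per_mbf:.2f}")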
Highway User Benefit Analysis System Research Project #128
DOT National Transportation Integrated Search
2000-10-01
In this research, a methodology for estimating road user costs of various competing alternatives was developed. Also, software was developed to calculate the road user cost, perform economic analysis and update cost tables. The methodology is based o...
"Big Bang" for NASA's Buck: Nearly Three Years of EUVE Mission Operations at UCB
NASA Astrophysics Data System (ADS)
Stroozas, B. A.; Nevitt, R.; McDonald, K. E.; Cullison, J.; Malina, R. F.
1999-12-01
After over seven years in orbit, NASA's Extreme Ultraviolet Explorer (EUVE) satellite continues to perform flawlessly and with no significant loss of science capabilities. EUVE continues to produce important and exciting science results and, with reentry not expected until 2003-2004, many more such discoveries await. In the nearly three years since the outsourcing of EUVE from NASA's Goddard Space Flight Center, the small EUVE operations team at the University of California at Berkeley (UCB) has successfully conducted all aspects of the EUVE mission -- from satellite operations, science and mission planning, and data processing, delivery, and archival, to software support, systems administration, science management, and overall mission direction. This paper discusses UCB's continued focus on automation and streamlining, in all aspects of the Project, as the means to maximize EUVE's overall scientific productivity while minimizing costs. Multitasking, non-traditional work roles, and risk management have led to expanded observing capabilities while achieving significant cost reductions and maintaining the mission's historical 99% return. This work was funded under NASA Cooperative Agreement NCC5-138.
VDA, a Method of Choosing a Better Algorithm with Fewer Validations
Kluger, Yuval
2011-01-01
The multitude of bioinformatics algorithms designed for performing a particular computational task presents end-users with the problem of selecting the most appropriate computational tool for analyzing their biological data. The choice of the best available method is often based on expensive experimental validation of the results. We propose an approach to design validation sets for method comparison and performance assessment that are effective in terms of cost and discrimination power. Validation Discriminant Analysis (VDA) is a method for designing a minimal validation dataset to allow reliable comparisons between the performances of different algorithms. Implementation of our VDA approach achieves this reduction by selecting predictions that maximize the minimum Hamming distance between algorithmic predictions in the validation set. We show that VDA can be used to correctly rank algorithms according to their performances. These results are further supported by simulations and by realistic algorithmic comparisons in silico. VDA is a novel, cost-efficient method for minimizing the number of validation experiments necessary for reliable performance estimation and fair comparison between algorithms. Our VDA software is available at http://sourceforge.net/projects/klugerlab/files/VDA/ PMID:22046256
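As an illustration of the underlying selection principle (a simplified sketch, not the authors' released VDA software linked above), a greedy Python procedure can grow a validation set so that the minimum pairwise Hamming distance between the algorithms' predictions on the selected items is as large as possible:

    import numpy as np
    from itertools import combinations

    def min_pairwise_hamming(preds, items):
        """Minimum Hamming distance over all algorithm pairs, computed on
        the predictions restricted to the selected items."""
        sub = preds[items]                       # (|items|, n_algorithms)
        return min(np.sum(sub[:, a] != sub[:, b])
                   for a, b in combinations(range(sub.shape[1]), 2))

    def select_validation_set(preds, k):
        """Greedy sketch of the VDA idea: grow a validation set on which
        the algorithms disagree as much as possible, which is what makes
        the set discriminative for ranking them."""
        n = preds.shape[0]
        chosen = []
        for _ in range(k):
            remaining = [i for i in range(n) if i not in chosen]
            best = max(remaining,
                       key=lambda i: min_pairwise_hamming(preds, chosen + [i]))
            chosen.append(best)
        return chosen

    # Toy example: 6 candidate items called by 4 algorithms (binary calls).
    preds = np.array([[0, 0, 0, 0],
                      [1, 1, 1, 1],
                      [0, 1, 0, 1],
                      [1, 0, 1, 0],
                      [0, 0, 1, 1],
                      [1, 1, 0, 0]])
    print(select_validation_set(preds, k=3))

Greedy selection is a heuristic for the max-min objective; it favors items on which the algorithms disagree, exactly the items whose experimental validation separates their performances.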
On a Formal Tool for Reasoning About Flight Software Cost Analysis
NASA Technical Reports Server (NTRS)
Spagnuolo, John N., Jr.; Stukes, Sherry A.
2013-01-01
A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.
Telemedicine in Cardiology - Perspectives in Bosnia and Herzegovina
Naser, Nabil; Tandir, Salih; Begic, Edin
2017-01-01
Introduction: The aim of this article was to present the perspectives of telemedicine in the field of cardiology in Bosnia and Herzegovina. Material and methods: The article is descriptive in character and presents a review of the literature. Results: Information technology can be applied in the education of students, from the basic medical sciences through the clinical subjects. Information technologies are used for ECG analysis and 24-h ECG Holter monitoring, which detects different rhythm disorders. Software packages for electrocardiogram analysis allow recordings to be shared and interpreted via mobile phones, so that a specialist, or several experienced specialists consulting across different illnesses and cities, can form a complete picture of the patient while still in the ambulance. Image segmentation algorithms are significant for the quantification and diagnosis of anatomical and pathological structures, and 3D representation plays an important role in education, topographic and clinical anatomy, radiology, and pathology, as well as in clinical cardiology itself, especially for the identification of coronary arteries in multislice computed tomography angiography. Interactive video consultations with subspecialists from the country and the region in adult cardiology, adult interventional cardiology, cardiovascular surgery, and pediatric invasive and non-invasive cardiology enable better access to heart specialists and subspecialists, more accurate diagnosis, better treatment, reduced mortality, and a significant reduction in costs. Conclusion: Telemedicine is entering Bosnia and Herzegovina in slow steps, but the potential exists. It is necessary to educate the medical staff and to provide an attractive environment for software engineers. Investing in infrastructure and equipment is imperative, as is a positive climate for its implementation. PMID:29284918
Solar Asset Management Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iverson, Aaron; Zviagin, George
Ra Power Management (RPM) has developed a cloud-based software platform that manages the financial and operational functions of third-party financed solar projects throughout their lifecycle. RPM's software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and a reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.
Preliminary design of the HARMONI science software
NASA Astrophysics Data System (ADS)
Piqueras, Laure; Jarno, Aurelien; Pécontal-Rousset, Arlette; Loupias, Magali; Richard, Johan; Schwartz, Noah; Fusco, Thierry; Sauvage, Jean-François; Neichel, Benoît; Correia, Carlos M.
2016-08-01
This paper introduces the science software of HARMONI. The Instrument Numerical Model simulates the instrument from the optical point of view and provides synthetic exposures simulating detector readouts from data-cubes containing astrophysical scenes. The Data Reduction Software converts raw-data frames into a fully calibrated, scientifically usable data cube. We present the functionalities and the preliminary design of this software, describe some of the methods and algorithms used and highlight the challenges that we will have to face.
Enough to Go 'Round? Thinking Smart about Total Cost of Ownership
ERIC Educational Resources Information Center
McIntire, Todd
2006-01-01
Total cost of ownership or TCO refers to the life cycle of costs for technology, including both direct and indirect expenses. TCO includes costs incurred by capital (hardware, software, and facilities); administration and operation (planning, upgrade, replacement, and technical support); and end-user operation (staff development and user…
Gallagher, James; O'Sullivan, David; McCarthy, Suzanne; Gillespie, Paddy; Woods, Noel; O'Mahony, Denis; Byrne, Stephen
2016-04-01
A recent cluster randomised controlled trial (RCT) conducted in an Irish hospital evaluating a structured pharmacist review of medication (SPRM), supported by computerised clinical decision support software (CDSS), demonstrated positive outcomes in terms of reduction of adverse drug reactions (ADRs). The aim of this study was to examine the cost effectiveness of pharmacists applying an SPRM in conjunction with CDSS to older hospitalised patients, compared with usual pharmaceutical care. Cost-effectiveness analysis alongside a cluster RCT. The trial was conducted in a tertiary hospital in the south of Ireland. Patients in the intervention arm (n = 361) received a multifactorial intervention consisting of medicines reconciliation, deployment of CDSS and generation of a pharmaceutical care plan. Patients in the control arm (n = 376) received usual care from the hospital pharmacy team. Incremental cost effectiveness was examined in terms of costs to the healthcare system and an outcome measure of ADRs during an inpatient hospital stay. Uncertainty in the analysis was explored using a cost-effectiveness acceptability curve (CEAC). On average, the intervention arm was the dominant strategy in terms of cost effectiveness. Compared with usual care (control), the intervention was associated with a decrease of €807 [95% confidence interval (CI) -3443 to 1829; p = 0.548] in mean healthcare cost, and a decrease in the mean number of ADR events per patient of -0.064 (95% CI -0.135 to 0.008; p = 0.081). The probability of the intervention being cost effective at respective threshold values of €0, €250, €500, €750, €1000 and €5000 was 0.707, 0.713, 0.716, 0.718, 0.722 and 0.784, respectively. Based on the evidence presented, SPRM/CDSS is likely to be determined to be cost effective compared with usual pharmaceutical care. However, neither incremental costs nor effects demonstrated a statistically significant difference; therefore, the results of this single-site study should be interpreted with caution.
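For readers unfamiliar with how a CEAC is computed, the following Python sketch derives acceptability probabilities from simulated draws of incremental costs and effects via the net-monetary-benefit rule; the distributions are rough normal approximations shaped like the reported means and confidence intervals, not the trial's bootstrapped cluster-level data:

    import numpy as np

    rng = np.random.default_rng(1)

    # Simulated draws of incremental cost (EUR) and incremental effect
    # (ADRs averted per patient); SEs back-calculated roughly from the CIs.
    n = 10_000
    d_cost = rng.normal(-807, 1345, n)      # negative = intervention saves money
    d_effect = rng.normal(0.064, 0.036, n)  # ADRs averted (positive = better)

    for wtp in [0, 250, 500, 750, 1000, 5000]:  # willingness to pay per ADR averted
        nmb = wtp * d_effect - d_cost           # incremental net monetary benefit
        print(f"lambda = {wtp:>5}: P(cost-effective) = {np.mean(nmb > 0):.3f}")

The curve traces P(cost-effective) as the willingness-to-pay threshold grows; because the intervention was cost-saving on average, the probability starts above 0.5 even at a threshold of zero, as in the reported results.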
Software Defined Radios - Architectures, Systems and Functions
NASA Technical Reports Server (NTRS)
Sims, Herb
2017-01-01
Software Defined Radio (SDR) technology has been proven in the commercial sector since the early 90's. Today's rapid advancement in mobile telephone reliability and power management capabilities exemplifies the effectiveness of SDR technology for the modern communications market. SDR technology offers the potential to revolutionize satellite transponder technology by increasing science data throughput capability by at least an order of magnitude. While the SDR is adaptive in nature and is "one-size-fits-all" by design, conventional transponders are built to a specific platform and must be redesigned for every new bus. The SDR uses a minimum amount of analog/Radio Frequency (RF) components to up/down-convert the RF signal to/from a digital format. Once analog data is digitized, all processing is performed using hardware logic. Typical SDR processes include filtering, modulation, up/down conversion, and demodulation. These innovations have reduced the cost of transceivers, decreased their power requirements, and brought a commensurate reduction in volume. An additional pay-off is the increased flexibility of the SDR: the same hardware can implement multiple transponder types by altering hardware logic, with no change of analog hardware required, all of which can ultimately be accomplished in orbit.
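As a minimal illustration of the digital down-conversion and filtering steps mentioned above (a numpy sketch of the generic technique, not NASA's flight firmware), the following recovers an AM envelope from a sampled carrier:

    import numpy as np

    fs = 1_000_000                      # sample rate (Hz), illustrative
    f_carrier = 250_000                 # carrier frequency (Hz)
    t = np.arange(4096) / fs

    # Simulated received signal: 5 kHz tone amplitude-modulated on the carrier.
    baseband = 1.0 + 0.5 * np.cos(2 * np.pi * 5_000 * t)
    rf = baseband * np.cos(2 * np.pi * f_carrier * t)

    # Digital down-conversion: mix with a complex local oscillator ...
    lo = np.exp(-2j * np.pi * f_carrier * t)
    mixed = rf * lo

    # ... then low-pass filter (a simple moving average here; real SDRs use
    # CIC/FIR stages in FPGA logic) to remove the 2*f_carrier image.
    kernel = np.ones(64) / 64
    i = np.convolve(mixed.real, kernel, mode="same")
    q = np.convolve(mixed.imag, kernel, mode="same")

    envelope = 2 * np.sqrt(i**2 + q**2)   # approximately recovers the envelope
    print(envelope[1000:1005])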
NASA Astrophysics Data System (ADS)
Guarnieri, Vittorio; Francini, Franco
1997-12-01
The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, the services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactic environment. On the basis of these considerations, a set of software tools for designing CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
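A minimal Python sketch in the same spirit (an illustration of the spherical-wave idea, not the authors' code) shows the central computation: superpose spherical waves from object points with a reference wave on the hologram plane, then binarize the interference pattern for the printer:

    import numpy as np

    wavelength = 633e-9                    # He-Ne red, in metres (illustrative)
    k = 2 * np.pi / wavelength

    # Hologram plane: a grid of printer pixels.
    n, pitch = 512, 10e-6                  # 512 x 512 cells at 10 um pitch
    coords = (np.arange(n) - n / 2) * pitch
    X, Y = np.meshgrid(coords, coords)

    # Object: a few point sources (spherical-wave emitters) behind the plane;
    # the 1/r amplitude falloff is ignored for simplicity.
    points = [(0.0, 0.0, 0.02), (0.5e-3, 0.3e-3, 0.02)]   # (x, y, z) metres

    field = np.zeros((n, n), dtype=complex)
    for px, py, pz in points:
        r = np.sqrt((X - px)**2 + (Y - py)**2 + pz**2)
        field += np.exp(1j * k * r)        # spherical wave phase k*r

    reference = 1.0                        # on-axis plane reference wave

    # Binarize the interference pattern -- transparent where constructive,
    # opaque elsewhere -- giving the bitmap sent to the printer.
    intensity = np.abs(field + reference)**2
    binary_cgh = (intensity > np.median(intensity)).astype(np.uint8)
    print(binary_cgh.shape, binary_cgh.mean())   # fill factor ~0.5 by design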
The environmental control and life support system advanced automation project
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.
1991-01-01
The objectives of the ECLSS Advanced Automation project include reducing the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing key personnel with its usage.
Starlink Software Developments
NASA Astrophysics Data System (ADS)
Bly, M. J.; Giaretta, D.; Currie, M. J.; Taylor, M.
Some current and upcoming software developments from Starlink were demonstrated. These included invoking traditional Starlink applications via web services, the current version of the ORAC-DR reduction pipeline, and some new Java-based tools including Treeview, an interactive explorer of hierarchical data structures.
Delgado, M. Kit; Staudenmayer, Kristan L.; Wang, N. Ewen; Spain, David A.; Weir, Sharada; Owens, Douglas K.; Goldhaber-Fiebert, Jeremy D.
2014-01-01
Objective We determined the minimum mortality reduction that helicopter emergency medical services (HEMS) should provide relative to ground EMS for the scene transport of trauma victims to offset higher costs, inherent transport risks, and inevitable overtriage of minor injury patients. Methods We developed a decision-analytic model to compare the costs and outcomes of helicopter versus ground EMS transport to a trauma center from a societal perspective over a patient's lifetime. We determined the mortality reduction needed to make helicopter transport cost less than $100,000 and $50,000 per quality adjusted life year (QALY) gained compared to ground EMS. Model inputs were derived from the National Study on the Costs and Outcomes of Trauma (NSCOT), National Trauma Data Bank, Medicare reimbursements, and literature. We assessed robustness with probabilistic sensitivity analyses. Results HEMS must provide a minimum of a 17% relative risk reduction in mortality (1.6 lives saved/100 patients with the mean characteristics of the NSCOT cohort) to cost less than $100,000 per QALY gained and a reduction of at least 33% (3.7 lives saved/100 patients) to cost less than $50,000 per QALY. HEMS becomes more cost-effective with significant reductions in minor injury patients triaged to air transport or if long-term disability outcomes are improved. Conclusions HEMS needs to provide at least a 17% mortality reduction or a measurable improvement in long-term disability to compare favorably to other interventions considered cost-effective. Given current evidence, it is not clear that HEMS achieves this mortality or disability reduction. Reducing overtriage of minor injury patients to HEMS would improve its cost-effectiveness. PMID:23582619
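The break-even logic of such a threshold analysis can be illustrated with a few lines of Python; the inputs below are round placeholders chosen so the arithmetic lands near the reported figures, not the actual NSCOT-derived model parameters:

    # Round illustrative placeholders, not the study's actual inputs.
    incremental_cost = 16_000        # extra cost of HEMS per transport (USD)
    qalys_per_life = 10.0            # discounted QALYs gained per life saved
    baseline_mortality = 0.095       # mortality with ground EMS

    for threshold in (100_000, 50_000):          # USD per QALY gained
        # Lives saved per patient needed for the ICER to hit the threshold:
        lives_needed = incremental_cost / (threshold * qalys_per_life)
        rrr = lives_needed / baseline_mortality  # required relative risk reduction
        print(f"${threshold:,}/QALY -> {lives_needed * 100:.1f} lives/100 "
              f"patients, RRR ≈ {rrr:.0%}")

Halving the acceptable cost per QALY doubles the number of lives HEMS must save per patient, which is why the required relative risk reduction roughly doubles between the two thresholds.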
Software Review. Macintosh Laboratory Automation: Three Software Packages.
ERIC Educational Resources Information Center
Jezl, Barbara Ann
1990-01-01
Reviewed are "LABTECH NOTEBOOK,""LabVIEW," and "Parameter Manager pmPLUS/pmTALK." Each package is described including functions, uses, hardware, and costs. Advantages and disadvantages of this type of laboratory approach are discussed. (CW)
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process, so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into the development of advanced technologies for Computer Aided Software Engineering (CASE).
NASA Technical Reports Server (NTRS)
Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William
2012-01-01
AdapChem software enables high efficiency, low computational cost, and enhanced accuracy in computational fluid dynamics (CFD) numerical simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models instead of the larger, full chemistry models to evolve the calculation, while ensuring that the same accuracy is obtained for steady-state CFD reacting-flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations. AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions: instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture the chemical kinetics accurately at a fraction of the cost of the traditional single-mechanism approach.
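A schematic Python sketch of the dispatch idea (with invented mechanisms and validity thresholds, not the AdapChem internals) follows; each cell simply receives the cheapest reduced model whose validity region covers its local state:

    # Schematic adaptive-chemistry dispatch. Mechanisms, species counts, and
    # validity tests below are invented for illustration.
    MECHANISMS = [
        # (name, n_species, is_valid(temperature_K, fuel_mass_fraction))
        ("inert",    5,   lambda T, Yf: Yf < 1e-6),
        ("reduced",  20,  lambda T, Yf: T < 1200),
        ("skeletal", 50,  lambda T, Yf: T < 1800),
        ("full",     120, lambda T, Yf: True),     # always-valid fallback
    ]

    def pick_mechanism(T, Yf):
        # Return the first (cheapest) mechanism whose validity test passes.
        for name, nspec, ok in MECHANISMS:
            if ok(T, Yf):
                return name, nspec

    cells = [(300.0, 0.0), (900.0, 0.05), (1600.0, 0.08), (2100.0, 0.10)]
    for T, Yf in cells:
        name, nspec = pick_mechanism(T, Yf)
        print(f"T={T:6.0f} K, Y_fuel={Yf:.2f} -> {name:8s} ({nspec} species)")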
An Inquiry into the Cost of Post Deployment Software Support (PDSS)
1989-09-01
The increasing cost of software maintenance is taking a larger share of the military budget each year... increments as needed (3:59). The second page of the Form 75 starts with a section stating how the hours, and consequently the funds, will be allocated to... length of time required, the timeline can be in hourly, weekly, monthly, or quarterly increments. Some milestones included are formal approval, test
Xie, Yujing; Zhao, Laijun; Xue, Jian; Hu, Qingmi; Xu, Xiang; Wang, Hongbo
2016-12-15
How to effectively control severe regional air pollution has recently become a focus of global concern. The non-cooperative reduction model (NCRM) is still the main air pollution control pattern in China, but it is both ineffective and costly, because each province must independently fight air pollution. Thus, we proposed a cooperative reduction model (CRM), with the goal of maximizing the reduction in adverse health effects (AHEs) at the lowest cost by encouraging neighboring areas to jointly control air pollution. CRM has two parts: a model of optimal pollutant removal rates using two optimization objectives (maximizing the reduction in AHEs and minimizing pollutant reduction cost) while meeting the regional pollution control targets set by the central government, and a model that allocates the cooperation benefits (i.e., health improvement and cost reduction) among the participants according to their contributions using the Shapley value method. We applied CRM to the case of sulfur dioxide (SO2) reduction in the Yangtze River Delta region. Based on data from 2003 to 2013, and using mortality due to respiratory and cardiovascular diseases as the health endpoints, CRM saves 437 more lives than NCRM, amounting to 12.1% of the reduction under NCRM. CRM also reduced costs by US $65.8×10^6 compared with NCRM, which is 5.2% of the total cost of NCRM. Thus, CRM performs significantly better than NCRM. Each province obtains significant benefits from cooperation, which can motivate provinces to cooperate actively in the long term. A sensitivity analysis was performed to quantify the effects of parameter values on the cooperation benefits. The results show that CRM is not sensitive to changes in each province's pollutant carrying capacity or minimum pollutant removal capacity, but is sensitive to the maximum pollutant reduction capacity. Moreover, higher cooperation benefits are generated when a province's maximum pollutant reduction capacity increases. Copyright © 2016 Elsevier B.V. All rights reserved.
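The allocation step can be illustrated with a compact Python implementation of the Shapley value for three hypothetical provinces; the coalition values below are invented for illustration, not taken from the Yangtze River Delta case:

    from itertools import permutations

    # Hypothetical coalition benefits (e.g., cost savings, 10^6 USD): the
    # value of each subset of provinces cooperating.
    v = {frozenset(): 0,
         frozenset("A"): 10, frozenset("B"): 8, frozenset("C"): 6,
         frozenset("AB"): 24, frozenset("AC"): 20, frozenset("BC"): 17,
         frozenset("ABC"): 40}

    players = ["A", "B", "C"]

    def shapley(player):
        # Average the player's marginal contribution over all join orders.
        total = 0.0
        perms = list(permutations(players))
        for order in perms:
            before = frozenset(order[:order.index(player)])
            total += v[before | {player}] - v[before]
        return total / len(perms)

    alloc = {p: shapley(p) for p in players}
    print(alloc, "sum =", sum(alloc.values()))   # allocations sum to v(ABC)

Because the Shapley allocations always sum to the grand-coalition value and reward each participant in proportion to its marginal contributions, every province receives at least what it could secure alone, which is what sustains cooperation in the long term.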
Clark, Glenn T; Mulligan, Roseann; Baba, Kazuyoshi
2011-04-01
This article reports on the lessons learned while teaching an 8-week-long online course about the principles of clinical research design in Japan. Student activity data and how they relate to performance in the course are presented. As a prologue, this article focuses on the barriers and solutions to creating and delivering a web-based course, and it lists and discusses the most common concerns that educators often have about this process, namely the cost of the system and the time required of the faculty. Options that must be considered when selecting the support software and hardware needed to conduct a live-streaming, online video-based conference course are presented. The ancillary role of e-mail-based distribution lists as an essential instruction tool within an interactive, instructor-supervised online course is discussed. This article then discusses the inclusion of active-learning elements within an online course, as well as the pros and cons of open-book versus closed-book, proctored testing. Lastly, copyright issues the online instructor should know about are discussed. The student tracking data show that as the course progresses, students reduce their number of page viewings. We speculate that this reduction is due to a combination of conflicting priorities and the students' increasing efficiency at extracting the critical information. The article also concludes that the software and hardware costs to deliver an online course are relatively minor, but the faculty's time requirement is initially substantially higher than for teaching a conventional face-to-face course. Copyright © 2011 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Segura, Francisca; Bartolucci, Veronica; Andújar, José Manuel
2017-07-09
This work presents a hardware/software data acquisition system developed for monitoring the temperature in real time of the cells in Air-Cooled Polymer Electrolyte Fuel Cells (AC-PEFC). These fuel cells are of great interest because they can carry out, in a single operation, the processes of oxidation and refrigeration. This allows reduction of the weight, volume, cost and complexity of the control system in the AC-PEFC. In this type of PEFC (and in general in any PEFC), reliable monitoring of the temperature along the entire surface of the stack is fundamental, since a suitable temperature and a regular distribution thereof are key to better stack performance and a longer lifetime under the best operating conditions. The developed data acquisition (DAQ) system can perform non-intrusive temperature measurements of each individual cell of an AC-PEFC stack of any power (from watts to kilowatts). The stack power is related to the temperature gradient; i.e., a higher power corresponds to a larger stack surface, and consequently a higher temperature difference between the coldest and the hottest points. The developed DAQ system has been implemented with the low-cost open-source platform Arduino, and it is completed with a modular virtual instrument developed in NI LabVIEW. The temperature-vs-time evolution of all the cells of an AC-PEFC, both together and individually, can be registered and supervised. The paper explains the developed DAQ system comprehensively, together with experimental results that demonstrate the suitability of the system. PMID:28698497
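On the host side, the supervision loop of such a DAQ chain can be sketched in a few lines of Python using pyserial; the port name and the one-CSV-line-per-scan format are assumptions for illustration (the authors' instrument panel was built in NI LabVIEW):

    import serial  # pyserial; requires a connected board on PORT

    # Assumed framing: the Arduino streams one CSV line per scan,
    # e.g. "23.9,24.1,25.0,..." with one temperature (degC) per cell.
    PORT, BAUD = "/dev/ttyUSB0", 115200

    with serial.Serial(PORT, BAUD, timeout=2) as link:
        for _ in range(10):                   # read ten scans and report
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            temps = [float(x) for x in line.split(",")]
            spread = max(temps) - min(temps)  # hottest-coldest gradient
            print(f"{len(temps)} cells, min={min(temps):.1f} degC, "
                  f"max={max(temps):.1f} degC, spread={spread:.1f} degC")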
The NASA Mission Operations and Control Architecture Program
NASA Technical Reports Server (NTRS)
Ondrus, Paul J.; Carper, Richard D.; Jeffries, Alan J.
1994-01-01
The conflict between increases in space mission complexity and rapidly declining space mission budgets has created strong pressures to radically reduce the costs of designing and operating spacecraft. A key approach to achieving such reductions is reducing the development and operations costs of the supporting mission operations systems. One of the efforts which the Communications and Data Systems Division at NASA Headquarters is using to meet this challenge is the Mission Operations Control Architecture (MOCA) project. Technical direction of this effort has been delegated to the Mission Operations Division (MOD) of the Goddard Space Flight Center (GSFC). MOCA is to develop a mission control and data acquisition architecture, and supporting standards, to guide the development of future spacecraft and mission control facilities at GSFC. The architecture will reduce the need for around-the-clock operations staffing, obtain a high level of reuse of flight and ground software elements from mission to mission, and increase overall system flexibility by enabling the migration of appropriate functions from the ground to the spacecraft. The end results are to be an established way of designing the spacecraft-ground system interface for GSFC's in-house developed spacecraft, and a specification of the end-to-end spacecraft control process, including data structures, interfaces, and protocols, suitable for inclusion in solicitation documents for future flight spacecraft. A flight software kernel may be developed and maintained in such a condition that it can be offered as Government Furnished Equipment in solicitations. This paper describes the MOCA project, its current status, and the results to date.