Software Development as Music Education Research
ERIC Educational Resources Information Center
Brown, Andrew R.
2007-01-01
This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…
Communal Resources in Open Source Software Development
ERIC Educational Resources Information Center
Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit
2008-01-01
Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities that underlies Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…
75 FR 10439 - Cognitive Radio Technologies and Software Defined Radios
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-08
... Technologies and Software Defined Radios AGENCY: Federal Communications Commission. ACTION: Final rule. SUMMARY... concerning the use of open source software to implement security features in software defined radios (SDRs... ongoing technical developments in cognitive and software defined radio (SDR) technologies. 2. On April 20...
NASA Astrophysics Data System (ADS)
Yakovlev, V. V.; Shakirov, S. R.; Gilyov, V. M.; Shpak, S. I.
2017-10-01
In this paper, we propose a variant of constructing automation systems for aerodynamic experiments on the basis of modern, domestically developed hardware and software. The structure of a universal control and data-collection system for performing experiments in wind tunnels of continuous, periodic or short-duration action is proposed. The proposed hardware and software development tools of ICT SB RAS and ITAM SB RAS, as well as subsystems based on them, can be widely applied to any scientific and experimental installations, as well as to the automation of technological processes in production.
The Strategic Organization of Skill
NASA Technical Reports Server (NTRS)
Roberts, Ralph
1996-01-01
Eye-movement software was developed alongside several studies of expert-novice differences in the acquisition and organization of skill. These studies focused on how increasingly complex strategies utilize and incorporate visual look-ahead to calibrate action. Software for collecting, calibrating, and scoring eye movements was refined and updated. New algorithms were developed for analyzing corneal-reflection eye-movement data that detect the location of saccadic eye movements in space and time. Two full-scale studies were carried out that examined how experts use foveal and peripheral vision to acquire information about upcoming environmental circumstances in order to plan future actions accordingly.
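Saccade detection of the kind described here is commonly done by velocity thresholding over calibrated gaze samples. A minimal sketch under that assumption follows; the 300 deg/s threshold is a conventional value, not one taken from these studies:

```python
def detect_saccades(x, y, t, velocity_threshold=300.0):
    """Flag samples whose point-to-point velocity (deg/s) exceeds a threshold.

    x, y: gaze coordinates in degrees of visual angle; t: timestamps in seconds.
    Returns the indices at which saccadic motion is detected, locating each
    saccade in space (via x, y at that index) and time (via t).
    """
    saccade_indices = []
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        if dt <= 0:
            continue  # skip duplicated or out-of-order timestamps
        dist = ((x[i] - x[i - 1]) ** 2 + (y[i] - y[i - 1]) ** 2) ** 0.5
        if dist / dt > velocity_threshold:
            saccade_indices.append(i)
    return saccade_indices
```

A 10-degree jump between two samples 10 ms apart (1000 deg/s) would be flagged, while slow pursuit or fixation drift would not.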
NASA Technical Reports Server (NTRS)
Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.
1992-01-01
Software process assessments (SPAs) are part of an ongoing program of continuous quality improvement at AT&T. Software development organizations have found them very beneficial for identifying the issues facing the organization and the actions required to increase both quality and productivity.
NASA Technical Reports Server (NTRS)
1992-01-01
This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.
Read, Sue; Nte, Sol; Corcoran, Patsy; Stephens, Richard
2013-05-01
Loss is a universal experience and death is perceived as the ultimate loss. The overarching aim of this research is to produce a qualitative, flexible, interactive, computerised tool to support the facilitation of emotional expressions around loss for people with intellectual disabilities. This paper explores the process of using Participatory Action Research (PAR) to develop this tool. Participatory Action Research provided the indicative framework for the process of developing a software tool that is likely to be used in practice. People with intellectual disability worked alongside researchers to produce an accessible, flexible piece of software that can facilitate storytelling around loss and bereavement and promote spontaneous expression that can be shared with others. This tool enables individuals to capture experiences in a storyboard format that can be stored, easily retrieved and printed out, and that could feasibly be personalised by the insertion of photographs. © 2012 Blackwell Publishing Ltd.
Ohmann, C; Eich, H P; Sippel, H
1998-01-01
This paper describes the design and development of a multilingual documentation and decision support system for the diagnosis of acute abdominal pain. The work was performed within a multi-national COPERNICUS European concerted action dealing with information technology for quality assurance in acute abdominal pain in Europe (EURO-AAP, 555). The software engineering was based on object-oriented analysis, design and programming. The program covers three modules: a data dictionary, a documentation program and a knowledge-based system. National versions of the software were provided and introduced into 16 centers in Central and Eastern Europe. A prospective data collection was performed in which 4020 patients were recruited. The software design proved to be very efficient and useful for the development of multilingual software.
Software for Teaching Physiology and Biophysics.
ERIC Educational Resources Information Center
Weiss, Thomas F.; And Others
1992-01-01
Describes a software library developed to teach biophysics and physiology undergraduates that includes software on (1) the Hodgkin-Huxley model for excitation of action potentials in electrically excitable cells; (2) a random-walk model of diffusion; (3) single voltage-gated ion channels; (4) steady-state chemically mediated transport; and (5)…
Modernization of software quality assurance
NASA Technical Reports Server (NTRS)
Bhaumik, Gokul
1988-01-01
Customer satisfaction depends not only on functional performance but also on the quality characteristics of the software product. An examination of this quality aspect of software products provides a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following points, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition of a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree to which an identifiable set of quality attributes is present in all software systems and their products. This paper explains how this definition was developed and how it is used.
A Framework for Teaching Software Development Methods
ERIC Educational Resources Information Center
Dubinsky, Yael; Hazzan, Orit
2005-01-01
This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…
NASA Astrophysics Data System (ADS)
Downs, R. R.; Lenhardt, W. C.; Robinson, E.
2014-12-01
Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and, therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributors, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and opportunities that science software presents for the sustainability of science. To gain insight into the practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members was asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper will present an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.
2014-10-01
designed an Internet-based and mobile application (software) to assist with the following domains pertinent to diabetes self-management: 1... management that provides education, reminders, and support. The new tool is an Internet-based and mobile application (software), now called Tracking... is mobile, provides decision support with actionable options, and is based on user input, will enhance diabetes self-care, improve glycemic control
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Chang, Man-Ling; Shih, Ching-Tien
2010-01-01
This study assessed whether two persons with multiple disabilities would be able to control environmental stimulation using limb action with a Nintendo Wii Remote Controller and a newly developed limb action detection program (LADP, i.e., a new software program that turns a Wii Remote Controller into a precise limb action detector). This study was…
Development of a Unix/VME data acquisition system
NASA Astrophysics Data System (ADS)
Miller, M. C.; Ahern, S.; Clark, S. M.
1992-01-01
The current status of a Unix-based VME data acquisition development project is described. It is planned to use existing Fortran data collection software to drive the existing CAMAC electronics via a VME CAMAC branch driver card and associated Daresbury Unix driving software. The first usable Unix driver has been written and produces single-action CAMAC cycles from test software. The data acquisition code has been implemented in test mode under Unix with few problems and effort is now being directed toward finalizing calls to the CAMAC-driving software and ultimate evaluation of the complete system.
NASA Astrophysics Data System (ADS)
Presti, Giovambattista; Messina, Concetta; Mongelli, Francesca; Sireci, Maria Josè; Collotta, Mario
2017-11-01
Relational Frame Theory (RFT) is a post-Skinnerian theory of language and cognition based on more than thirty years of basic and applied research. It defines language and cognitive skills as an operant repertoire of responses to arbitrarily related stimuli that is specific, as far as is currently known, to the human species. RFT has proved useful in addressing cognitive barriers to human action in psychotherapy and in improving children's skills in reading, IQ testing, and metaphoric and categorical repertoires. We present a frame of action in which RFT can be used in programming software to help autistic children develop cognitive skills within a developmental vision.
WILDFIRE IGNITION RESISTANCE ESTIMATOR WIZARD SOFTWARE DEVELOPMENT REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, M.; Robinson, C.; Gupta, N.
2012-10-10
This report describes the development of a software tool, entitled “WildFire Ignition Resistance Estimator Wizard” (WildFIRE Wizard, Version 2.10). This software was developed within the Wildfire Ignition Resistant Home Design (WIRHD) program, sponsored by the U.S. Department of Homeland Security, Science and Technology Directorate, Infrastructure Protection & Disaster Management Division. WildFIRE Wizard is a tool that enables homeowners to take preventive actions that will reduce their home’s vulnerability to wildfire ignition sources (i.e., embers, radiant heat, and direct flame impingement) well in advance of a wildfire event. This report describes the development of the software, its operation, its technical basis and calculations, and steps taken to verify its performance.
ERIC Educational Resources Information Center
Argyropoulos, Vassilios; Nikolaraizi, Magda; Tsiakali, Thomai; Kountrias, Polychronis; Koutsogiorgou, Sofia-Marina; Martos, Aineias
2014-01-01
This paper highlights the framework and discusses the results of an action research project which aimed to facilitate the adoption of assistive technology devices and specialized software by teachers of students with visual impairment via a digital educational game, developed specifically for this project. The persons involved in this…
Proposal for a CLIPS software library
NASA Technical Reports Server (NTRS)
Porter, Ken
1991-01-01
This paper is a proposal to create a software library for the C Language Integrated Production System (CLIPS) expert system shell developed by NASA. Many innovative ideas for extending CLIPS were presented at the First CLIPS Users Conference, including useful user and database interfaces. CLIPS developers would benefit from a software library of reusable code. The CLIPS Users Group should establish a software library; a course of action to make that happen is proposed. Open discussion to revise this library concept is essential, since only a group effort is likely to succeed. A response form intended to solicit opinions and support from the CLIPS community is included.
ActionMap: A web-based software that automates loci assignments to framework maps.
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-07-01
Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software tool that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).
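ActionMap's strategy of scripting an existing mapping program rather than reimplementing linkage computation can be sketched as generating a batch command file per new locus, keeping the framework fixed. The command names below follow MapMaker's batch style but are illustrative, not taken from the ActionMap source:

```python
def mapmaker_commands(framework_loci, new_locus):
    """Build a batch command script that tests one new locus against a
    fixed framework map without retaining it in the map.

    framework_loci: ordered marker names of the fixed framework.
    new_locus: the marker (e.g. an EST) to place against that framework.
    """
    lines = ["sequence " + " ".join(framework_loci)]  # fix framework order
    lines.append(f"try {new_locus}")                  # test placement only
    return "\n".join(lines)


def batch_scripts(framework_loci, new_loci):
    """One script per locus, so hundreds of loci run as independent jobs."""
    return {locus: mapmaker_commands(framework_loci, locus)
            for locus in new_loci}
```

Each script is then fed to the external mapping program in turn; because every run starts from the same fixed sequence, no new marker ever perturbs the framework.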
Pawlik, Aleksandra; van Gelder, Celia W.G.; Nenadic, Aleksandra; Palagi, Patricia M.; Korpelainen, Eija; Lijnzaad, Philip; Marek, Diana; Sansone, Susanna-Assunta; Hancock, John; Goble, Carole
2017-01-01
Quality training in computational skills for life scientists is essential to allow them to deliver robust, reproducible and cutting-edge research. A pan-European bioinformatics programme, ELIXIR, has adopted a well-established and progressive programme of computational lab and data skills training from Software and Data Carpentry, aimed at increasing the number of skilled life scientists and building a sustainable training community in this field. This article describes the Pilot action, which introduced the Carpentry training model to the ELIXIR community. PMID:28781745
Silva, Kenya de Lima; Évora, Yolanda Dora Martinez; Cintra, Camila Santana Justo
2015-01-01
Objective: to report the development of a software to support decision-making for the selection of nursing diagnoses and interventions for children and adolescents, based on the nomenclature of nursing diagnoses, outcomes and interventions of a university hospital in Paraiba. Method: a methodological applied study based on software engineering, as proposed by Pressman, developed in three cycles, namely: flow chart construction, development of the navigation interface, and construction of functional expressions and programming development. Result: the software consists of administrative and nursing process screens. The assessment is automatically selected according to age group, the nursing diagnoses are suggested by the system after information is inserted, and can be indicated by the nurse. The interventions for the chosen diagnosis are selected by structuring the care plan. Conclusion: the development of this tool used to document the nursing actions will contribute to decision-making and quality of care. PMID:26487144
ERIC Educational Resources Information Center
Christelle, Andrea; Dillard, Kara N.; Lindaman, Kara
2018-01-01
The Common Ground for Action (CGA) online deliberation platform is a dynamic tool designed to encourage diverse group members to identify collective responses to deeply controversial or "wicked" public problems that have no simple solution. The program promotes authentic deliberation, while minimizing the tactics of horse-trading and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branson, Donald
The KCNSC Automated RAIL (Rolling Action Item List) system provides an electronic platform to manage and escalate rolling action items within a business and manufacturing environment at Honeywell. The software enables a tiered approach to issue management in which issues are escalated up a management chain based on team input and compared to business metrics. The software manages action items at different levels of the organization and allows all users to discuss action items concurrently. In addition, the software drives accountability through timely emails and proper visibility during team meetings.
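The tiered escalation described here can be illustrated with a toy rule that promotes an item one tier each time its age passes a deadline. The thresholds are hypothetical; the actual KCNSC system escalates based on team input compared against business metrics, not fixed ages:

```python
def escalation_tier(days_open, thresholds=(7, 14, 30)):
    """Map an action item's age to a management tier.

    Tier 0 is the owning team; each threshold the item's age exceeds
    moves it one level up the management chain. Thresholds (in days)
    are illustrative assumptions for this sketch.
    """
    tier = 0
    for limit in thresholds:
        if days_open > limit:
            tier += 1
    return tier
```

An item three days old stays with its team; one open for six weeks has climbed all three tiers and would be visible at the top-level meeting.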
Risk Mitigation for the Development of the New Ariane 5 On-Board Computer
NASA Astrophysics Data System (ADS)
Stransky, Arnaud; Chevalier, Laurent; Dubuc, Francois; Conde-Reis, Alain; Ledoux, Alain; Miramont, Philippe; Johansson, Leif
2010-08-01
In the frame of Ariane 5 production, some equipment will become obsolete and needs to be redesigned and redeveloped. This is the case for the On-Board Computer, which has to be completely redesigned and re-qualified by RUAG Space, as well as all its on-board software and associated development tools by ASTRIUM ST. This paper presents this obsolescence treatment, which started in 2007 under an ESA contract, in the frame of the ACEP and ARTA accompaniment programmes, and is very critical in technical terms but also from a schedule point of view. It gives the context and overall development plan, and details the risk mitigation actions agreed with ESA, especially those related to the development of the input/output ASIC, as well as the on-board software porting and revalidation strategy. The efficiency of these risk mitigation actions has been demonstrated by the achieved schedule; this development constitutes an up-to-date case study of good practices, including experience reports and feedback for other future developments.
Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks
NASA Technical Reports Server (NTRS)
Anderson, Mark G.
2011-01-01
This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange (APEX) Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools, and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.
Modeling and managing risk early in software development
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.
1993-01-01
In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
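The paper's contribution is its own automated modeling technique; the logistic-regression baseline it compares against can be sketched as below. The coefficients and the 0.5 cutoff are illustrative assumptions for the sketch, not values fitted in the study:

```python
import math


def risk_probability(metrics, coefficients, intercept):
    """Logistic model: p = 1 / (1 + exp(-(b0 + sum(bi * xi)))).

    metrics: early-collectable measures for one component (e.g. size,
    interface counts); coefficients/intercept: a fitted model's parameters.
    """
    z = intercept + sum(b * x for b, x in zip(coefficients, metrics))
    return 1.0 / (1.0 + math.exp(-z))


def is_high_risk(metrics, coefficients, intercept, cutoff=0.5):
    """Classify a component as 'high risk' when its predicted fault
    probability exceeds the cutoff, so remedial action can be targeted."""
    return risk_probability(metrics, coefficients, intercept) > cutoff
```

With metrics collected early in development, components flagged by such a model can be scheduled for extra inspection before they become expensive to fix.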
Product Definition Data Interface (PDDI) Product Specification
1991-07-01
...syntax of the language gives a precise specification of the data without interpretation of it. M - Constituent Read Block. CSECT - Control Section, the... to conform to the PDDI Access Software's internal data representation so that it may be further processed. JCL - Job Control Language, the IBM language... software development and life cycle phases. QUALITY CONTROL - The planned and systematic application of all actions (management/technical) necessary to...
Information and communication systems for the assistance of carers based on ACTION.
Kraner, M; Emery, D; Cvetkovic, S R; Procter, P; Smythe, C
1999-01-01
Recent advances in telecommunication technologies allow the design of information and communication systems for people who are caring for others in the home, whether as family members or as professionals in health or community centres. The present paper analyses and classifies the information flow and maps it to an information life cycle, which governs the design of the deployed hardware, software and data structure. This is based on the initial findings of ACTION (assisting carers using telematics interventions to meet older persons' needs), a European Union-funded project. The proposed information architecture discusses different designs such as centralized or decentralized Web and client-server solutions. A user interface is developed reflecting the special requirements of the targeted user group, which influences the functionality and design of the software, data architecture and the integrated communication system using video-conferencing. ACTION has engineered a system using plain Web technology based on HTML, extended with JavaScript and ActiveX, and a software switch enabling the integration of different types of videoconferencing and other applications, providing manufacturer independence.
Applications of Formal Methods to Specification and Safety of Avionics Software
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Guaspari, David; Humenn, Polar
1996-01-01
This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria for decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
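Part of what makes decision tables attractive for formal specification is that completeness can be checked mechanically, in the spirit of a tool like Tablewise. A minimal sketch follows; encoding each rule as a tuple of True/False/None (don't care) is an illustrative choice, not the report's formalism:

```python
from itertools import product


def uncovered_combinations(table, n_conditions):
    """Return condition combinations matched by no rule in the table.

    table: list of rules, each a tuple with one entry per condition:
    True, False, or None (don't care). An empty result means the table
    prescribes an action for every possible situation.
    """
    missing = []
    for combo in product([True, False], repeat=n_conditions):
        matched = any(
            all(c is None or c == v for c, v in zip(rule, combo))
            for rule in table
        )
        if not matched:
            missing.append(combo)
    return missing
```

For a table whose rules are "condition 1 true" and "condition 1 false and condition 2 true", the check reports that the case (False, False) has no prescribed action, exactly the kind of gap a formal review should surface.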
Software Fault Tolerance: A Tutorial
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2000-01-01
Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
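The N-version idea summarized above, in which independently developed versions run on the same input and a voter adjudicates by majority, can be sketched as follows. This is a minimal illustration; practical voters must also tolerate inexact, structured, or non-hashable outputs:

```python
from collections import Counter


def n_version_vote(versions, inputs):
    """Run independently developed versions and return the majority output.

    versions: callables implementing the same specification; inputs: the
    argument tuple passed to each. Raises RuntimeError when no strict
    majority exists, i.e. the voter cannot adjudicate.
    """
    outputs = [version(*inputs) for version in versions]
    value, count = Counter(outputs).most_common(1)[0]
    if count * 2 <= len(outputs):
        raise RuntimeError("no majority among versions")
    return value
```

The safety argument rests on the assumption the text states: versions built differently should fail differently, so a single faulty version is outvoted by the others.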
A Validation Metrics Framework for Safety-Critical Software-Intensive Systems
2009-03-01
so does its definition, tools, and techniques, including means for measuring the validation activity, its outputs, and impact on development... independent of the SDLP. When considering the above SDLPs from the safety engineering team's perspective, there are also large impacts on the way... impact. Interpretation of any actionable metric data will need to be undertaken in the context of the SDLP. 2. Safety Input: The software safety
Diagnosis and Prognosis of Weapon Systems
NASA Technical Reports Server (NTRS)
Nolan, Mary; Catania, Rebecca; deMare, Gregory
2005-01-01
The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.
ERIC Educational Resources Information Center
Casey, Gail
2013-01-01
This article discusses the development of the online spaces that were used to create a learning framework: a student-centred framework that combined face-to-face teaching with online social and participatory media. The author, as part of her Doctoral research study, used action research as a mechanism for continual improvement as she redesigned…
NASA Technical Reports Server (NTRS)
1997-01-01
Coryphaeus Software, founded in 1989 by former NASA electronics engineer Steve Lakowske, creates real-time 3D software. Designer's Workbench, the company's flagship product, is a modeling and simulation tool for the development of both static and dynamic 3D databases. Other products soon followed. Activation, specifically designed for game developers, allows developers to play and test 3D games before they commit to a target platform. Game publishers can shorten development time and prove the "playability" of a title, maximizing their chances of introducing a smash hit. Another product, EasyT, lets users create massive, realistic representations of Earth terrains that can be viewed and traversed in real time. Finally, EasyScene software controls the actions among interactive objects within a virtual world. Coryphaeus products are used on Silicon Graphics workstations and supercomputers to simulate real-world performance in synthetic environments. Customers include aerospace, aviation, architectural and engineering firms, game developers, and the entertainment industry.
Munoz, Maria Isabel; Bouldi, Nadia; Barcellini, Flore; Nascimento, Adelaide
2012-01-01
This communication deals with the involvement of ergonomists in a research-action design process of a software platform in radiotherapy. The goal of the design project is to enhance patient safety by designing workflow software that supports cooperation between the professionals producing treatment in radiotherapy. The general framework of our approach is the ergonomics management of a design process, which is based on activity analysis and grounded in participatory design. Two fields are involved in the present action: a design environment, namely a participatory design process that involves software designers, caregivers as future users, and ergonomists; and a reference real work setting in radiotherapy. Observations, semi-structured interviews, and participatory workshops allowed the characterization of activity in radiotherapy, addressing the uses of cooperative tools, sources of variability, and non-ruled strategies to manage the variability of situations. This production of knowledge about work seeks to enhance the articulation between technocentric and anthropocentric approaches and helps clarify design requirements. One aim of this research-action is to develop a framework to define the parameters of the workflow tool and the conditions of its deployment.
REVEAL: Software Documentation and Platform Migration
NASA Technical Reports Server (NTRS)
Wilson, Michael A.; Veibell, Victoir T.
2011-01-01
The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This presentation specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as an overview of the content of the final report for that internship.
Hysteretic Models Considering Axial-Shear-Flexure Interaction
NASA Astrophysics Data System (ADS)
Ceresa, Paola; Negrisoli, Giorgio
2017-10-01
Most of the existing numerical models implemented in finite element (FE) software, at the current state of the art, are not capable of describing, with sufficient reliability, the interaction between axial, shear, and flexural actions under cyclic loading (e.g. seismic actions), neglecting effects that are crucial for predicting the nature of the collapse of reinforced concrete (RC) structural elements. Just a few existing 3D volume models or fibre beam models can yield a fairly accurate response, but they are still computationally inefficient for typical applications in earthquake engineering and are also characterized by very complex formulations. Thus, discrete models with lumped plasticity hinges may be the preferred choice for modelling the hysteretic behaviour due to cyclic loading conditions, in particular with reference to implementation in a commercial software package. These considerations led to this research work, focused on the development of a model for RC beam-column elements able to consider degradation effects and interaction between the actions under cyclic loading conditions. In order to develop a model for a general 3D discrete hinge element able to take into account the axial-shear-flexural interaction, it is necessary to provide an implementation which involves a predictor-corrector iterative scheme. Furthermore, a reliable constitutive model based on damage plasticity theory is formulated and implemented for its numerical validation. The aim of this research work is to provide the formulation of a numerical model which will allow implementation within a FE software package for nonlinear cyclic analysis of RC structural members. The developed model accounts for stiffness degradation effects and stiffness recovery on loading reversal.
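The predictor-corrector iteration mentioned in the abstract can be illustrated with a minimal, hypothetical sketch: a 1D bilinear hinge with isotropic hardening, not the authors' axial-shear-flexure model. All material constants are arbitrary placeholders.

```python
def bilinear_step(eps, state, E=200.0, H=20.0, sy=10.0):
    """One strain-driven step of a 1D bilinear hysteretic hinge.

    state = (plastic strain, accumulated plastic strain)."""
    eps_p, alpha = state
    sigma_trial = E * (eps - eps_p)              # predictor: assume the step is elastic
    f = abs(sigma_trial) - (sy + H * alpha)      # yield function with isotropic hardening
    if f <= 0.0:
        return sigma_trial, (eps_p, alpha)       # predictor accepted, no plastic flow
    dg = f / (E + H)                             # corrector: plastic multiplier (closed form in 1D)
    sign = 1.0 if sigma_trial > 0.0 else -1.0
    return sigma_trial - E * dg * sign, (eps_p + dg * sign, alpha + dg)
```

In a 3D element with axial-shear-flexure interaction the corrector has no closed form and must itself be iterated, which is why the abstract stresses the iterative scheme.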
NASA Technical Reports Server (NTRS)
Snell, William H.; Turner, Anne M.; Gifford, Luther; Stites, William
2010-01-01
A quality system database (QSD), and software to administer the database, were developed to support recording of administrative nonconformance activities that involve requirements for documentation of corrective and/or preventive actions, which can include ISO 9000 internal quality audits and customer complaints.
Bionic models for identification of biological systems
NASA Astrophysics Data System (ADS)
Gerget, O. M.
2017-01-01
This article proposes a clinical decision support system that processes biomedical data. For this purpose a bionic model has been designed based on neural networks, genetic algorithms, and immune systems. The developed system has been tested on data from pregnant women. The paper focuses on an approach that enables selection of control actions to minimize the risk of an adverse outcome. The control actions (hyperparameters of a new type) are further used as an additional input signal. Their values are defined by a hyperparameter optimization method. Software developed in Python is briefly described.
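Selecting a control action by hyperparameter optimization can be illustrated with a minimal random-search sketch. The risk function, search range, and all names here are invented stand-ins, not the bionic model described in the article.

```python
import random

def risk(model_output, action):
    # Stand-in for the model's predicted risk of an adverse outcome
    # as a function of the chosen control action.
    return (model_output - action) ** 2 + 0.1 * abs(action)

def optimize_action(model_output, candidates=200, seed=0):
    """Random search: sample candidate actions, keep the lowest-risk one."""
    rng = random.Random(seed)
    best_a, best_r = None, float("inf")
    for _ in range(candidates):
        a = rng.uniform(-1.0, 1.0)          # sample a candidate control action
        r = risk(model_output, a)
        if r < best_r:
            best_a, best_r = a, r
    return best_a, best_r
```

The selected action would then be fed back to the model as the additional input signal the abstract mentions.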
CASE tools and UML: state of the ART.
Agarwal, S
2001-05-01
With the increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of the art in computer-aided software engineering (CASE) tools and the unified modeling language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages like cheaper, shorter, and more efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.
Analysis of Wastewater and Water System Renewal Decision-Making Tools and Approaches
With regard to the development of software for decision support for pipeline renewal, most of the attention to date has been paid to the development of asset management models, which help an owner decide which portions of a system to prioritize for needed actions. There has not ...
Gaps of Decision Support Models for Pipeline Renewal and Recommendations for Improvement
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less w...
GAPS OF DECISION SUPPORT MODELS FOR PIPELINE RENEWAL AND RECOMMENDATIONS FOR IMPROVEMENT (SLIDE)
In terms of the development of software for decision support for pipeline renewal, more attention to date has been paid to the development of asset management models that help an owner decide on which portions of a system to prioritize needed actions. There has been much less wor...
Software for Probabilistic Risk Reduction
NASA Technical Reports Server (NTRS)
Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto
2004-01-01
A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
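The optimization the abstract describes, selecting mitigation actions to maximize the expectation of success while remaining within budget, can be sketched as a small 0/1 knapsack. This is an illustrative stand-in, not the DDP implementation; action names, costs, and benefit values are invented.

```python
def plan_mitigations(actions, budget):
    """actions: list of (name, cost, expected_benefit); budget: integer cost cap."""
    best = {0: (0.0, [])}                           # total spend -> (benefit, chosen names)
    for name, cost, gain in actions:
        # Snapshot current states so each action is selected at most once.
        for spend, (benefit, chosen) in list(best.items()):
            new_spend = spend + cost
            if new_spend > budget:
                continue                             # action would exceed the budget
            candidate = (benefit + gain, chosen + [name])
            if candidate[0] > best.get(new_spend, (-1.0, []))[0]:
                best[new_spend] = candidate
    return max(best.values(), key=lambda t: t[0])   # (best benefit, actions chosen)
```

A real risk model would also account for interactions between actions and between requirements, which this independent-benefit sketch ignores.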
Strengthening Interprofessional Requirements Engineering Through Action Sheets: A Pilot Study.
Kunz, Aline; Pohlmann, Sabrina; Heinze, Oliver; Brandner, Antje; Reiß, Christina; Kamradt, Martina; Szecsenyi, Joachim; Ose, Dominik
2016-10-18
The importance of information and communication technology for healthcare is steadily growing. Newly developed tools are addressing different user groups: physicians, other health care professionals, social workers, patients, and family members. Since often many different actors with different expertise and perspectives are involved in the development process, it can be a challenge to integrate the user-reported requirements of those heterogeneous user groups. Nevertheless, the understanding and consideration of user requirements is the prerequisite for building a feasible technical solution. In the course of the presented project it proved to be difficult to derive clear action steps and priorities for the development process out of the primary requirements compilation. Even though a regular exchange between the involved teams took place, there was a lack of a common language. The objective of this paper is to show how the already existing requirements catalog was subdivided into specific, prioritized, and coherent working packages and how the cooperation of multiple interprofessional teams within one development project was reorganized at the same time. In the case presented, the manner of cooperation was reorganized and a new instrument called an Action Sheet was implemented. This paper introduces the newly developed methodology, which was meant to smooth the development of a user-centered software product and to restructure interprofessional cooperation. There were 10 focus groups in which views of patients with colorectal cancer, physicians, and other health care professionals were collected in order to create a requirements catalog for developing a personal electronic health record. Data were audio- and videotaped, transcribed verbatim, and thematically analyzed.
Afterwards, the requirements catalog was reorganized in the form of Action Sheets, which supported the interprofessional cooperation during the development of a personal electronic health record for the Rhine-Neckar region. In order to improve the interprofessional cooperation, the idea arose to align the requirements arising from the implementation project with the method of software development applied by the technical development team. This was realized by restructuring the original requirements set in a standardized way and under continuous adjustment between both teams. As a result, not only the way of displaying the user demands but also the interprofessional cooperation was steered in a new direction. User demands must be taken into account from the very beginning of the development process, but it is not always obvious how to bring them together with IT know-how and knowledge of the contextual factors of the health care system. Action Sheets seem to be an effective tool for making the software development process more tangible and convertible for all connected disciplines. Furthermore, the working method turned out to support the interprofessional exchange of ideas.
Strengthening Interprofessional Requirements Engineering Through Action Sheets: A Pilot Study
Pohlmann, Sabrina; Heinze, Oliver; Brandner, Antje; Reiß, Christina; Kamradt, Martina; Szecsenyi, Joachim; Ose, Dominik
2016-01-01
Background The importance of information and communication technology for healthcare is steadily growing. Newly developed tools are addressing different user groups: physicians, other health care professionals, social workers, patients, and family members. Since often many different actors with different expertise and perspectives are involved in the development process, it can be a challenge to integrate the user-reported requirements of those heterogeneous user groups. Nevertheless, the understanding and consideration of user requirements is the prerequisite for building a feasible technical solution. In the course of the presented project it proved to be difficult to derive clear action steps and priorities for the development process out of the primary requirements compilation. Even though a regular exchange between the involved teams took place, there was a lack of a common language. Objective The objective of this paper is to show how the already existing requirements catalog was subdivided into specific, prioritized, and coherent working packages and how the cooperation of multiple interprofessional teams within one development project was reorganized at the same time. In the case presented, the manner of cooperation was reorganized and a new instrument called an Action Sheet was implemented. This paper introduces the newly developed methodology, which was meant to smooth the development of a user-centered software product and to restructure interprofessional cooperation. Methods There were 10 focus groups in which views of patients with colorectal cancer, physicians, and other health care professionals were collected in order to create a requirements catalog for developing a personal electronic health record. Data were audio- and videotaped, transcribed verbatim, and thematically analyzed.
Afterwards, the requirements catalog was reorganized in the form of Action Sheets, which supported the interprofessional cooperation during the development of a personal electronic health record for the Rhine-Neckar region. Results In order to improve the interprofessional cooperation, the idea arose to align the requirements arising from the implementation project with the method of software development applied by the technical development team. This was realized by restructuring the original requirements set in a standardized way and under continuous adjustment between both teams. As a result, not only the way of displaying the user demands but also the interprofessional cooperation was steered in a new direction. Conclusions User demands must be taken into account from the very beginning of the development process, but it is not always obvious how to bring them together with IT know-how and knowledge of the contextual factors of the health care system. Action Sheets seem to be an effective tool for making the software development process more tangible and convertible for all connected disciplines. Furthermore, the working method turned out to support the interprofessional exchange of ideas. PMID:27756716
Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L
2018-01-01
Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. 
Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922
Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L
2018-04-13
Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. 
Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. We used test-driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the "executable requirements" are shown prior to building the CDS alert, during build, and after successful build. Automated acceptance test-driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test-driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. ©Mujeeb A Basit, Krystal L Baldwin, Vaishnavi Kannan, Emily L Flahaven, Cassandra J Parks, Jason M Ott, Duwayne L Willett. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.04.2018.
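The "executable requirements" idea, clinically vetted test tables run automatically against the rule logic, can be sketched outside FitNesse. The rule, fields, and table below are invented simplifications of the study's swallowing-screen advisory, not its actual EHR configuration.

```python
def advisory_fires(setting, suspected_stroke, route):
    """Rule under test: alert ED nurses before oral meds for suspected stroke."""
    return setting == "ED" and suspected_stroke and route == "oral"

# Each row pairs inputs with the clinically vetted expected outcome,
# mirroring one row of a spreadsheet-style acceptance-test table.
TEST_TABLE = [
    {"setting": "ED",     "suspected_stroke": True,  "route": "oral", "expected": True},
    {"setting": "ED",     "suspected_stroke": True,  "route": "IV",   "expected": False},
    {"setting": "ED",     "suspected_stroke": False, "route": "oral", "expected": False},
    {"setting": "Clinic", "suspected_stroke": True,  "route": "oral", "expected": False},
]

def run_table(table):
    """Execute every row; return the rows where actual and expected disagree."""
    return [row for row in table
            if advisory_fires(row["setting"], row["suspected_stroke"],
                              row["route"]) != row["expected"]]
```

Run before the build, every row fails (the requirement); run after, an empty failure list is the acceptance criterion, and the same table then serves as a regression test.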
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
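The GQM structure underlying the metric setting procedure can be sketched as a small data structure: a goal refined into questions, each answered by concrete metrics. The goal, questions, and metric names here are invented examples, not the SQMAR or NASA/SEL metrics.

```python
# A GQM plan: one measurement goal, the questions that operationalize it,
# and the metrics that answer each question.
gqm = {
    "goal": "Improve reliability of the release process",
    "questions": {
        "Is defect density decreasing?": [
            "defects per KLOC per release",
        ],
        "Are fixes introduced without regressions?": [
            "regression failures per fix",
            "test coverage of changed code",
        ],
    },
}

def metrics_for(plan):
    """Flatten a GQM plan into the list of metrics to collect."""
    return [m for ms in plan["questions"].values() for m in ms]
```

Deriving the metric list from the goal, rather than collecting whatever is easy to measure, is the point of the paradigm.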
NASA Technical Reports Server (NTRS)
1989-01-01
C Language Integrated Production System (CLIPS) is a software shell for developing expert systems, designed to allow research and development of artificial intelligence on conventional computers. Originally developed by Johnson Space Center, it enables highly efficient pattern matching. A collection of conditions and actions to be taken if the conditions are met is built into a rule network. Additional pertinent facts are matched against the rule network. Using the program, E.I. DuPont de Nemours & Co. is monitoring chemical production machines; California Polytechnic State University is investigating artificial intelligence in computer aided design; Mentor Graphics has built a new Circuit Synthesis system; and Brooke and Brooke, a law firm, can determine which facts from a file are most important.
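The rule-network matching CLIPS automates can be illustrated with a naive forward-chaining sketch in Python. The facts and rules are invented; real CLIPS rules are written in its own LISP-like syntax and matched with the far more efficient Rete algorithm.

```python
def forward_chain(facts, rules):
    """rules: list of (condition set, consequent fact); fire until nothing new."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, consequent in rules:
            # A rule fires when all its conditions are present as facts.
            if conditions <= facts and consequent not in facts:
                facts.add(consequent)       # assert the rule's consequent
                changed = True
    return facts

# Invented example: a chained pair of monitoring rules.
RULES = [
    ({"temp-high", "pressure-high"}, "shutdown-line"),
    ({"shutdown-line"}, "notify-operator"),
]
```

The second rule fires only because the first one asserted a new fact, which is the chaining behavior the rule network implements.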
An Extensible, User-Modifiable Framework for Planning Activities
NASA Technical Reports Server (NTRS)
Joshing, Joseph C.; Abramyan, Lucy; Mickelson, Megan C.; Wallick, Michael N.; Kurien, James A.; Crockett, Thomasa M.; Powell, Mark W.; Pyrzak, Guy; Aghevli, Arash
2013-01-01
This software provides a development framework that allows planning activities for the Mars Science Laboratory rover to be altered at any time, based on changes of the Activity Dictionary. The Activity Dictionary contains the definition of all activities that can be carried out by a particular asset (robotic or human). These definitions (and combinations of these definitions) are used by mission planners to give a daily plan of what a mission should do. During the development and course of the mission, the Activity Dictionary and actions that are going to be carried out will often be changed. Previously, such changes would require a change to the software and redeployment. Now, the Activity Dictionary authors are able to customize activity definitions, parameters, and resource usage without requiring redeployment. This software provides developers and end users the ability to modify the behavior of automatically generated activities using a script. This allows changes to the software behavior without incurring the burden of redeployment. This software is currently being used for the Mars Science Laboratory, and is in the process of being integrated into the LADEE (Lunar Atmosphere and Dust Environment Explorer) mission, as well as the International Space Station.
TBell: A mathematical tool for analyzing decision tables
NASA Technical Reports Server (NTRS)
Hoover, D. N.; Chen, Zewei
1994-01-01
This paper describes the development of mathematical theory and software to analyze specifications that are developed using decision tables. A decision table is a tabular format for specifying a complex set of rules that chooses one of a number of alternative actions. The report also describes a prototype tool, called TBell, that automates certain types of analysis.
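The kind of analysis such a tool performs can be sketched for a small boolean decision table: checking that every combination of condition values selects exactly one action. The table, conditions, and actions below are invented, and this brute-force check is only a hypothetical illustration of the approach, not TBell itself.

```python
from itertools import product

# Each rule: (condition pattern, action); None in a pattern means "don't care".
TABLE = [
    ((True,  None),  "retry"),
    ((False, True),  "abort"),
    ((False, False), "log"),
]

def matches(pattern, inputs):
    return all(p is None or p == v for p, v in zip(pattern, inputs))

def analyze(table, n_conditions):
    """Return (uncovered inputs, inputs matched by rules with different actions)."""
    missing, conflicts = [], []
    for inputs in product([True, False], repeat=n_conditions):
        actions = {a for pat, a in table if matches(pat, inputs)}
        if not actions:
            missing.append(inputs)          # incompleteness: no rule applies
        elif len(actions) > 1:
            conflicts.append(inputs)        # inconsistency: rules disagree
    return missing, conflicts
```

A mathematical tool would establish the same properties symbolically rather than by enumeration, which matters once tables have many conditions.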
Shih, Ching-Hsiang; Chang, Man-Ling; Shih, Ching-Tien
2010-01-01
This study assessed whether two persons with multiple disabilities would be able to control environmental stimulation using limb action with a Nintendo Wii Remote Controller and a newly developed limb action detection program (LADP, i.e., a new software program that turns a Wii Remote Controller into a precise limb action detector). This study was carried out according to an ABAB sequence in which A represented baseline and B represented intervention phases. Data showed that both participants significantly increased their target response, thus increasing the level of environmental stimulation by activating the control system through limb action, during the intervention phases. Practical and developmental implications of the findings are discussed. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
Carter, D
1996-01-01
The Canada Center for Remote Sensing, in collaboration with the International Development Research Center, is developing an electronic atlas of Agenda 21, the Earth Summit action plan. This initiative promises to ease access for researchers and practitioners to the Agenda 21 action plan, and in its pilot study will focus on biological diversity. Known as the Biodiversity Volume of the Electronic Atlas of Agenda 21 (ELADA 21), this computer software technology will contain information and data on biodiversity, genetics, species, ecosystems, and ecosystem services. Specifically, it includes several country studies and documentation, as well as interactive scenarios linking biodiversity to socioeconomic issues. ELADA 21 will empower countries and agencies to report on and better manage biodiversity and related information. The atlas can be used to develop and test various scenarios and to exchange information within the South and with industrialized countries. At present, ELADA 21 has generated interest and is becoming more available in the market. The challenge confronting the project team, however, is to find the atlas a permanent home: a country or agency willing to assume responsibility for maintaining, upgrading, and updating the software.
Understanding How the "Open" of Open Source Software (OSS) Will Improve Global Health Security.
Hahn, Erin; Blazes, David; Lewis, Sheri
2016-01-01
Improving global health security will require bold action in all corners of the world, particularly in developing settings, where poverty often contributes to an increase in emerging infectious diseases. In order to mitigate the impact of emerging pandemic threats, enhanced disease surveillance is needed to improve early detection and rapid response to outbreaks. However, the technology to facilitate this surveillance is often unattainable because of high costs, software and hardware maintenance needs, limited technical competence among public health officials, and internet connectivity challenges experienced in the field. One potential solution is to leverage open source software, a concept that is unfortunately often misunderstood. This article describes the principles and characteristics of open source software and how it may be applied to solve global health security challenges.
REVEAL: Software Documentation and Platform Migration
NASA Technical Reports Server (NTRS)
Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.
2008-01-01
The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.
Toward Baseline Software Anomalies in NASA Missions
NASA Technical Reports Server (NTRS)
Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.
2012-01-01
In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.
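The baselining described above amounts to tallying anomaly categories and the corrective actions attached to software anomalies. A minimal sketch of that tally, using hypothetical record fields (`category`, `corrective_action`) that stand in for whatever schema the actual anomaly database uses:

```python
from collections import Counter

def anomaly_baselines(anomalies):
    """Compute the share of each anomaly category and the breakdown of
    corrective actions taken for software anomalies (illustrative only)."""
    categories = Counter(a["category"] for a in anomalies)
    total = sum(categories.values())
    shares = {cat: n / total for cat, n in categories.items()}
    sw_actions = Counter(
        a["corrective_action"] for a in anomalies if a["category"] == "software"
    )
    return shares, sw_actions

# Hypothetical sample records, standing in for the 14,500 mission anomalies.
sample = [
    {"category": "software", "corrective_action": "code patch"},
    {"category": "software", "corrective_action": "operational workaround"},
    {"category": "hardware", "corrective_action": "unit swap"},
    {"category": "software", "corrective_action": "code patch"},
]
shares, actions = anomaly_baselines(sample)
print(shares["software"])     # 0.75
print(actions["code patch"])  # 2
```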
A Mechanism of Modeling and Verification for SaaS Customization Based on TLA
NASA Astrophysics Data System (ADS)
Luan, Shuai; Shi, Yuliang; Wang, Haiyang
With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service mode. Customization actions in SaaS are usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to ensure that each step in customization will not cause unpredictable influence on the system and will follow the related rules defined by the SaaS provider.
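The paper's verification idea, checking that each customization step respects dependency relationships and provider rules, can be sketched without TLA machinery. The sketch below is a plain dependency-order check with hypothetical step names; the actual algorithm in the paper operates on TLA specifications, not Python data:

```python
def verify_customization(steps, dependencies, forbidden):
    """Check a SaaS customization sequence: every step's prerequisites must
    appear earlier in the sequence, and no step may be in the provider's
    forbidden set. Returns (ok, message)."""
    seen = set()
    for step in steps:
        if step in forbidden:
            return False, f"step '{step}' violates a provider rule"
        missing = dependencies.get(step, set()) - seen
        if missing:
            return False, f"step '{step}' missing prerequisites {sorted(missing)}"
        seen.add(step)
    return True, "sequence is consistent"

# Hypothetical customization steps and their internal dependencies.
deps = {"enable_plugin": {"set_schema"}, "set_theme": set(), "set_schema": set()}
ok, msg = verify_customization(
    ["set_schema", "enable_plugin", "set_theme"], deps, forbidden=set()
)
print(ok)   # True
bad, msg2 = verify_customization(["enable_plugin"], deps, forbidden=set())
print(bad)  # False: prerequisite 'set_schema' never ran
```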
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peck, T; Sparkman, D; Storch, N
"The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I" document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate timelines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations, and will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information from the "LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I" document to the LLNL ASCI software management and development staff, and provides steps for using that document. For definitions of terminology and acronyms, refer to its Glossary and Acronyms sections.
PNNL Future Power Grid Initiative-developed GridOPTICS Software System (GOSS)
None
2018-01-16
The power grid is changing and evolving. One aspect of this change is the growing use of smart meters and other devices, which are producing large volumes of useful data. However, in many cases, the data canât be translated quickly into actionable guidance to improve grid performance. There's a need for innovative tools. The GridOPTICS(TM) Software System, or GOSS, developed through PNNL's Future Power Grid Initiative, is open source and became publicly available in spring 2014. The value of this middleware is that it easily integrates grid applications with sources of data and facilitates communication between them. Such a capability provides a foundation for developing a range of applications to improve grid management.
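The middleware role described above, letting grid applications and data sources communicate without knowing about each other, is essentially publish/subscribe. A toy in-memory bus illustrates the pattern; the class, topic names, and threshold here are invented for illustration and are not the GOSS API:

```python
from collections import defaultdict

class MiniBus:
    """Toy publish/subscribe bus illustrating the middleware idea: data
    sources publish to topics, grid applications subscribe to them."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        for handler in self._subs[topic]:
            handler(message)

bus = MiniBus()
alerts = []
# A hypothetical voltage-monitoring application subscribes to meter data
# and records readings above a nominal limit.
bus.subscribe("meter/voltage", lambda v: alerts.append(v) if v > 126.0 else None)
bus.publish("meter/voltage", 124.8)   # within limits, ignored
bus.publish("meter/voltage", 127.3)   # over limit, recorded
print(alerts)  # [127.3]
```

The design point is decoupling: the meter publisher never references the monitoring application, so new applications can be added by subscribing to existing topics.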
Water Distribution System Risk Tool for Investment Planning (WaterRF Report 4332)
Product Description/Abstract The product consists of the Pipe Risk Screening Tool (PRST), and a report on the development and use of the tool. The PRST is a software-based screening aid to identify and rank candidate pipes for actions that range from active monitoring (including...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Meeting for... and Event Reporting AGENCY: Agency for Healthcare Research and Quality (AHRQ), HHS. ACTION: Notice of... reports; and Metadata registry--includes descriptive facts about information contained in the data...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-04
... returns, (3) tax software developers, (4) large and small business, (5) employers and payroll service... DEPARTMENT OF THE TREASURY Internal Revenue Service Open Season for Membership to the Electronic Tax Administration Advisory Committee (ETAAC) AGENCY: Internal Revenue Service (IRS), Treasury. ACTION...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-14
... returns, (3) tax software developers, (4) large and small business, (5) employers and payroll service... DEPARTMENT OF THE TREASURY Internal Revenue Service Open Season for Membership to the Electronic Tax Administration Advisory Committee (ETAAC) AGENCY: Internal Revenue Service (IRS), Treasury. ACTION...
Adaptable Computing Environment/Self-Assembling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.
Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software, able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Some key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouse cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (a sort of "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.
Self-service for software development projects and HPC activities
NASA Astrophysics Data System (ADS)
Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.
2014-05-01
This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source developments such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Choi, Sou-Cheng T.; Wilkins-Diehr, Nancy; Chue Hong, Neil; Venters, Colin C.; Howison, James; Seinstra, Frank; Jones, Matthew; Cranston, Karen; Clune, Thomas L.; de Val-Borro, Miguel; Littauer, Richard
2016-02-01
This technical report records and discusses the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2). The report includes a description of the alternative, experimental submission and review process, two workshop keynote presentations, a series of lightning talks, a discussion on sustainability, and five discussions from the topic areas of exploring sustainability; software development experiences; credit & incentives; reproducibility & reuse & sharing; and code testing & code review. For each topic, the report includes a list of tangible actions that were proposed and that would lead to potential change. The workshop recognized that reliance on scientific software is pervasive in all areas of world-leading research today. The workshop participants then proceeded to explore different perspectives on the concept of sustainability. Key enablers and barriers of sustainable scientific software were identified from their experiences. In addition, recommendations with new requirements such as software credit files and software prize frameworks were outlined for improving practices in sustainable software engineering. There was also broad consensus that formal training in software development or engineering was rare among the practitioners. Significant strides need to be made in building a sense of community via training in software and technical practices, on increasing their size and scope, and on better integrating them directly into graduate education programs. Finally, journals can define and publish policies to improve reproducibility, whereas reviewers can insist that authors provide sufficient information and access to data and software to allow them to reproduce the results in the paper. Hence a list of criteria is compiled for journals to provide to reviewers so as to make it easier to review software submitted for publication as a "Software Paper."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, Robert G.; Forsythe, James C.
Adaptive Thinking has been defined here as the capacity to recognize when a course of action that may have previously been effective is no longer effective and there is a need to adjust strategy. Research was undertaken with human test subjects to identify the factors that contribute to adaptive thinking. It was discovered that those most effective in settings that call for adaptive thinking tend to possess a superior capacity to quickly and effectively generate possible courses of action, as measured using the Category Generation test. Software developed for this research has been applied to develop capabilities enabling analysts to identify crucial factors that are predictive of outcomes in force-on-force simulation exercises.
NASA Software Engineering Benchmarking Study
NASA Technical Reports Server (NTRS)
Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.
2013-01-01
To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. 
One of NASA's strengths was its software assurance practices, which rated well in comparison to the other organizational groups and also covered a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, consists of items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues.
1. Develop and implement standard contract language for software procurements.
2. Advance accurate and trusted software cost estimates for both procured and in-house software, and improve the capture of actual cost data to facilitate further improvements.
3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort.
4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA.
5. Consolidate, collect and, if needed, develop common processes, principles, and other assets across the Agency in order to provide more consistency in software development and acquisition practices and to reduce the overall cost of maintaining or increasing current NASA CMMI maturity levels.
6. Provide additional support for small projects that includes: (a) guidance for appropriate tailoring of requirements for small projects, (b) availability of suitable tools, including support tool set-up and training, and (c) training for small project personnel, assurance personnel, and technical authorities on the acceptable options for tailoring requirements and performing assurance on small projects.
7. Develop software training classes for the more experienced software engineers using on-line training, videos, or small separate modules of training that can be accommodated as needed throughout a project.
8. Create guidelines to structure non-classroom training opportunities such as mentoring, peer reviews, lessons-learned sessions, and on-the-job training.
9. Develop a set of predictive software defect data and a process for assessing software testing metric data against it.
10. Assess Agency-wide licenses for commonly used software tools.
11. Fill the knowledge gap in common software engineering practices for new hires and co-ops.
12. Work through the Science, Technology, Engineering and Mathematics (STEM) program with universities to strengthen education in the use of common software engineering practices and standards.
13. Follow up this benchmark study with a deeper look into what both internal and external organizations perceive as the scope of software assurance, the value they expect to obtain from it, and the shortcomings they experience in current practice.
14. Continue interactions with the external software engineering environment through collaborations, knowledge sharing, and benchmarking.
OpenROCS: a software tool to control robotic observatories
NASA Astrophysics Data System (ADS)
Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere
2012-09-01
We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the processes necessary to respond to the system events arising in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provide high flexibility for adaptation to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).
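The event-action model described above, where responses to observatory events are pure configuration rather than hard-coded logic, can be sketched as a small dispatcher. The event and action names below are hypothetical examples, not OpenROCS identifiers, and a real system would invoke hardware drivers where this sketch only logs:

```python
# Event-action mapping as configuration, in the spirit of OpenROCS's
# abstract model: hardware specifics stay outside the dispatcher.
event_actions = {
    "rain_detected": ["close_dome", "park_telescope"],
    "target_ready": ["open_dome", "start_exposure"],
}

def handle(event, log):
    """Dispatch the configured actions for an event, recording each one."""
    for action in event_actions.get(event, []):
        log.append(action)  # a real system would call a driver interface here
    return log

log = handle("rain_detected", [])
print(log)  # ['close_dome', 'park_telescope']
```

Because the mapping lives in data (in OpenROCS, XML configuration files), adapting the system to a new observatory means editing configuration, not code.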
Overview of technology modeling in the Remedial Action Assessment System (RAAS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C.D.; Bagaasen, L.M.; Chan, T.C.
1994-08-01
There are numerous hazardous waste sites under the jurisdiction of the US Department of Energy (DOE). To assist the cleanup of these sites in a more consistent, timely, and cost-effective manner, the Remedial Action Assessment System (RAAS) is being developed by the Pacific Northwest Laboratory (PNL). RAAS is a software tool designed to automate the initial technology selection within the remedial investigation/feasibility study (RI/FS) process. The software does several things for the user: (1) provides information about available remedial technologies, (2) sorts possible technologies to recommend a list of technologies applicable to a given site, (3) points out technical issues that may prevent the implementation of a technology, and (4) provides an estimate of the effectiveness of a given technology at a particular site. Information from RAAS can be used to compare remediation options and guide selection of technologies for further study.
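Steps (2) and (4) above, filtering technologies by applicability to a site and ranking them by estimated effectiveness, reduce to a screen-and-sort. A minimal sketch with invented technology names and a single-attribute applicability test; the real RAAS uses far richer site and technology models:

```python
def screen_technologies(technologies, site):
    """Keep technologies applicable to the site's contaminant, then rank
    them by estimated effectiveness (illustrative heuristic only)."""
    applicable = [t for t in technologies if site["contaminant"] in t["treats"]]
    return sorted(applicable, key=lambda t: t["effectiveness"], reverse=True)

# Hypothetical technology catalog entries.
techs = [
    {"name": "soil vapor extraction", "treats": {"VOCs"}, "effectiveness": 0.8},
    {"name": "pump and treat", "treats": {"VOCs", "metals"}, "effectiveness": 0.6},
    {"name": "vitrification", "treats": {"metals"}, "effectiveness": 0.7},
]
ranked = screen_technologies(techs, {"contaminant": "VOCs"})
print([t["name"] for t in ranked])  # ['soil vapor extraction', 'pump and treat']
```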
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
... Special Committee 205/EUROCAE WG 71: Software Considerations in Aeronautical Systems AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of RTCA Special Committee 205/EUROCAE WG 71: Software... meeting of RTCA Special Committee 205/EUROCAE WG 71: Software Considerations in Aeronautical Systems...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-19
... Special Committee 205/EUROCAE WG 71: Software Considerations in Aeronautical Systems AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of RTCA Special Committee 205/EUROCAE WG 71: Software... meeting of RTCA Special Committee 205/EUROCAE WG 71: Software Considerations in Aeronautical Systems...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-16
... Committee 205/EUROCAE WG-71: Software Considerations in Aeronautical Systems AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of RTCA Special Committee 205/EUROCAE WG-71 meeting: Software... of RTCA Special Committee 205/EUROCAE WG-71: Software Considerations in Aeronautical Systems Agenda...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... Committee 205/EUROCAE WG-71: Software Considerations in Aeronautical Systems AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of RTCA Special Committee 205/EUROCAE WG-71 meeting: Software... of RTCA Special Committee 205/EUROCAE WG-71: Software Considerations in Aeronautical Systems Agenda...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION... Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses, with clarifications... Electrical and Electronic Engineers (IEEE) Standard 828-2005, ``IEEE Standard for Software Configuration...
[Development of integrated support software for clinical nutrition].
Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús
2015-09-01
Objective: to develop an integrated computer software application for specialized nutritional support, integrated into the electronic clinical record, which automatically and at an early stage detects patients who are undernourished or at risk of developing undernourishment, determining points of opportunity for improvement and evaluating the results. Methods: the quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations of the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) were taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. Results: the software makes it possible to conduct, in an automated way, a specific nutritional assessment of patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, following up and tracing the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice approaches the established standard. Conclusions: the software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and including the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Agency for Healthcare Research and Quality Meeting for... and Event Reporting AGENCY: Agency for Healthcare Research and Quality (AHRQ), HHS. ACTION: Notice of... held at the John M. Eisenberg Conference Center, Agency for Healthcare Research and Quality, 540...
ERIC Educational Resources Information Center
Wulfson, Stephen
1988-01-01
Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of the software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6)…
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer-Aided Design and Computer-Aided Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980s with respect to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990s. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers to be used, the software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
Object-oriented software design in semiautomatic building extraction
NASA Astrophysics Data System (ADS)
Guelch, Eberhard; Mueller, Hardo
1997-08-01
Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only for the data but also for the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent sequences of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.
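The MVC structure mentioned above hinges on views observing the model so that any controller-driven change propagates automatically. A minimal sketch of that wiring, with invented class names unrelated to the actual system:

```python
class Model:
    """Minimal MVC model: views register as observers and are notified
    whenever the data changes."""
    def __init__(self):
        self._observers = []
        self.buildings = []

    def attach(self, view):
        self._observers.append(view)

    def add_building(self, building):
        self.buildings.append(building)
        for view in self._observers:
            view.update(self)  # push the change to every registered view

class CountView:
    """A trivial view that just tracks how many buildings are acquired."""
    def __init__(self):
        self.shown = 0

    def update(self, model):
        self.shown = len(model.buildings)

model, view = Model(), CountView()
model.attach(view)
model.add_building({"id": 1, "height_m": 12.5})
print(view.shown)  # 1
```

The benefit for a system under constant evolution is exactly the one the abstract cites: a new module (say, texture visualization) becomes just another view attached to the same model, without touching acquisition code.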
Parallel software for lattice N = 4 supersymmetric Yang-Mills theory
NASA Astrophysics Data System (ADS)
Schaich, David; DeGrand, Thomas
2015-05-01
We present new parallel software, SUSY LATTICE, for lattice studies of four-dimensional N = 4 supersymmetric Yang-Mills theory with gauge group SU(N). The lattice action is constructed to exactly preserve a single supersymmetry charge at non-zero lattice spacing, up to additional potential terms included to stabilize numerical simulations. The software evolved from the MILC code for lattice QCD, and retains a similar large-scale framework despite the different target theory. Many routines are adapted from an existing serial code (Catterall and Joseph, 2012), which SUSY LATTICE supersedes. This paper provides an overview of the new parallel software, summarizing the lattice system, describing the applications that are currently provided and explaining their basic workflow for non-experts in lattice gauge theory. We discuss the parallel performance of the code, and highlight some notable aspects of the documentation for those interested in contributing to its future development.
Toward an integrated software platform for systems pharmacology
Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki
2013-01-01
Understanding complex biological systems requires the extensive support of computational tools. This is particularly true for systems pharmacology, which aims to understand the action of drugs and their interactions in a systems context. Computational models play an important role, as they can be viewed as explicit representations of biological hypotheses to be tested. A series of software and data resources are used for model development, verification, and exploration of possible behaviors of biological systems that may not be feasible or cost-effective to probe by experiment. Software platforms play a dominant role in supporting creativity and productivity and have transformed many industries; the same techniques can be applied to biology as well. Establishing an integrated software platform will be the next important step in the field. © 2013 The Authors. Biopharmaceutics & Drug Disposition published by John Wiley & Sons, Ltd. PMID:24150748
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-17
... Devices and Related Software; Notice of Investigation AGENCY: U.S. International Trade Commission. ACTION... of certain portable electronic devices and related software by reason of infringement of certain... after importation of certain portable electronic devices or related software that infringe one or more...
Evidence of early development of action planning in the human foetus: a kinematic study.
Zoia, Stefania; Blason, Laura; D'Ottavio, Giuseppina; Bulgheroni, Maria; Pezzetta, Eva; Scabar, Aldo; Castiello, Umberto
2007-01-01
The aim of the present study was to investigate whether foetal hand movements are planned and how they are executed. We performed a kinematic analysis of hand movements directed towards the mouth and the eyes in the foetuses of eight women with normally evolving pregnancies. At 14, 18 and 22 weeks of gestation, eight foetuses underwent a 20-min four-dimensional-ultrasound session. The video recordings for these movements were then imported into in-house software developed to perform kinematic analysis. We found that spatial and temporal characteristics of foetal movements are by no means uncoordinated or unpatterned. By 22 weeks of gestation the movements seem to show the recognizable form of intentional actions, with kinematic patterns that depend on the goal of the action, suggesting a surprisingly advanced level of motor planning.
Integrated System for Autonomous Science
NASA Technical Reports Server (NTRS)
Chien, Steve; Sherwood, Robert; Tran, Daniel; Cichy, Benjamin; Davies, Ashley; Castano, Rebecca; Rabideau, Gregg; Frye, Stuart; Trout, Bruce; Shulman, Seth;
2006-01-01
The New Millennium Program Space Technology 6 Project Autonomous Sciencecraft software implements an integrated system for autonomous planning and execution of scientific, engineering, and spacecraft-coordination actions. A prior version of this software was reported in "The TechSat 21 Autonomous Sciencecraft Experiment" (NPO-30784), NASA Tech Briefs, Vol. 28, No. 3 (March 2004), page 33. This software is now in continuous use aboard the Earth Orbiter 1 (EO-1) spacecraft mission and is being adapted for use in the Mars Odyssey and Mars Exploration Rovers missions. This software enables EO-1 to detect and respond to such events of scientific interest as volcanic activity, flooding, and freezing and thawing of water. It uses classification algorithms to analyze imagery onboard to detect changes, including events of scientific interest. Detection of such events triggers acquisition of follow-up imagery. The mission-planning component of the software develops a response plan that accounts for visibility of targets and operational constraints. The plan is then executed under control by a task-execution component of the software that is capable of responding to anomalies.
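The autonomy loop described above, onboard classification detecting a change, which then triggers follow-up imagery through the planner, can be sketched as a single cycle. Everything here (scene fields, the brightness-jump classifier, the request format) is a hypothetical stand-in for the flight software's actual algorithms:

```python
def onboard_cycle(scenes, classify, plan_followup):
    """One autonomy cycle: classify each scene onboard; any detected change
    triggers a follow-up observation request (sketch, not flight code)."""
    requests = []
    for scene in scenes:
        if classify(scene) == "change":
            requests.append(plan_followup(scene))
    return requests

# Toy classifier: a scene "changes" when its mean brightness jumps 50%
# above its baseline (e.g. fresh lava or floodwater brightening a target).
classify = lambda s: "change" if s["mean"] > s["baseline"] * 1.5 else "nominal"
plan = lambda s: {"target": s["id"], "action": "reimage"}

reqs = onboard_cycle(
    [{"id": "A", "mean": 10, "baseline": 9},
     {"id": "B", "mean": 30, "baseline": 12}],
    classify, plan,
)
print(reqs)  # [{'target': 'B', 'action': 'reimage'}]
```

In the real system the planning step additionally checks target visibility and operational constraints before committing the spacecraft to the follow-up observation.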
Empirical analysis on the human dynamics of blogging behavior on GitHub
NASA Astrophysics Data System (ADS)
Yan, Deng-Cheng; Wei, Zong-Wen; Han, Xiao-Pu; Wang, Bing-Hong
2017-01-01
GitHub is a social collaborative coding platform on which software developers not only collaborate on code but also share knowledge through blogs using GitHub Pages. In this article, we analyze the blogging behavior of software developers on GitHub Pages. The results show that both the commit number and the inter-event time between two consecutive blogging actions follow heavy-tailed distributions. We further observe significant variation in activity among individual developers, and a strongly positive correlation between activity and the power-law exponent of the inter-event time distribution. We also find a difference between user behavior on GitHub Pages and on other online systems, driven by the diversity of users and the length of contents. In addition, our results show a clear difference in burstiness between the majority of developers and elite developers.
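Power-law exponents for heavy-tailed inter-event times, like those reported here, are typically fitted by maximum likelihood. A minimal sketch of the standard continuous-tail estimator, alpha = 1 + n / sum(ln(x_i / x_min)); this is not the authors' actual pipeline, and `powerlaw_mle_alpha` is a hypothetical helper:

```python
import math
import random

def powerlaw_mle_alpha(times, xmin):
    """MLE for the exponent of a power-law tail over samples x >= xmin:
    alpha = 1 + n / sum(ln(x / xmin))."""
    tail = [t for t in times if t >= xmin]
    return 1.0 + len(tail) / sum(math.log(t / xmin) for t in tail)

# Synthetic inter-event times drawn from p(x) ~ x^(-2.5) by inverse transform
random.seed(0)
alpha_true = 2.5
samples = [(1 - random.random()) ** (-1 / (alpha_true - 1)) for _ in range(50000)]
alpha_hat = powerlaw_mle_alpha(samples, xmin=1.0)
```

On synthetic data the estimator recovers the planted exponent closely, which is the usual sanity check before applying it to empirical inter-event times.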
NASA Technical Reports Server (NTRS)
Quintana, Rolando
2003-01-01
The goal of this research was to integrate a previously validated and reliable safety model, the Continuous Hazard Tracking and Failure Prediction Methodology (CHTFPM), into a software application. This led to the development of a safety management information system (PSMIS). That is, the theory and principles of the CHTFPM were incorporated into a software package; hence, the PSMIS is referred to as the CHTFPM management information system (CHTFPM MIS). The purpose of the PSMIS is to reduce the time and manpower required to perform predictive studies and to facilitate the handling of the enormous quantities of information such studies generate. The CHTFPM theory approaches safety engineering from a new perspective: a proactive, rather than reactive, viewpoint. That is, corrective measures are taken before a problem occurs rather than after it has happened. The CHTFPM is therefore a predictive safety methodology: it foresees and anticipates accidents, system failures, and unacceptable risks so that corrective action can be taken to prevent them. Consequently, the safety and reliability of systems or processes can be further improved by taking proactive and timely corrective actions.
3D visualization software to analyze topological outcomes of topoisomerase reactions
Darcy, I. K.; Scharein, R. G.; Stasiak, A.
2008-01-01
The action of various DNA topoisomerases frequently results in characteristic changes in DNA topology. Important information for understanding mechanistic details of action of these topoisomerases can be provided by investigating the knot types resulting from topoisomerase action on circular DNA forming a particular knot type. Depending on the topological bias of a given topoisomerase reaction, one observes different subsets of knotted products. To establish the character of topological bias, one needs to be aware of all possible topological outcomes of intersegmental passages occurring within a given knot type. However, it is not trivial to systematically enumerate topological outcomes of strand passage from a given knot type. We present here a 3D visualization software (TopoICE-X in KnotPlot) that incorporates topological analysis methods in order to visualize, for example, knots that can be obtained from a given knot by one intersegmental passage. The software has several other options for the topological analysis of mechanisms of action of various topoisomerases. PMID:18440983
NASA Astrophysics Data System (ADS)
Ahmadia, A. J.; Kees, C. E.
2014-12-01
Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project.
We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and reproducibility for scientists working in the hydrological sciences. The HashDist documentation is available from: http://hashdist.readthedocs.org/en/latest/ HashDist is currently hosted at: https://github.com/hashdist/hashdist
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-15
... DEPARTMENT OF JUSTICE Antitrust Division United States, et al. v. Election Systems and Software... Columbia in United States, et al. v. Election Systems and Software Inc., Civil Action No. 10-00380. On... Systems and Software, Inc., (``ES&S'') of Premier Election Services, Inc., and PES Holdings, Inc. violated...
Absorbing Software Testing into the Scrum Method
NASA Astrophysics Data System (ADS)
Tuomikoski, Janne; Tervonen, Ilkka
In this paper we study how to absorb software testing into the Scrum method. We conducted the research as action research during the years 2007-2008, with three iterations. The results showed that testing can, and even should, be absorbed into the Scrum method. The testing team was merged into the Scrum teams. The teams can now deliver better working software in a shorter time, because testing keeps track of the progress of the development. Team spirit is also higher, because the Scrum team members are committed to the same goal. The biggest change from the test manager's point of view was the organized Product Owner Team. The test manager no longer has a testing team, and in the future all testing tasks have to be assigned through the Product Backlog.
ERIC Educational Resources Information Center
Read, Sue; Nte, Sol; Corcoran, Patsy; Stephens, Richard
2013-01-01
Background: Loss is a universal experience and death is perceived as the ultimate loss. The overarching aim of this research is to produce a qualitative, flexible, interactive, computerised tool to support the facilitation of emotional expressions around loss for people with intellectual disabilities. This paper explores the process of using…
Software System Architecture Modeling Methodology for Naval Gun Weapon Systems
2010-12-01
Weapon System HAR Hazard Action Report HERO Hazards of Electromagnetic Radiation to Ordnance IOC Initial Operational Capability... radiation to ordnance; and combinations therein. Equipment, systems, or procedures and processes whose malfunction would hazard the safe manufacturing...NDI Non-Development Item OPEVAL Operational Evaluation ORDALTS Ordnance Alterations O&SHA Operating and Support Hazard Analysis PDA
Games as an Artistic Medium: Investigating Complexity Thinking in Game-Based Art Pedagogy
ERIC Educational Resources Information Center
Patton, Ryan M.
2013-01-01
This action research study examines the making of video games, using an integrated development environment software program called GameMaker, as art education curriculum for students between the ages of 8-13. Through a method I designed, students created video games using the concepts of move, avoid, release, and contact (MARC) to explore their…
Odom, Laura; Christenbery, Tom
2016-11-01
Asthma burden affects mortality, morbidity, quality of life, and the economy. Written asthma action plans are standard of care according to national guidelines, but these plans are often not prescribed. The purpose of this project was to develop an asthma action plan application for smartphones. A development studio was consulted to code the software for the asthma action plan and to assist in the design process. During development, a survey was conducted to guide the design and functionality of the application. All survey participants agreed that the application was easy to use, could be used without written instruction, and was designed for adolescents with asthma of any severity. Patients and providers mostly agreed that the app would help provide information about what to do in the event of an asthma exacerbation, and that the application would be used frequently. There was consensus from both patients and providers that this application is not only functional but also helpful in the event of an asthma exacerbation. The project met the goal of designing a mobile phone application that would improve patient access to asthma action plans. ©2016 American Association of Nurse Practitioners.
75 FR 74081 - In the Matter of Certain Mobile Devices and Related Software; Notice of Investigation
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-30
... Related Software; Notice of Investigation AGENCY: U.S. International Trade Commission. ACTION: Institution... certain mobile devices and related software by reason of infringement of certain claims of U.S. Patent No... mobile devices and related software that infringe one or more of claims 1, 2, 10, 11, 24-26 and 29 of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-19
... Devices and Related Software; Notice of Investigation AGENCY: U.S. International Trade Commission. ACTION... certain digital imaging devices and related software by reason of infringement of certain claims of U.S... digital imaging devices and related software that infringe one or more of claim 1-3 and 5-8 of U.S. Patent...
Proscene: A feature-rich framework for interactive environments
NASA Astrophysics Data System (ADS)
Charalambos, Jean Pierre
We introduce Proscene, a feature-rich, open-source framework for interactive environments. The design of Proscene comprises a three-layered onion-like software architecture, promoting different possible development scenarios. The framework innermost layer decouples user gesture parsing from user-defined actions. The in-between layer implements a feature-rich set of widely-used motion actions allowing the selection and manipulation of objects, including the scene viewpoint. The outermost layer exposes those features as a Processing library. The results have shown the feasibility of our approach together with the simplicity and flexibility of the Proscene framework API.
Guidance on Software Maintenance. Final Report. Reports on Computer Science and Technology.
ERIC Educational Resources Information Center
Martin, Roger J.; Osborne, Wilma M.
Based on informal discussions with personnel at selected federal agencies and private sector organizations and on additional research, this publication addresses issues and problems of software maintenance and suggests actions and procedures which can help software maintenance organizations meet the growing demands of maintaining existing systems.…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
..., Associated Software, and Products Containing the Same AGENCY: U.S. International Trade Commission. ACTION..., components thereof, associated software, and products containing the same by reason of infringement of..., components thereof, associated software, and products containing the same that infringe one or more of claims...
77 FR 31758 - Airworthiness Directives; the Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
.... That NPRM proposed to inspect for part numbers of the operational program software of the flight... operational program software (OPS) of the flight control computers (FCC), and doing corrective actions if... previous NPRM (75 FR 57885, September 23, 2010), we have determined that the software installation required...
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches several functionalities: deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
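The Monte Carlo sampling of random events that such a PRA framework dispatches can be pictured with a generic sketch: sample random component failure times and estimate a failure probability. This illustrates the concept only; the function names are hypothetical and nothing here uses RAVEN's actual interface.

```python
import random

def monte_carlo_failure_prob(rate, mission_time, n_trials=100000, seed=1):
    """Estimate P(failure before mission_time) by sampling exponentially
    distributed failure times -- the kind of random-event sampling a PRA
    framework performs (illustrative sketch, not RAVEN's API)."""
    rng = random.Random(seed)
    failures = sum(rng.expovariate(rate) < mission_time for _ in range(n_trials))
    return failures / n_trials

# For rate 0.001/hr over 100 hr, the analytic value is 1 - exp(-0.1) ~ 0.095
p = monte_carlo_failure_prob(rate=0.001, mission_time=100.0)
```

A Dynamic Event Tree analysis extends this idea by branching the simulation at each sampled event rather than running independent trials.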
Toward a practical mobile robotic aid system for people with severe physical disabilities.
Regalbuto, M A; Krouskop, T A; Cheatham, J B
1992-01-01
A simple, relatively inexpensive robotic system that can aid severely disabled persons by providing pick-and-place manipulative abilities to augment the functions of human or trained animal assistants is under development at Rice University and the Baylor College of Medicine. A stand-alone software application program runs on a Macintosh personal computer and provides the user with a selection of interactive windows for commanding the mobile robot via cursor action. A HERO 2000 robot has been modified such that its workspace extends from the floor to tabletop heights, and the robot is interfaced to a Macintosh SE via a wireless communications link for untethered operation. Integrated into the system are hardware and software which allow the user to control household appliances in addition to the robot. A separate Machine Control Interface device converts breath action and head or other three-dimensional motion inputs into cursor signals. Preliminary in-home and laboratory testing has demonstrated the utility of the system to perform useful navigational and manipulative tasks.
ERIC Educational Resources Information Center
Zecca, Mark S.
2010-01-01
Business managers who look for ways to cut costs face difficult questions about the efficiency and effectiveness of software engineering practices that are used to complete projects on time, on specification, and within budget (Johnson, 1995; Lindstrom & Jeffries, 2004). Theoretical models such as the Theory of Reasoned Action (TRA) have linked…
Measurement of Software Project Management Effectiveness
2008-12-01
with technical, financial, policy, and non -technical concerns of stakeholders, to develop and select suitable risk control actions, and implementation...not intervene in their projects and therefore affect their views. In most research experiments, researchers apply a controlled event, method or...enable a consistency check among the responses and for other research purposes. Therefore, for the risk control area model, only the responses from
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Katherine A.; Murray, Regan; Bynum, Michael
Water utilities are vulnerable to a wide variety of human-caused and natural disasters. These disruptive events can result in loss of water service, contaminated water, pipe breaks, and failed equipment. Furthermore, long term changes in water supply and customer demand can have a large impact on the operating conditions of the network. The ability to maintain drinking water service during and following these types of events is critical. Simulation and analysis tools can help water utilities explore how their network will respond to disruptive events and plan effective mitigation strategies. The U.S. Environmental Protection Agency and Sandia National Laboratories are developing new software tools to meet this need. The Water Network Tool for Resilience (WNTR, pronounced winter) is a Python package designed to help water utilities investigate resilience of water distribution systems over a wide range of hazardous scenarios and to evaluate resilience-enhancing actions. The following documentation includes installation instructions and examples, description of software features, and software license. It is assumed that the reader is familiar with the Python Programming Language. References are included for additional background on software components. Online documentation, hosted at http://wntr.readthedocs.io/, will be updated as new features are added. The online version includes API documentation and information for developers.
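One simple resilience measure of the kind such tools report is the fraction of requested demand actually delivered across a disruption scenario. The sketch below is a hypothetical illustration of the concept, not WNTR's API (see the package's online documentation for that):

```python
def water_service_availability(demanded, delivered):
    """Fraction of total requested demand actually delivered over a scenario,
    a simple resilience metric (hypothetical helper, not WNTR's API)."""
    total_demand = sum(demanded)
    total_delivered = sum(min(d, s) for d, s in zip(demanded, delivered))
    return total_delivered / total_demand

# A pipe break cuts supply during hours 3-4 of a 5-hour scenario
demand   = [10.0, 10.0, 10.0, 10.0, 10.0]
supplied = [10.0, 10.0,  4.0,  4.0, 10.0]
wsa = water_service_availability(demand, supplied)  # 38/50 = 0.76
```

Comparing such metrics across simulated hazard scenarios is how resilience-enhancing actions (backup power, redundant pipes, repair crews) can be evaluated.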
NASA Technical Reports Server (NTRS)
Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)
2001-01-01
The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principle goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks); some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC) which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for design, development, implementation, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. 
The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall scheme to engine controller development is further improved and vehicle safety is further ensured. The final product that this paper proposes is an approach to development of an alternative low cost engine controller that would be capable of performing in unique vision spacecraft vehicles requiring low cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.
An Exploratory Study of Software Cost Estimating at the Electronic Systems Division.
1976-07-01
action to improve the software cost estimating process. While this research was limited to the ESD environment, the same types of problems may exist...Methods in Social Science. New York: Random House, 1969. 57. Smith, Ronald L. Structured Programming Series (Vol. XI) - Estimating Software Project
Management Aspects of Software Maintenance.
1984-09-01
educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. In this...maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be...complaint of error or request for modification is also studied in order to determine what action needs to be taken. 2. Define Objective and Approach:
NASA Astrophysics Data System (ADS)
Kunkel, K.; Dissen, J.; Easterling, D. R.; Kulkarni, A.; Akhtar, F. H.; Hayhoe, K.; Stoner, A. M. K.; Swaminathan, R.; Thrasher, B. L.
2017-12-01
As part of the Department of State U.S.-India Partnership for Climate Resilience (PCR), scientists from NOAA NCEI, CICS-NC, Texas Tech University (TTU), Stanford University (SU), and the Indian Institute of Tropical Meteorology (IITM) held a workshop at IITM in Pune, India during 7-9 March 2017 on the development, techniques and applications of downscaled climate projections. Workshop participants from TTU, SU, and IITM presented state-of-the-art climate downscaling techniques using the ARRM method, NASA NEX climate products, CORDEX-South Asia and analysis tools for resilience planning and sustainable development. PCR collaborators in attendance included Indian practitioners, researchers and other NGOs, including the WRI Partnership for Resilience and Preparedness (PREP), The Energy and Resources Institute (TERI), and NIH. The scientific techniques were provided to workshop participants in a software package written in R by TTU scientists, and several sessions were devoted to hands-on experience with the software package. The workshop further examined case studies on the use of downscaled climate data for decision making in a range of sectors, including human health, agriculture, and water resources management, as well as to inform the development of the India State Action Plans. This talk will discuss key outcomes, including information needs for downscaling climate projections, importance of QA/QC of the data, key findings from select case studies, and the importance of collaborations and partnerships to apply downscaling projections to help inform the development of the India State Action Plans.
An integrated ball projection technology for the study of dynamic interceptive actions.
Stone, J A; Panchuk, D; Davids, K; North, J S; Fairweather, I; Maynard, I W
2014-12-01
Dynamic interceptive actions, such as catching or hitting a ball, are important task vehicles for investigating the complex relationship between cognition, perception, and action in performance environments. Representative experimental designs have become more important recently, highlighting the need for research methods to ensure that the coupling of information and movement is faithfully maintained. However, retaining representative design while ensuring systematic control of experimental variables is challenging, due to the traditional tendency to employ methods that typically involve use of reductionist motor responses such as buttonpressing or micromovements. Here, we outline the methodology behind a custom-built, integrated ball projection technology that allows images of advanced visual information to be synchronized with ball projection. This integrated technology supports the controlled presentation of visual information to participants while they perform dynamic interceptive actions. We discuss theoretical ideas behind the integration of hardware and software, along with practical issues resolved in technological design, and emphasize how the system can be integrated with emerging developments such as mixed reality environments. We conclude by considering future developments and applications of the integrated projection technology for research in human movement behaviors.
2017-03-22
Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, US...2017 Available online 22 March 2017 Keywords: Plasmodium; Chloroquine; Metabolic network modeling; Redox metabolism; Carbon fixation. *Corresponding... available (Antony and Parija, 2016), their efficacy has declined appreciably in the last few decades owing to widespread drug resistance developed by the
ERIC Educational Resources Information Center
Williams, Lynda Patterson
The purpose of the study was to compare two methods of learning multiplication facts in order to develop speed and accuracy. The researcher conducted the action research project with a seventh grade enrichment class, which met for seven weeks during the school year. As part of the curriculum students were provided with activities to refine their…
Insider Threats in the Software Development Lifecycle
2014-11-05
employee, contractor, or other business partner who • has or had authorized access to an organization’s network, system or data and • intentionally...organization’s network, system, or data and who, through • their action/inaction without malicious intent • cause harm or substantially increase...and female Male Target Network, systems, or data PII or Customer Information IP (trade secrets) or Customer Information Access Used
A Lyapunov Function Based Remedial Action Screening Tool Using Real-Time Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitra, Joydeep; Ben-Idris, Mohammed; Faruque, Omar
This report summarizes the outcome of a research project that comprised the development of a Lyapunov function based remedial action screening tool using real-time data (L-RAS). The L-RAS is an advanced computational tool that is intended to assist system operators in making real-time redispatch decisions to preserve power grid stability. The tool relies on screening contingencies using a homotopy method based on Lyapunov functions to avoid, to the extent possible, the use of time domain simulations. This enables transient stability evaluation at real-time speed without the use of massively parallel computational resources. The project combined the following components. 1. Development of a methodology for contingency screening using a homotopy method based on Lyapunov functions and real-time data. 2. Development of a methodology for recommending remedial actions based on the screening results. 3. Development of a visualization and operator interaction interface. 4. Testing of screening tool, validation of control actions, and demonstration of project outcomes on a representative real system simulated on a Real-Time Digital Simulator (RTDS) cluster. The project was led by Michigan State University (MSU), where the theoretical models including homotopy-based screening, trajectory correction using real-time data, and remedial action were developed and implemented in the form of research-grade software. Los Alamos National Laboratory (LANL) contributed to the development of energy margin sensitivity dynamics, which constituted a part of the remedial action portfolio. Florida State University (FSU) and Southern California Edison (SCE) developed a model of the SCE system that was implemented on FSU's RTDS cluster to simulate real-time data that was streamed over the internet to MSU where the L-RAS tool was executed and remedial actions were communicated back to FSU to execute stabilizing controls on the simulated system.
LCG Consulting developed the visualization and operator interaction interface, based on specifications provided by MSU. The project was performed from October 2012 to December 2016, at the end of which the L-RAS tool, as described above, was completed and demonstrated. The project resulted in the following innovations and contributions: (a) the L-RAS software prototype, tested on a simulated system, vetted by utility personnel, and potentially ready for wider testing and commercialization; (b) an RTDS-based test bed that can be used for future research in the field; (c) a suite of breakthrough theoretical contributions to the field of power system stability and control; and (d) a new tool for visualization of power system stability margins. While detailed descriptions of the development and implementation of the various project components have been provided in the quarterly reports, this final report provides an overview of the complete project, and is demonstrated using public domain test systems commonly used in the literature. The SCE system, and demonstrations thereon, are not included in this report due to Critical Energy Infrastructure Information (CEII) restrictions.
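For intuition, the Lyapunov screening idea can be sketched on the textbook single-machine-infinite-bus model: a contingency screens as stable when the transient energy at fault clearing is below the energy at the controlling unstable equilibrium. This toy sketch is not the L-RAS implementation, which handles full multi-machine models with homotopy-based corrections; all numbers below are illustrative.

```python
import math

def energy_margin(delta, omega, M, Pm, Pmax):
    """Transient energy function for a single-machine-infinite-bus system:
    V = 0.5*M*omega^2 - Pm*(delta - delta_s) - Pmax*(cos(delta) - cos(delta_s)),
    zero at the stable equilibrium (delta_s, 0)."""
    delta_s = math.asin(Pm / Pmax)  # stable equilibrium angle
    kinetic = 0.5 * M * omega ** 2
    potential = (-Pm * (delta - delta_s)
                 - Pmax * (math.cos(delta) - math.cos(delta_s)))
    return kinetic + potential

def critical_energy(Pm, Pmax):
    """Energy at the controlling unstable equilibrium delta_u = pi - delta_s."""
    delta_s = math.asin(Pm / Pmax)
    return -Pm * (math.pi - 2 * delta_s) + 2 * Pmax * math.cos(delta_s)

# Screen a contingency: stable if the energy at fault clearing stays below
# the critical energy (illustrative parameters, per-unit)
M, Pm, Pmax = 0.1, 0.8, 2.0
v_clear = energy_margin(delta=0.9, omega=1.0, M=M, Pm=Pm, Pmax=Pmax)
stable = v_clear < critical_energy(Pm, Pmax)
```

The appeal of this energy-function test, as the report notes, is that it avoids a full time-domain simulation for contingencies that screen clearly stable or unstable.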
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-18
... Thereof and Associated Software AGENCY: U.S. International Trade Commission. ACTION: Notice, Institution... thereof and associated software, by reason of infringement of U.S. Patent No. 8,028,323 (``the `323 patent... mobile phones, components thereof and associated software by reason of infringement of one or more of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-03
... define the boundaries of the patent property right. Software by its nature is operation-based and is... elements of software are often defined using functional language. While it is permissible to use functional... Software-Related Patents AGENCY: United States Patent and Trademark Office, Commerce. ACTION: Request for...
The relational database model and multiple multicenter clinical trials.
Blumenstein, B A
1989-12-01
The Southwest Oncology Group (SWOG) chose to use a relational database management system (RDBMS) for the management of data from multiple clinical trials because of the underlying relational model's inherent flexibility and the natural way multiple entity types (patients, studies, and participants) can be accommodated. The tradeoffs to using the relational model as compared to using the hierarchical model include added computing cycles due to deferred data linkages and added procedural complexity due to the necessity of implementing protections against referential integrity violations. The SWOG uses its RDBMS as a platform on which to build data operations software. This data operations software, which is written in a compiled computer language, allows multiple users to simultaneously update the database and is interactive with respect to the detection of conditions requiring action and the presentation of options for dealing with those conditions. The relational model facilitates the development and maintenance of data operations software.
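The referential-integrity protections described above (for example, a participant record must reference an existing study) are what a foreign-key constraint enforces in a modern RDBMS. A minimal sketch using SQLite; the schema is a hypothetical illustration, not SWOG's:

```python
import sqlite3

# Illustrative two-table schema: participants must reference an existing study
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked
con.execute("CREATE TABLE study (study_id INTEGER PRIMARY KEY, title TEXT)")
con.execute("""CREATE TABLE participant (
    patient_id INTEGER,
    study_id INTEGER NOT NULL REFERENCES study(study_id))""")
con.execute("INSERT INTO study VALUES (1, 'Trial A')")
con.execute("INSERT INTO participant VALUES (100, 1)")  # valid link
try:
    con.execute("INSERT INTO participant VALUES (101, 99)")  # dangling link
    violation_blocked = False
except sqlite3.IntegrityError:
    violation_blocked = True
```

In 1989 such protections had to be implemented procedurally in the data operations software, which is the "added procedural complexity" the abstract refers to.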
Gesture Based Control and EMG Decomposition
NASA Technical Reports Server (NTRS)
Wheeler, Kevin R.; Chang, Mindy H.; Knuth, Kevin H.
2005-01-01
This paper presents two probabilistic developments for use with Electromyograms (EMG). First described is a neuro-electric interface for virtual device control based on gesture recognition. The second development is a Bayesian method for decomposing EMG into individual motor unit action potentials. This more complex technique will then allow for higher resolution in separating muscle groups for gesture recognition. All examples presented rely upon sampling EMG data from a subject's forearm. The gesture based recognition uses pattern recognition software that has been trained to identify gestures from among a given set of gestures. The pattern recognition software consists of hidden Markov models, which are used to recognize the gestures as they are being performed in real-time from moving averages of EMG. Two experiments were conducted to examine the feasibility of this interface technology. The first replicated a virtual joystick interface, and the second replicated a keyboard. Moving averages of EMG do not provide easy distinction between fine muscle groups. To better distinguish between different fine motor skill muscle groups we present a Bayesian algorithm to separate surface EMG into representative motor unit action potentials. The algorithm is based upon differential Variable Component Analysis (dVCA) [1], [2], which was originally developed for Electroencephalograms. The algorithm uses a simple forward model representing a mixture of motor unit action potentials as seen across multiple channels. The parameters of this model are iteratively optimized for each component. Results are presented on both synthetic and experimental EMG data. The synthetic case has additive white noise and is compared with known components. The experimental EMG data was obtained using a custom linear electrode array designed for this study.
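The moving-average EMG features mentioned above amount to rectifying the raw signal and smoothing it over a short window before it is fed to the hidden Markov models. A minimal sketch; the window length and data are illustrative, not the study's parameters:

```python
def moving_average(signal, window):
    """Rectify a raw EMG trace and smooth it with a sliding mean --
    the simple envelope feature fed to a gesture recognizer."""
    rectified = [abs(x) for x in signal]
    return [sum(rectified[i:i + window]) / window
            for i in range(len(rectified) - window + 1)]

emg = [0.1, -0.4, 0.2, -0.1, 0.5, -0.3]
envelope = moving_average(emg, window=3)  # four overlapping 3-sample means
```

Because this envelope blurs fine temporal structure, it cannot separate closely spaced motor units, which is what motivates the Bayesian decomposition in the second half of the paper.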
Tessera: Open source software for accelerated data science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sego, Landon H.; Hafen, Ryan P.; Director, Hannah M.
2014-06-30
Extracting useful, actionable information from data can be a formidable challenge for the safeguards, nonproliferation, and arms control verification communities. Data scientists are often on the front lines of making sense of complex and large datasets. They require flexible tools that make it easy to rapidly reformat large datasets, interactively explore and visualize data, develop statistical algorithms, and validate their approaches, and they need to perform these activities with minimal lines of code. Existing commercial software solutions often lack extensibility and the flexibility required to address the nuances of the demanding and dynamic environments where data scientists work. To address this need, Pacific Northwest National Laboratory developed Tessera, an open source software suite designed to enable data scientists to interactively perform their craft at the terabyte scale. Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving insight from data. We illustrate the use of Tessera with an example analysis of computer network data.
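Tessera is built around the divide-and-recombine approach: split a large dataset into subsets, apply an analytic method to each independently, then recombine the per-subset results. A minimal sketch of the idea (not Tessera's actual API, and with hypothetical network-flow records):

```python
# Divide-and-recombine in miniature: the per-group computations are
# independent, so at scale they can run distributed across a cluster.
def divide(records, key):
    groups = {}
    for rec in records:
        groups.setdefault(key(rec), []).append(rec)
    return groups

def recombine(groups, analytic):
    return {k: analytic(v) for k, v in groups.items()}

# Hypothetical network-flow records: (host, bytes_transferred)
flows = [("a", 10), ("b", 5), ("a", 30), ("b", 7), ("a", 20)]
per_host = recombine(divide(flows, key=lambda r: r[0]),
                     analytic=lambda recs: sum(b for _, b in recs) / len(recs))
```

Because each group's analytic runs independently, the framework can manage distributed storage and computation behind this interface, which is the property the abstract highlights.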
Air Traffic Control: Immature Software Acquisition Processes Increase FAA System Acquisition Risks
DOT National Transportation Integrated Search
1997-03-01
At the request of Congress, the General Accounting Office (GAO) reviewed (1) the maturity of the Federal Aviation Administration's (FAA's) Air Traffic Control (ATC) modernization software acquisition processes, and (2) the steps/actions FAA has unde...
Platform for Postprocessing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Don
2008-01-01
Taking advantage of the similarities that exist among all waveform-based non-destructive evaluation (NDE) methods, a common software platform has been developed containing multiple signal- and image-processing techniques for waveforms and images. The NASA NDE Signal and Image Processing software has been developed using the latest versions of LabVIEW and its associated Advanced Signal Processing and Vision Toolkits. The software is usable on a PC with Windows XP and Windows Vista. The software has been designed with a commercial-grade interface in which two main windows, Waveform Window and Image Window, are displayed if the user chooses a waveform file to display. Within these two main windows, most actions are chosen through logically conceived run-time menus. The Waveform Window has plots for both the raw time-domain waves and their frequency-domain transformations (fast Fourier transform and power spectral density). The Image Window shows the C-scan image formed from information of the time-domain waveform (such as peak amplitude) or its frequency-domain transformation at each scan location. The user also has the ability to open an image, or series of images, or a simple set of X-Y paired data in text format. Each of the Waveform and Image Windows contains menus from which to perform many user actions. An option exists to use raw waves obtained directly from scan, or waves after deconvolution if system wave response is provided. Two types of deconvolution, time-based subtraction or inverse-filter, can be performed to arrive at a deconvolved wave set. Additionally, the menu on the Waveform Window allows preprocessing of waveforms prior to image formation, scaling and display of waveforms, formation of different types of images (including non-standard types such as velocity), gating of portions of waves prior to image formation, and several other miscellaneous and specialized operations.
The menu available on the Image Window allows many further image processing and analysis operations, some of which are found in commercially available image-processing software programs (such as Adobe Photoshop), and some that are not (removing outliers, B-scan information, region-of-interest analysis, line profiles, and precision feature measurements).
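The inverse-filter deconvolution option mentioned above divides the measured wave's spectrum by the system response's spectrum. A minimal frequency-domain sketch, where the small regularizer eps is a hypothetical choice (practical implementations often use a Wiener-style term) to keep near-zero response bins from blowing up:

```python
import numpy as np

def inverse_filter_deconvolve(measured, system_response, eps=1e-3):
    """Recover the underlying wave by spectral division, regularized so
    that frequency bins where the system response is weak stay stable."""
    n = len(measured)
    M = np.fft.fft(measured, n)
    H = np.fft.fft(system_response, n)
    D = M * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft(D))

# Synthetic check: convolve a known pulse with a response, then recover it.
pulse = np.zeros(128); pulse[20] = 1.0
response = np.exp(-np.arange(128) / 4.0)        # decaying system response
measured = np.real(np.fft.ifft(np.fft.fft(pulse) * np.fft.fft(response)))
recovered = inverse_filter_deconvolve(measured, response)
```

The recovered wave concentrates back at the pulse location, which is the point of deconvolving before image formation: sharper time localization yields crisper C-scan features.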
NASA Astrophysics Data System (ADS)
Tsyba, E.; Kaufman, M.
2015-08-01
Preparatory work for resuming operational calculation of the Earth rotation parameters from satellite laser ranging data processing (LAGEOS 1, LAGEOS 2) was to be completed at the Main Metrology Centre of the State Time and Frequency Service (VNIIFTRI) in 2014. For this purpose, the BERNESE 5.2 software (Dach & Walser, 2014) was chosen as the base software; it has been used for many years at the Main Metrological Centre of the State Time and Frequency Service to process phase observations of GLONASS and GPS satellites. Although the announced description of the BERNESE 5.2 software declares support for SLR data processing, that support has not been fully implemented. In particular, there is no such essential element as a corrective action (as an input or resulting parameter) in the local time scale ("time bias"), etc. Therefore, additional program blocks have been developed and integrated into the BERNESE 5.2 software environment. The program blocks are written in the Perl and Matlab programming languages and can be used on both Windows and Linux, on 32-bit and 64-bit platforms.
Hernández, Oscar E; Zurek, Eduardo E
2013-05-15
We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimuli frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence. The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon.
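Among the parameters SENB reports are the sodium and potassium equilibrium potentials, which follow from the Nernst equation. A sketch of that calculation; the concentrations and temperature are typical squid-axon textbook values, not outputs of the tool itself.

```python
import math

def nernst_potential_mV(conc_out, conc_in, z=1, T=279.45):
    """Nernst equilibrium potential in millivolts. The default temperature
    is 6.3 degrees C (279.45 K), the classic squid-axon experimental
    condition; z is the ion valence."""
    R, F = 8.314, 96485.0          # J/(mol*K), C/mol
    return 1000.0 * (R * T) / (z * F) * math.log(conc_out / conc_in)

# Typical squid giant axon concentrations (mM), textbook values:
E_Na = nernst_potential_mV(conc_out=440.0, conc_in=50.0)   # sodium
E_K  = nernst_potential_mV(conc_out=20.0,  conc_in=400.0)  # potassium
```

The positive sodium and negative potassium potentials bound the swing of the action potential, which is why a teaching tool benefits from showing them alongside the simulated voltage traces.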
1992-04-01
contractor's existing data collection, analysis and corrective action system shall be utilized, with modification only as necessary to meet the... either from test or from analysis of field data. The procedures of MIL-STD-756B assume that the reliability of a... to generate sufficient data to report a statistically valid reliability figure for a class of software. Casual data gathering accumulates data more...
Design implications for task-specific search utilities for retrieval and re-engineering of code
NASA Astrophysics Data System (ADS)
Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif
2017-05-01
The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context, such as development language, technology framework, goal of the project, project complexity and the developer's domain expertise. They also impose an additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
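Implicit relevance feedback from "retention actions" can be sketched as a simple weighted aggregation; the action names and weights below are illustrative assumptions, not values from the study.

```python
# Hypothetical retention actions and weights: actions that suggest the
# developer kept or reused a result count more than a passive visit.
ACTION_WEIGHTS = {"copy_snippet": 3.0, "bookmark": 2.0, "long_dwell": 1.0}

def relevance_score(action_log):
    """Aggregate observed retention actions per search result into a
    score usable for re-ranking future searches."""
    scores = {}
    for result_id, action in action_log:
        scores[result_id] = scores.get(result_id, 0.0) + ACTION_WEIGHTS.get(action, 0.0)
    return scores

log = [("r1", "long_dwell"), ("r2", "copy_snippet"),
       ("r2", "bookmark"), ("r1", "click")]       # "click" carries no weight
scores = relevance_score(log)
ranked = sorted(scores, key=scores.get, reverse=True)
```

Because the signal is collected passively, no explicit relevance judgments are required from the developer, which is what makes this kind of feedback attractive for embedding in an IDE or search interface.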
Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis
2016-03-01
Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from the literature and from existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Functional appropriateness, a sub-characteristic of Functional suitability. The characteristic Operability is affected by 95% of the requirements, while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.
Programming biological models in Python using PySB.
Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K
2013-01-01
Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
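The "macro" idea at the heart of PySB can be illustrated in a few lines of plain Python (a toy sketch of the concept, not PySB's actual API): a macro is an ordinary function that expands one biochemical action into its elementary rules, and because the model is a program, composing actions is just calling functions. The species names echo the apoptosis setting mentioned in the abstract but are purely illustrative.

```python
# Toy rule-based macros: each returns (rule_string, rate_constant) pairs.
def bind(a, b, kf, kr):
    """Reversible binding expanded into forward and reverse rules."""
    complex_name = f"{a}:{b}"
    return [(f"{a} + {b} -> {complex_name}", kf),
            (f"{complex_name} -> {a} + {b}", kr)]

def catalyze(enzyme, substrate, product, kf, kr, kcat):
    """Catalysis expressed by reusing the bind macro plus a turnover rule."""
    rules = bind(enzyme, substrate, kf, kr)
    rules.append((f"{enzyme}:{substrate} -> {enzyme} + {product}", kcat))
    return rules

# One high-level action expands to three elementary rules:
model = catalyze("C8", "Bid", "tBid", kf=1e-7, kr=1e-3, kcat=1.0)
```

Encoding actions as reusable functions is what lets models be versioned, diffed, and tested like any other source code, the key advantage the paper claims for programs-as-models.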
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology.
The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
JPRS Report, Science & Technology, USSR: Computers.
1988-12-08
Shlezinger, G. L. Gimelfarb; UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, No 6, Nov-Dec 87] SOFTWARE: Selected Software: Concise... Examples of conditions: 1. A > 0; 2. a in B; 3. Input of data. Examples of actions: 1. A := A + 1; 2. F(X, Y); 3. ...
Software Design Document PVD CSCI (3). Volume 2, Appendices
1991-06-01
FUNCTION: generic_type(Type, Overline). Called by: show_overline in overlineif.c, boundary_action in ovline-func.c, departure_action in... departure_action 2.8.2.2-28; departure_label 2.8.2.2-27; done_create_action 2.8.2.2-39; generic_size 2.8.2.2-17; generic_type 2.8.2.2-24
1991-12-01
management and engineering issues common to the military-industrial complex; to learn from past experience; to understand future software... prospective policy documents. Prepare a draft issue paper and presentation for the DAE. These items should address the key implementation issues with respect to MCCR software metrics and establish a clear need for DAE support. Long Term Actions (past 12-18 months)... Draft final implementation
Repository-based software engineering program
NASA Technical Reports Server (NTRS)
Wilson, James
1992-01-01
The activities performed during September 1992 in support of Tasks 01 and 02 of the Repository-Based Software Engineering Program are outlined. The recommendations and implementation strategy defined at the September 9-10 meeting of the Reuse Acquisition Action Team (RAAT) are attached along with the viewgraphs and reference information presented at the Institute for Defense Analyses brief on legal and patent issues related to software reuse.
Sarkar, V; Gutierrez, A N; Stathakis, S; Swanson, G P; Papanikolaou, N
2009-01-01
The purpose of this project was to develop a software platform to produce a virtual fluoroscopic image as an aid for permanent prostate seed implants. Seed location information from a pre-plan was extracted and used as input to in-house developed software to produce a virtual fluoroscopic image. In order to account for differences in patient positioning on the day of treatment, the user was given the ability to make changes to the virtual image. The system has been shown to work as expected for all test cases. The system allows for quick (on average less than 10 sec) generation of a virtual fluoroscopic image of the planned seed pattern. The image can be used as a verification tool to aid the physician in evaluating how close the implant is to the planned distribution throughout the procedure and enable remedial action should a large deviation be observed.
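Generating a virtual fluoroscopic image from planned seed coordinates amounts to projecting 3-D points through an idealized point source onto a 2-D detector. A minimal sketch; the geometry values and coordinate convention are hypothetical, not taken from the paper.

```python
import numpy as np

def project_seeds(seeds_mm, source_to_detector=1000.0, source_to_seed_plane=600.0):
    """Idealized point-source projection of planned 3-D seed positions
    (x, y, z in mm, z measured toward the detector) onto a 2-D virtual
    fluoroscopic image. Geometry values are illustrative assumptions."""
    seeds = np.asarray(seeds_mm, dtype=float)
    depth = source_to_seed_plane + seeds[:, 2]   # distance from source
    mag = source_to_detector / depth             # per-seed magnification
    return seeds[:, :2] * mag[:, None]

# Two seeds at the same lateral offset but different depths project
# to different image positions (the deeper seed is magnified less):
plan = [(10.0, 0.0, 0.0), (10.0, 0.0, 60.0)]
image_xy = project_seeds(plan)
```

Shifting or rotating the seed coordinates before projection would model the day-of-treatment positioning adjustments the abstract says the user can make.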
Elmore, Kim; Flanagan, Barry; Jones, Nicholas F; Heitgerd, Janet L
2010-04-01
In 2008, CDC convened an expert panel to gather input on the use of geospatial science in surveillance, research and program activities focused on CDC's Healthy Communities Goal. The panel suggested six priorities: spatially enable and strengthen public health surveillance infrastructure; develop metrics for geospatial categorization of community health and health inequity; evaluate the feasibility and validity of standard metrics of community health and health inequities; support and develop GIScience and geospatial analysis; provide geospatial capacity building, training and education; and, engage non-traditional partners. Following the meeting, the strategies and action items suggested by the expert panel were reviewed by a CDC subcommittee to determine priorities relative to ongoing CDC geospatial activities, recognizing that many activities may need to occur either in parallel, or occur multiple times across phases. Phase A of the action items centers on developing leadership support. Phase B focuses on developing internal and external capacity in both physical (e.g., software and hardware) and intellectual infrastructure. Phase C of the action items plan concerns the development and integration of geospatial methods. In summary, the panel members provided critical input to the development of CDC's strategic thinking on integrating geospatial methods and research issues across program efforts in support of its Healthy Communities Goal.
A fouling monitor alarm to prevent forced outages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, R.E.; Hickinbotham, A.; Fang, T.C.
2000-07-01
Many utilities rely on coal blending to meet emissions and boiler performance goals, but the increased variability in coal quality can adversely impact ash deposition and soot blowing requirements. Other utilities are experimenting with lower quality coals and burner zone blending of coals fired from different bunkers as part of a deregulation strategy to reduce fuel costs. However, these strategies can lead to slagging/fouling episodes, a possible outage, or a decrease in unit availability if boiler operations are not carefully monitored. This paper summarizes the development of software to monitor boiler fouling and to provide an advanced warning to the control operators when a fouling episode is imminent. With adequate warning, preemptive action can be taken (e.g., soot blowing, a change in coal blend, etc.) to potentially avoid a costly outage. The software utilizes a unique combination of combustion diagnostic techniques and convective section heat absorption analyses to identify boiler operating conditions where ash deposition rates may be high and conducive to triggering a fouling episode. The paper outlines the history of the fouling problem and the implementation of the software on Wabamun Unit 4, a tangentially-fired unit with relatively narrow reheat tube spacing. The unit had a tendency to foul when burning a high alkaline (but low ash) coal seam. The paper discusses the software development, implementation, and data acquisition activities. Preliminary test results are provided for Wabamun 4 and for Sundance Units 1 and 2, where the software was recently installed.
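The alarm logic can be sketched as a drop in recent convective-pass heat absorption relative to a clean-condition baseline; the window and threshold below are illustrative assumptions, and the deployed software additionally combined combustion diagnostics with this kind of trend analysis.

```python
def fouling_alarm(heat_absorption, window=5, drop_fraction=0.10):
    """Flag an imminent fouling episode when the mean of the most recent
    readings falls more than drop_fraction below the clean-condition
    baseline (taken here as the first window of readings)."""
    baseline = sum(heat_absorption[:window]) / window
    recent = sum(heat_absorption[-window:]) / window
    return recent < (1.0 - drop_fraction) * baseline

# Hypothetical heat-absorption readings (arbitrary units):
clean = [100.0, 101.0, 99.0, 100.0, 100.0]
fouled = clean + [96.0, 92.0, 89.0, 87.0, 85.0]   # progressive deposition
```

Firing the alarm on the trend, rather than on an absolute value, is what gives operators time for preemptive soot blowing or a blend change before an outage.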
A study of interactive control scheduling and economic assessment for robotic systems
NASA Technical Reports Server (NTRS)
1982-01-01
A class of interactive control systems is derived by generalizing interactive manipulator control systems. Tasks of interactive control systems can be represented as a network of a finite set of actions which have specific operational characteristics and specific resource requirements, and which are of limited duration. This enables the overall control algorithm to be decomposed into actions that execute simultaneously and asynchronously. The performance benefits of sensor-referenced and computer-aided control of manipulators in a complex environment are evaluated. The first phase of the CURV arm control system software development and the basic features of the control algorithms and their software implementation are presented. An optimal solution for a production scheduling problem that will be easy to implement in practical situations is investigated.
Digitalisierung - Management Zwischen 0 und 1
NASA Astrophysics Data System (ADS)
Friedrich, Stefan; Rachholz, Josef
2017-09-01
Digitization, the process of expressing actions and values in the codes 0 and 1, has already become part of our lives. Digitization enables enterprises to improve production and sales and to increase production volume. However, no standard digitization strategy has yet been developed. Even in a digitized business process management system, the most important role remains with the human being. Improving software products, their availability, and education in the introduction and use of information technology is thus a striking feature in the development of today's management (and other) processes.
Embracing Open Source for NASA's Earth Science Data Systems
NASA Technical Reports Server (NTRS)
Baynes, Katie; Pilone, Dan; Boller, Ryan; Meyer, David; Murphy, Kevin
2017-01-01
The overarching purpose of NASA's Earth Science program is to develop a scientific understanding of Earth as a system. Scientific knowledge is most robust and actionable when resulting from transparent, traceable, and reproducible methods. Reproducibility includes open access to the data as well as the software used to arrive at results. Additionally, software that is custom-developed for NASA should be open to the greatest degree possible, to enable re-use across Federal agencies, reduce overall costs to the government, remove barriers to innovation, and promote consistency through the use of uniform standards. Finally, Open Source Software (OSS) practices facilitate collaboration between agencies and the private sector. To best meet these ends, NASA's Earth Science Division promotes the full and open sharing of not only all data, metadata, products, information, documentation, models, images, and research results but also the source code used to generate, manipulate and analyze them. This talk focuses on the challenges to open sourcing NASA-developed software within ESD and the growing pains associated with establishing policies running the gamut of tracking issues, properly documenting build processes, engaging the open source community, maintaining internal compliance, and accepting contributions from external sources. This talk also covers the adoption of existing open source technologies and standards to enhance our custom solutions and our contributions back to the community. Finally, we will be introducing the most recent OSS contributions from NASA's Earth Science program and promoting these projects for wider community review and adoption.
ERIC Educational Resources Information Center
Ramaswami, Rama
2008-01-01
Human Resources (HR) administrators are finding that as software modules are installed to automate various processes, they have more time to focus on strategic objectives. And as compliance with affirmative action and other employment regulations comes under increasing scrutiny, HR staffers are finding that software can deliver and track data with…
ERIC Educational Resources Information Center
Gladhart, Marsha A.
1994-01-01
Reviews two computer software programs for children: (1) "Ready, Set, Read with Bananas and Jack" (Sierra Discovery Series), available for Windows or Macintosh systems, which uses animation and sound to teach early reading skills; and (2) "Word Connection" (Action Software), a Macintosh program that creates word puzzles. (MDM)
78 FR 45515 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
...] Submission for OMB Review; Comment Request ACTION: Notice. The Defense Acquisition Regulations System has... Supplement (DFARS) Subpart 227.71, Rights in Technical Data, and Subpart 227.72, Rights in Computer Software and Computer Software Documentation, and related provisions and clauses of the Defense Federal...
Electronic Design Automation (EDA) Roadmap Taskforce Report, Design of Microprocessors
1999-04-01
through on time. Hence, the study is not a crystal-ball-gazing exercise, but a rigorous, schedulable plan of action to attain the goal. NTRS97... formats so as not to impose too heavy a maintenance burden on their users. Object interfaces eliminate these problems: a tool that binds the interface... and User Interface - Design Tool Communication - EDA System Extension Language - EDA Standards-Based Software Development Environment - Design and
Software Technology for Adaptable Reliable Systems (STARS) Workshop March 24-27 1986.
1986-03-01
syntax is augmented to accept design notes in arbitrary... monitor program behavior. Trace and single-step facilities will provide the capability... capabilities of these workstations make them a logical choice for hosting a visual development environment. The final component of Vise is the... the following capabilities: graphical program display and... When the user picks the desired action, linguistic analysis is used to extract informa
New Technologies and the World Ahead: The Top 20 Plus 5
2011-01-01
Specialized Agent Software Programs. Bots represent the next great milestone in software development. The general deployment of bots is projected to be in... knowledge and areas of interest. Powerful personal-agent programs will search the Internet and its databases based on... language-capable chatbot and avatar interfaces that can control electronic data and also change and manipulate things in the physical world. These
The Navy’s Management of Software Licenses Needs Improvement
2013-08-07
Enterprise Software Licensing (ESL) as a primary DON efficiency target. Through policy and Integrated Product Team actions, this efficiency... review, as well as with DoD Enterprise Software Initiative (ESI) Blanket Purchase Agreements and any related Federal Acquisition Regulation and General... organizational and multi-functional DON ESL team. The DON is also participating in DoD-level enterprise software licensing projects through the DoD
NASA Astrophysics Data System (ADS)
Pajewski, Lara; Benedetto, Andrea; Schettini, Giuseppe; Soldovieri, Francesco
2013-04-01
Ground Penetrating Radar (GPR) is a safe, non-destructive and non-invasive imaging technique that can be effectively used for advanced inspection of composite structures and for diagnostics affecting the whole life-cycle of civil engineering works. GPR can also be successfully employed in archaeological prospecting and cultural heritage diagnostics. In many countries where the archaeological patrimony is of outstanding value (such as Egypt, Israel, Greece, and Central and South America), GPR is usually employed both as a diagnostic tool for the preventive detection of archaeological structures and as the most advanced instrument able to prospect the geometry and shape of valuable underground sites. However, many uncertainties persist, because of several difficulties and ambiguities due to the complexity of image processing in heterogeneous environments. It is possible to identify three main areas in the GPR field that have to be addressed in order to promote the use of this technology in archaeological prospecting and cultural heritage diagnostics. These are: (a) increasing system sensitivity to enable usability in a wider range of conditions, since archaeological sites are often located in impervious and critical environments; (b) researching novel data-processing algorithms and analysis tools for the interpretation of GPR results; (c) contributing to the development of new standards and guidelines and to the training of end users, which will also help to increase the awareness of operators. It is also important to further investigate and promote a combined use of GPR with other non-invasive advanced techniques typically used in archaeological investigation. In this framework, the COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar", proposed by a research team of "Roma Tre" University, Rome, Italy, was approved in November 2012 and is going to start in April 2013.
It is an ambitious 4-year project already involving 17 European countries (AT, BE, CH, CZ, DE, EL, ES, FI, FR, HR, IT, NL, NO, PL, PT, TR, UK), as well as Australia and the U.S.A. The project will be developed within the frame of a unique approach, based on the integrated contribution of university researchers, software developers, geophysics experts, Non-Destructive Testing equipment designers and producers, and end users from private companies and public agencies. The main objective of the COST Action TU1208 is to exchange and increase scientific-technical knowledge and experience of GPR techniques, whilst promoting the effective use of this safe and non-destructive technique. In this interdisciplinary Action, advantages and limitations of GPR will be highlighted, leading to the identification of gaps in knowledge and technology. Protocols and guidelines for European Standards will be developed, for an effective use of GPR in various applications. A novel GPR will be designed and realized: a multi-static system, with dedicated software and calibration procedures, able to construct real-time lane three-dimensional high-resolution images of investigated areas. Advanced electromagnetic-scattering and data-processing techniques will be developed. Freeware software will be released, for inspection and monitoring of complex structures, buried-target localization, shape reconstruction and estimation of physical parameters. Particular interest will be devoted to the combined use of GPR, together with other advanced and non-invasive sensing techniques, for a multi-depth, multi-resolution and multi-scale monitoring of archaeological, architectural and artistic heritage (Working Group 4). Novel procedures and techniques will be developed and tested for the study and preservation of historical buildings, bridges, monuments, sculptures, paintings, frescoes, as well as for the mapping of sites and structures present in the subsoil. 
During the Action lifetime, a three-year high-level training program will be organized. Mobility of early career researchers will be encouraged. The scientific work-plan of the COST Action TU1208 is open, to ensure that experts all over the world who did not participate in the preparation of the proposal but are interested in the project may join the Action and participate in its activities. More information about the project can be found at http://www.cost.eu/domains_actions/tud/Actions/TU1208.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dreyer, Jonathan G.; Wang, Tzu-Fang; Vo, Duc T.
Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.
78 FR 63135 - Airworthiness Directives; The Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-23
... system. The NPRM also proposed to require replacing the operational program software of certain... condition would not be adequately addressed by the proposed action. The manufacturer has issued new service... software of certain indicating/recording systems. The NPRM was prompted by numerous operator reports of...
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.
Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel
2018-02-20
Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or gives less weight to, other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing-tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter-margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.
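The schedulability step in the framework above relies on a novel analysis specific to CPAL; purely as a generic illustration (a classical sufficient test, not the paper's analysis), the Liu and Layland rate-monotonic utilization bound can be sketched as follows, with hypothetical task parameters:

```python
def rm_utilization_bound(n):
    """Liu & Layland sufficient bound for n periodic tasks: n*(2^(1/n) - 1)."""
    return n * (2 ** (1.0 / n) - 1)

def schedulable_rm(tasks):
    """tasks: list of (wcet, period) pairs with implicit deadlines.
    Sufficient (but not necessary) rate-monotonic schedulability test."""
    utilization = sum(c / t for c, t in tasks)
    return utilization <= rm_utilization_bound(len(tasks))

# Hypothetical cruise-control-style periodic task set: (WCET, period) in ms.
tasks = [(1.0, 10.0), (2.0, 20.0), (3.0, 40.0)]
# Utilization 0.275 is below the n=3 bound of about 0.78, so the set passes.
```

A set that fails this test may still be schedulable; an exact response-time analysis would then be needed.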
A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints
Navet, Nicolas; Havet, Lionel
2018-01-01
Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he neglects, or gives less weight to, other engineering aspects (for instance, real-time software engineering or energy efficiency). This may cause some of the functional and non-functional requirements not to be met satisfactorily. In this work, we present a co-design framework based on a timing-tolerance contract to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter-margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification by monitoring the execution of the models on target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model-interpretation, which enforces timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489
System monitoring feedback in cinemas and harvesting energy of the air conditioning condenser
NASA Astrophysics Data System (ADS)
Pop, P. P.; Pop-Vadean, A.; Barz, C.; Latinovic, T.; Chiver, O.
2017-05-01
Our article monitors the degree of emotional involvement of cinema audiences in action films by measuring the concentration of CO2. The software processes the data obtained from dispersed sensors and displays them during the film. The software also triggers the start of the air-conditioning condenser, from which energy can be harvested by installing a piezoelectric device. Useful energy can be recovered from various waste streams produced in a cinema. The time lag between actions and changes in environmental systems means that decisions made now will affect subsequent generations and the future of our environment.
WTEC monograph on instrumentation, control and safety systems of Canadian nuclear facilities
NASA Technical Reports Server (NTRS)
Uhrig, Robert E.; Carter, Richard J.
1993-01-01
This report updates a 1989-90 survey of advanced instrumentation and controls (I&C) technologies and associated human factors issues in the U.S. and Canadian nuclear industries carried out by a team from Oak Ridge National Laboratory (Carter and Uhrig 1990). The authors found that the most advanced I&C systems are in the Canadian CANDU plants, where the newest plant (Darlington) has digital systems in almost 100 percent of its control systems and in over 70 percent of its plant protection system. Increased emphasis on human factors and cognitive science in modern control rooms has resulted in a reduced workload for the operators and the elimination of many human errors. Automation implemented through digital instrumentation and control is effectively changing the role of the operator to that of a systems manager. The hypothesis that properly introducing digital systems increases safety is supported by the Canadian experience. The performance of these digital systems has been achieved using appropriate quality assurance programs for both hardware and software development. Recent regulatory authority review of the development of safety-critical software has resulted in the creation of isolated software modules with well defined interfaces and more formal structure in the software generation. The ability of digital systems to detect impending failures and initiate a fail-safe action is a significant safety issue that should be of special interest to nuclear utilities and regulatory authorities around the world.
2013-01-01
Background We present a software tool called SENB, which allows the geometric and biophysical neuronal properties in a simple computational model of a Hodgkin-Huxley (HH) axon to be changed. The aim of this work is to develop a didactic and easy-to-use computational tool in the NEURON simulation environment, which allows graphical visualization of both the passive and active conduction parameters and the geometric characteristics of a cylindrical axon with HH properties. Results The SENB software offers several advantages for teaching and learning electrophysiology. First, SENB offers ease and flexibility in determining the number of stimuli. Second, SENB allows immediate and simultaneous visualization, in the same window and time frame, of the evolution of the electrophysiological variables. Third, SENB calculates parameters such as time and space constants, stimuli frequency, cellular area and volume, sodium and potassium equilibrium potentials, and propagation velocity of the action potentials. Furthermore, it allows the user to see all this information immediately in the main window. Finally, with just one click SENB can save an image of the main window as evidence. Conclusions The SENB software is didactic and versatile, and can be used to improve and facilitate the teaching and learning of the underlying mechanisms in the electrical activity of an axon using the biophysical properties of the squid giant axon. PMID:23675833
Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich
2017-04-01
Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply a close conceptual coupling, whereas software quality asks for decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR interface is used to highlight the techniques' benefits. This use-case is chosen as a prototypical example of complex architectures with multiple interacting subsystems found in many VR, AR and MR architectures. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
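The entity-component-system pattern that the article extends can be illustrated with a minimal sketch. The `World`, `attach`, and `query` names below are invented for illustration, and the type-based `query` is only a crude stand-in for the semantics-based queries the article proposes:

```python
class World:
    """Minimal entity-component-system (ECS) store: entities are plain ids,
    components are per-type tables keyed by entity id."""
    def __init__(self):
        self._next_id = 0
        self.components = {}          # component type name -> {entity: data}

    def create_entity(self):
        self._next_id += 1
        return self._next_id

    def attach(self, entity, ctype, data):
        self.components.setdefault(ctype, {})[entity] = data

    def query(self, *ctypes):
        """Entities holding all requested component types (a crude stand-in
        for the semantic queries described in the article)."""
        sets = [set(self.components.get(c, {})) for c in ctypes]
        return set.intersection(*sets) if sets else set()

world = World()
avatar = world.create_entity()
world.attach(avatar, "Position", (0.0, 0.0, 0.0))
world.attach(avatar, "Speech", {"grammar": "point-and-say"})
light = world.create_entity()
world.attach(light, "Position", (1.0, 2.0, 3.0))
```

A system (e.g., a speech-input subsystem) would iterate over `world.query("Position", "Speech")` each frame; the article's techniques aim to replace such hard-coded type lists with semantically grounded descriptions.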
Workflow in interventional radiology: nerve blocks and facet blocks
NASA Astrophysics Data System (ADS)
Siddoway, Donald; Ingeholm, Mary Lou; Burgert, Oliver; Neumuth, Thomas; Watson, Vance; Cleary, Kevin
2006-03-01
Workflow analysis has the potential to dramatically improve the efficiency and clinical outcomes of medical procedures. In this study, we recorded the workflow for nerve block and facet block procedures in the interventional radiology suite at Georgetown University Hospital in Washington, DC, USA. We employed a custom client/server software architecture developed by the Innovation Center for Computer Assisted Surgery (ICCAS) at the University of Leipzig, Germany. This software runs in an internet browser, and allows the user to record the actions taken by the physician during a procedure. The data recorded during the procedure are stored as an XML document, which can then be further processed. We have successfully gathered data on a number of cases using a tablet PC, and these preliminary results show the feasibility of using this software in an interventional radiology setting. We are currently accruing additional cases, and when more data have been collected we will analyze the workflow of these procedures to look for inefficiencies and potential improvements.
NASA Astrophysics Data System (ADS)
Patil, Sameer; Kobsa, Alfred; John, Ajita; Brotman, Lynne S.; Seligmann, Doree
To understand how collaborators reconcile the often conflicting needs of awareness and privacy, we studied a large software development project in a multinational corporation involving individuals at sites in the U.S. and India. We present a theoretical framework describing privacy management practices and their determinants that emerged from field visits, interviews, and questionnaire responses. The framework identifies five relevant situational characteristics: issue(s) under consideration, physical place(s) involved in interaction(s), temporal aspects, affordances and limitations presented by technology, and nature of relationships among parties. Each actor, in turn, interprets the situation based on several simultaneous influences: self, team, work site, organization, and cultural environment. This interpretation guides privacy management action(s). Past actions form a feedback loop refining and/or reinforcing the interpretive influences. The framework suggests that effective support for privacy management will require that designers follow a socio-technical approach incorporating a wider scope of situational and interpretive differences.
NASA Technical Reports Server (NTRS)
2002-01-01
Goddard Space Flight Center and Triangle Research & Development Corporation collaborated to create "Smart Eyes," a charge-coupled device camera that, for the first time, could read and measure bar codes without the use of lasers. The camera operated in conjunction with software and algorithms created by Goddard and Triangle R&D that could track bar code position and direction with speed and precision, as well as with software that could control robotic actions based on vision system input. This accomplishment was intended for robotic assembly of the International Space Station, helping NASA to increase production while using less manpower. After successfully completing the two-phase SBIR project with Goddard, Triangle R&D was awarded a separate contract from the U.S. Department of Transportation (DOT), which was interested in using the newly developed NASA camera technology to heighten automotive safety standards. In 1990, Triangle R&D and the DOT developed a mask made from a synthetic, plastic skin covering to measure facial lacerations resulting from automobile accidents. By pairing NASA's camera technology with Triangle R&D's and the DOT's newly developed mask, a system that could provide repeatable, computerized evaluations of laceration injury was born.
Process Improvement in a Radically Changing Organization
NASA Technical Reports Server (NTRS)
Varga, Denise M.; Wilson, Barbara M.
2007-01-01
This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.
Newman, Eric D; Lerch, Virginia; Billet, Jon; Berger, Andrea; Kirchner, H Lester
2015-04-01
Electronic health records (EHRs) are not optimized for chronic disease management. To improve the quality of care for patients with rheumatic disease, we developed electronic data capture, aggregation, display, and documentation software. The software integrated and reassembled information from the patient (via a touchscreen questionnaire), nurse, physician, and EHR into a series of actionable views. Core functions included trends over time, rheumatology-related demographics, and documentation for patient and provider. Quality measures collected included patient-reported outcomes, disease activity, and function. The software was tested and implemented in 3 rheumatology departments, and integrated into routine care delivery. Post-implementation evaluation measured adoption, efficiency, productivity, and patient perception. Over 2 years, 6,725 patients completed 19,786 touchscreen questionnaires. The software was adopted for use by 86% of patients and rheumatologists. Chart review and documentation time trended downward, and productivity increased by 26%. Patient satisfaction, activation, and adherence remained unchanged, although pre-implementation values were high. A strong correlation was seen between use of the software and disease control (weighted Pearson's correlation coefficient 0.5927, P = 0.0095), and a relative increase in patients with low disease activity of 3% per quarter was noted. We describe innovative software that aggregates, stores, and displays information vital to improving the quality of care for patients with chronic rheumatic disease. The software was well-adopted by patients and providers. Post-implementation, significant improvements in quality of care, efficiency of care, and productivity were demonstrated. Copyright © 2015 by the American College of Rheumatology.
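The weighted Pearson correlation reported above (0.5927 between software use and disease control) can be computed with a generic textbook formula; the sketch below is not the authors' analysis code, and the sample data are invented:

```python
import math

def weighted_pearson(x, y, w):
    """Weighted Pearson correlation coefficient (weights w >= 0):
    weighted covariance divided by the product of weighted std deviations."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y)) / sw
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)) / sw
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y)) / sw
    return cov / math.sqrt(vx * vy)

# With equal weights this reduces to the ordinary Pearson coefficient;
# a perfectly linear relationship yields exactly 1.0.
r = weighted_pearson([1, 2, 3, 4], [2, 4, 6, 8], [1, 1, 1, 1])
```

In a setting like the study's, the weights would typically reflect the number of visits or questionnaires contributed per patient.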
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-752] Certain Gaming and Entertainment Consoles, Related Software, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the...
1997-06-17
There is good and bad news with CMMs: the bad news is that process improvement takes time; the good news is that the first benefit is better schedule management. With PSP s... e.g. similar support; EURO not sudden death; toolset for assessment and detailed analysis. EURO could collapse (low risk)... benefits from SPI live on even after year 2000. Priority benefits and actions: improved management and application development processes; strengthened change
Optimal Planning and Problem-Solving
NASA Technical Reports Server (NTRS)
Clemet, Bradley; Schaffer, Steven; Rabideau, Gregg
2008-01-01
CTAEMS MDP Optimal Planner is problem-solving software designed to command a single spacecraft/rover, or a team of spacecraft/rovers, to perform the best action possible at all times according to an abstract model of the spacecraft/rover and its environment. It also may be useful in solving logistical problems encountered in commercial applications such as shipping and manufacturing. The planner reasons around uncertainty according to specified probabilities of outcomes, using a plan hierarchy to avoid exploring certain kinds of suboptimal actions. Also, planned actions are calculated as the state-action space is expanded, rather than afterward, to reduce by an order of magnitude the processing time and memory used. The software solves planning problems with actions that can execute concurrently, that have uncertain duration and quality, and that have functional dependencies on others that affect quality. These problems are modeled in a hierarchical planning language called C_TAEMS, a derivative of the TAEMS language for specifying domains for the DARPA Coordinators program. In realistic environments, actions often have uncertain outcomes and can have complex relationships with other tasks. The planner approaches problems by considering all possible actions that may be taken from any state reachable from a given, initial state, and from within the constraints of a given task hierarchy that specifies what tasks may be performed by which team member.
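The planner's reasoning over uncertain action outcomes is a Markov decision process computation. As a generic illustration only (not the C_TAEMS hierarchy, concurrency model, or the planner's pruning strategy), a small value-iteration sketch over a toy rover model:

```python
def value_iteration(states, actions, P, R, gamma=0.95, eps=1e-6):
    """Generic value iteration for a small MDP.
    P[s][a]: list of (next_state, probability); R[s][a]: immediate reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
                for a in actions[s]
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

# Toy rover MDP (invented): "drive" earns reward but risks a fault;
# "idle" is safe but unrewarding; a fault is absorbing.
states = ["ok", "fault"]
actions = {"ok": ["drive", "idle"], "fault": ["idle"]}
P = {"ok": {"drive": [("ok", 0.9), ("fault", 0.1)], "idle": [("ok", 1.0)]},
     "fault": {"idle": [("fault", 1.0)]}}
R = {"ok": {"drive": 1.0, "idle": 0.0}, "fault": {"idle": 0.0}}
V = value_iteration(states, actions, P, R)
```

The converged value of the healthy state is 1/(1 - 0.95*0.9), reflecting the expected discounted reward of driving until a fault occurs.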
The maturing of the quality improvement paradigm in the SEL
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1993-01-01
The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and products. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. The quality improvement paradigm, as currently defined, can be broken into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects, and save it in an experience base to be reused on future projects.
NASA Technical Reports Server (NTRS)
Pisanich, Greg; Ippolito, Corey; Plice, Laura; Young, Larry A.; Lau, Benton
2003-01-01
This paper details the development and demonstration of an autonomous aerial vehicle embodying search-and-find mission planning and execution strategies inspired by foraging behaviors found in biology. It begins by describing key characteristics required by an aerial explorer to support science and planetary exploration goals, and illustrates these through a hypothetical mission profile. It next outlines a conceptual bio-inspired search-and-find autonomy architecture that implements observations, decisions, and actions through an "ecology" of producer, consumer, and decomposer agents. Moving from concepts to development activities, it then presents the results of mission-representative UAV aerial surveys at a Mars analog site. It next describes hardware and software enhancements made to a commercial small fixed-wing UAV system, which include a new development architecture that also provides hardware-in-the-loop simulation capability. After presenting the results of simulated and actual flights of bio-inspired flight algorithms, it concludes with a discussion of future development to include an expansion of system capabilities and field science support.
Model for Simulating a Spiral Software-Development Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Curley, Charles; Nayak, Umanath
2010-01-01
A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). 
Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
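The spiral simulation described above can be caricatured in a few lines: run the waterfall phase sequence once per iteration, injecting defects during coding and detecting a fraction of the backlog during testing. The phase list, defect rates, and effort draws below are invented illustrative values, not the PATT model's calibrated inputs:

```python
import random

PHASES = ["requirements", "design", "coding", "testing", "evaluation"]

def simulate_spiral(iterations, loc_per_iter, inject_per_kloc, detect_rate, seed=1):
    """Toy spiral-process simulation. Each iteration repeats the waterfall
    phases; defects accumulate in a backlog and a fraction is found per pass."""
    rng = random.Random(seed)
    backlog, detected, effort = 0, 0, 0.0
    for _ in range(iterations):
        for phase in PHASES:
            effort += rng.uniform(0.8, 1.2)      # nominal effort per phase
            if phase == "coding":
                backlog += int(loc_per_iter / 1000 * inject_per_kloc)
            elif phase == "testing":
                found = int(backlog * detect_rate)
                detected += found
                backlog -= found
    return {"detected": detected, "escaped": backlog, "effort": effort}

result = simulate_spiral(iterations=4, loc_per_iter=2000,
                         inject_per_kloc=10, detect_rate=0.7)
```

Even this caricature shows the spiral's key property: defects injected in one iteration can be caught in a later one, so escaped-defect counts fall as iterations are added.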
Operator function modeling: An approach to cognitive task analysis in supervisory control systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1987-01-01
In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).
NASA Astrophysics Data System (ADS)
Tošić, Saša; Mitrović, Dejan; Ivanović, Mirjana
2013-10-01
Agent-oriented programming languages are designed to simplify the development of software agents, especially those that exhibit complex, intelligent behavior. This paper presents recent improvements of AgScala, an agent-oriented programming language based on Scala. AgScala includes declarative constructs for managing beliefs, actions and goals of intelligent agents. Combined with object-oriented and functional programming paradigms offered by Scala, it aims to be an efficient framework for developing both purely reactive, and more complex, deliberate agents. Instead of the Prolog back-end used initially, the new version of AgScala relies on Agent Planning Package, a more advanced system for automated planning and reasoning.
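AgScala's declarative beliefs, goals, and actions follow the belief-desire-intention (BDI) style. Purely as an illustration (written in Python rather than Scala for brevity, and not AgScala's actual syntax, semantics, or planner), a toy belief-goal-action loop might look like:

```python
class Agent:
    """Toy deliberative agent: beliefs, goals, and actions chosen by a
    simple means-end loop. Illustrative of the BDI style AgScala targets,
    not AgScala itself."""
    def __init__(self):
        self.beliefs = set()
        self.goals = []
        self.actions = {}             # name -> (preconditions, effects)

    def add_action(self, name, pre, eff):
        self.actions[name] = (set(pre), set(eff))

    def step(self):
        """Achieve the first open goal with an applicable action, if any;
        returns the chosen action's name or None."""
        for goal in self.goals:
            if goal in self.beliefs:
                continue
            for name, (pre, eff) in self.actions.items():
                if pre <= self.beliefs and goal in eff:
                    self.beliefs |= eff
                    return name
        return None

agent = Agent()
agent.beliefs = {"at_home"}
agent.goals = ["has_coffee"]
agent.add_action("brew", pre=["at_home"], eff=["has_coffee"])
chosen = agent.step()
```

A planner back-end such as the Agent Planning Package mentioned above would replace this one-step lookup with multi-step plan synthesis over the same belief/goal/action structures.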
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... Documents Access and Management System (ADAMS): You may access publicly available documents online in the... Management Plans for Digital Computer Software used in Safety Systems of Nuclear Power Plants,'' issued for... Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION: Revision...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-25
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-750] Certain Mobile Devices and Related Software; Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the presiding administrative law judge has issued a Final...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-30
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-750] Certain Mobile Devices and Related Software Corrected Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Correction to Notice. SUMMARY: This Notice corrects the notice in the same matter...
Scardovelli, Terigi Augusto; Frère, Annie France
2015-01-01
Many children with motor impairments cannot participate in the games and play activities that contribute to their development. Among current commercial computer games, there are few software options and sufficiently flexible access devices that meet the needs of this group of children. In this study, a peripheral access device and a 3D computerized game that do not require dragging, clicking, or activating several keys at the same time were developed. The peripheral access device consists of a webcam and a supervisory system that processes the images. This method provides a field of action that can be adjusted to various types of motor impairments. To analyze the sensitivity of the commands, a virtual course was developed using the scenario of a path of straight lines and curves. A volunteer with good ability in virtual games performed a short training with the virtual course and, after 15 min of training, obtained similar results with a standard keyboard and the adapted peripheral device. A 3D game set in the Amazon forest was developed using the Blender 3D tool; this free software was used to model the characters and scenarios. To evaluate the usability of the 3D game, it was tested by 20 volunteers without motor impairments (group A) and 13 volunteers with severe motor limitations of the upper limbs (group B). All the volunteers (groups A and B) could easily execute all the actions of the game using the adapted peripheral device. The majority evaluated the usability questions positively and expressed their satisfaction. The computerized game coupled to the adapted device will offer an option of leisure and learning to people with severe motor impairments who previously lacked this possibility. It also provided equality in this activity to all users. Copyright © 2014. Published by Elsevier Ireland Ltd.
Chou, Ting-Chao
2006-09-01
The median-effect equation, derived from the mass-action law principle at equilibrium or steady state via mathematical induction and deduction for different reaction sequences, mechanisms, and types of inhibition, has been shown to be the unified theory underlying the Michaelis-Menten, Hill, Henderson-Hasselbalch, and Scatchard equations. It is shown that dose and effect are interchangeable via defined parameters. This general equation for the single-drug effect has been extended to a multiple-drug effect equation for n drugs. These equations provide the theoretical basis for the combination index (CI)-isobologram equation, which allows quantitative determination of drug interactions, where CI < 1, = 1, and > 1 indicate synergism, additive effect, and antagonism, respectively. Based on these algorithms, computer software has been developed to allow automated simulation of synergism and antagonism at all dose or effect levels. It displays the dose-effect curve, median-effect plot, combination index plot, isobologram, dose-reduction index plot, and polygonogram for in vitro or in vivo studies. This theoretical development, experimental design, and computerized data analysis have facilitated dose-effect analysis for single-drug evaluation or carcinogen and radiation risk assessment, as well as for combinations of drugs or other entities across a vast range of biomedical disciplines. In this review, selected examples of applications are given, and step-by-step examples of experimental designs and real data analysis are also illustrated. The merging of the mass-action law principle with mathematical induction-deduction has proven to be a unique and effective scientific method for general theory development.
The median-effect principle and its mass-action law based computer software are gaining increased applications in biomedical sciences, from how to effectively evaluate a single compound or entity to how to beneficially use multiple drugs or modalities in combination therapies.
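As a rough illustration of the equations above: the median-effect equation fa/fu = (D/Dm)^m (fa = fraction affected, fu = 1 - fa, Dm = median-effect dose, m = sigmoidicity) can be solved for the dose Dx producing a given effect, and the two-drug combination index follows from the ratios of actual to required doses. The parameter values in the sketch are hypothetical.

```python
# Median-effect equation and combination index (CI) for two drugs.
# CI < 1 indicates synergism, CI = 1 an additive effect, CI > 1 antagonism.

def dose_for_effect(fa, Dm, m):
    """Solve fa/fu = (D/Dm)^m for D, with fu = 1 - fa."""
    return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(d1, d2, fa, Dm1, m1, Dm2, m2):
    """CI = d1/Dx1 + d2/Dx2 at effect level fa, where Dx_i is the
    dose of drug i alone that would produce effect fa."""
    return (d1 / dose_for_effect(fa, Dm1, m1)
            + d2 / dose_for_effect(fa, Dm2, m2))

# At fa = 0.5 the required dose equals Dm by definition.
half_effect_dose = dose_for_effect(0.5, 10.0, 1.5)
ci = combination_index(5.0, 5.0, 0.5, 10.0, 1.0, 10.0, 1.0)
```

With each drug given at half its median-effect dose and fa = 0.5, the CI comes out to exactly 1.0, i.e. an additive combination.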
Software Defined Cyberinfrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, Ian; Blaiszik, Ben; Chard, Kyle
Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
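A minimal sketch of the if-trigger-then-action (IFTA) idea, assuming a toy rule format of my own invention (the paper's actual notation is not reproduced here): each rule pairs a trigger pattern on storage events with an action to run.

```python
# Toy IFTA rule engine: rules are (trigger, action) pairs, and an
# incoming storage event fires every rule whose trigger it matches.

rules = []

def on(trigger):
    """Decorator registering an action for events matching `trigger`."""
    def register(action):
        rules.append((trigger, action))
        return action
    return register

def dispatch(event):
    """Run all actions whose trigger fields match the event."""
    fired = []
    for trigger, action in rules:
        if all(event.get(k) == v for k, v in trigger.items()):
            fired.append(action(event))
    return fired

@on({"type": "created", "suffix": ".csv"})
def index_file(event):
    # Placeholder for an indexing/characterization workflow.
    return f"indexed {event['path']}"

results = dispatch({"type": "created", "suffix": ".csv", "path": "run1.csv"})
```

In the paper's setting, `dispatch` would be driven by file-creation events on a local storage system rather than called directly.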
Programming methodology for a general purpose automation controller
NASA Technical Reports Server (NTRS)
Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.
1987-01-01
The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
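The idea of configuring function blocks (low-level computations) into a process using data-flow principles can be sketched as follows; the blocks and the wiring are simplified illustrations, not the controller's actual programming model.

```python
# Function blocks are plain computations; a process chains them so that
# each block's output becomes the next block's input (data flow).

def scale(x, gain=2.0):
    """Function block: multiply the input by a gain."""
    return x * gain

def offset(x, bias=1.0):
    """Function block: add a bias to the input."""
    return x + bias

def make_process(*blocks):
    """Configure function blocks into a single data-flow process."""
    def process(value):
        for block in blocks:
            value = block(value)
        return value
    return process

# A "verb" in the methodology's sense would activate such a process,
# e.g. as part of a joint-servo loop.
servo_process = make_process(scale, offset)
result = servo_process(3.0)   # (3.0 * 2.0) + 1.0
```

Because the blocks are independent of each other and of the hardware, new devices can be supported by reconfiguring existing blocks, which is the component-reuse goal stated above.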
NASA Astrophysics Data System (ADS)
Celicourt, P.; Sam, R.; Piasecki, M.
2016-12-01
Global phenomena such as climate change and large-scale environmental degradation require the collection of accurate environmental data at detailed spatial and temporal scales, from which knowledge and actionable insights can be derived using data science methods. Despite significant advances in sensor network technologies, sensor and sensor-network deployment remains a labor-intensive, time-consuming, cumbersome, and expensive task. These factors explain why environmental data collection remains a challenge, especially in developing countries where technical infrastructure, expertise, and pecuniary resources are scarce, and why dense, long-term environmental data collection has historically been quite difficult. Moreover, hydrometeorological data collection efforts usually overlook the critically important inclusion of a standards-based system for storing, managing, organizing, indexing, documenting, and sharing sensor data. We are developing a cross-platform software framework in the Python programming language that will allow us to build a low-cost, end-to-end (from sensor to publication) system for monitoring hydrometeorological conditions. The framework provides for describing sensors, sensor platforms, calibration, and network protocols, as well as for sensor programming, data storage, data publication and visualization, and, importantly, data retrieval in a desired unit system. It is being tested on the Raspberry Pi microcomputer as the end node and a laptop PC as the base station in a wireless setting.
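Retrieval "in a desired unit system" could look roughly like the sketch below; the conversion table and function names are assumptions for illustration, not the framework's API.

```python
# Unit-aware retrieval: readings are stored in one unit and converted
# on the way out. The conversion table here is deliberately tiny.

CONVERSIONS = {
    ("degC", "degF"): lambda c: c * 9.0 / 5.0 + 32.0,
    ("mm", "inch"): lambda mm: mm / 25.4,
}

def retrieve(readings, stored_unit, desired_unit):
    """Return readings converted from stored_unit to desired_unit."""
    if stored_unit == desired_unit:
        return list(readings)
    convert = CONVERSIONS[(stored_unit, desired_unit)]
    return [convert(v) for v in readings]

# Temperatures logged in Celsius, requested in Fahrenheit.
fahrenheit = retrieve([0.0, 25.0], "degC", "degF")
```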
Adding intelligent services to an object oriented system
NASA Technical Reports Server (NTRS)
Robideaux, Bret R.; Metzler, Theodore A.
1994-01-01
As today's software becomes increasingly complex, the need grows for intelligence of one sort or another to become part of the application, often an intelligence that does not readily fit the paradigm of one's software development. There are many methods of developing software, but at this time the most promising is the object oriented (OO) method. This method involves an analysis to abstract the problem into separate 'objects' that are unique in the data that describe them and the behavior they exhibit, and eventually to convert this analysis into computer code using a programming language that was designed (or retrofitted) for OO implementation. This paper discusses the creation of three different applications that are analyzed, designed, and programmed using the Shlaer/Mellor method of OO development and C++ as the programming language. All three, however, require the use of an expert system to provide an intelligence that C++ (or any other 'traditional' language) is not directly suited to supply. The flexibility of CLIPS permitted us to make modifications to it that allow seamless integration with any of our applications that require an expert system. We illustrate this integration with the following applications: (1) an after action review (AAR) station that assists a reviewer in watching a simulated tank battle and developing an AAR to critique the performance of the participants in the battle; (2) an embedded training system and over-the-shoulder coach for howitzer crewmen; and (3) a system to identify various chemical compounds from their infrared absorption spectra.
That's Infotainment!: How to Create Your Own Screencasts
ERIC Educational Resources Information Center
Kroski, Ellyssa
2009-01-01
Screencasts are videos that record the actions that take place on the computer screen, most often including a narrative audio track, in order to demonstrate various computer-related tasks, such as how to use a software program or navigate a certain Web site. All that is needed is a standard microphone and screen recording software, which can be…
ERIC Educational Resources Information Center
Sosin, Adrienne
This action research study of electronic conferencing highlights the online portions of teacher education courses at Pace University, New York. The study explores the infusion of technology into teaching and investigates the utility of a particular type of discussion software for learning. Data sources include texts of electronic conversations,…
Use of PharmaCALogy Software in a PBL Programme to Teach Nurse Prescribing
ERIC Educational Resources Information Center
Coleman, Iain P. L.; Watts, Adam S.
2007-01-01
Pharmacology is taught on a dedicated module for nurse prescribers who have a limited physical science background. To facilitate learning a problem-based approach was adopted. However, to enhance students' knowledge of drug action a PharmaCALogy software package from the British Pharmacological Society was used. Students were alternately given a…
Conducting a Trial of Web Conferencing Software: Why, How, and Perceptions from the Coalface
ERIC Educational Resources Information Center
Reushle, Shirley; Loch, Birgit
2008-01-01
This paper reports on the trial of web conferencing software conducted at a regional Australian university with a significant distance population. The paper shares preliminary findings, the views of participants and recommendations for future activity. To design and conduct the trial, an action research method was chosen because it is…
A physical action potential generator: design, implementation and evaluation.
Latorre, Malcolm A; Chan, Adrian D C; Wårdell, Karin
2015-01-01
The objective was to develop a physical action potential generator (Paxon) able to generate a stable, repeatable, programmable, and physiologically realistic action potential. The Paxon has the equivalent of 40 nodes of Ranvier, mimicked using resin-embedded gold wires (Ø = 20 μm). These nodes were software-controlled, and the action potentials were initiated by a start trigger. Clinically used Ag-AgCl electrodes were coupled to the Paxon for functional testing. The Paxon's action potential parameters were tunable using a second-order mathematical equation to generate physiologically relevant output, accomplished by varying the number of nodes involved (1-40 in incremental steps of 1) and the node drive potential (0-2.8 V in 0.7 mV steps), while keeping a fixed inter-nodal timing and test electrode configuration. A system noise floor of 0.07 ± 0.01 μV was calculated over 50 runs. A differential test electrode recorded a peak positive amplitude of 1.5 ± 0.05 mV (gain of 40x) at time 196.4 ± 0.06 ms, including a post-trigger delay. The Paxon's programmable, action-potential-like signal makes it a candidate validation test platform for medical surface electrodes and the systems attached to them.
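The published tuning ranges (1-40 nodes in integer steps; drive potential 0-2.8 V in 0.7 mV steps) can be checked with a small validation helper. This is an illustration of the parameter space only, not the Paxon's control software.

```python
# Validate a (node count, drive potential) pair against the published
# Paxon tuning ranges. Drive potential is handled in millivolts:
# 0-2800 mV in 0.7 mV steps.

STEP_MV = 0.7

def valid_settings(n_nodes, drive_mv):
    """True if n_nodes is an integer in 1..40 and drive_mv lies on the
    0.7 mV grid within 0..2800 mV (within floating-point tolerance)."""
    nodes_ok = 1 <= n_nodes <= 40 and n_nodes == int(n_nodes)
    steps = round(drive_mv / STEP_MV)
    drive_ok = (0.0 <= drive_mv <= 2800.0
                and abs(steps * STEP_MV - drive_mv) < 1e-9)
    return nodes_ok and drive_ok
```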
NASA Technical Reports Server (NTRS)
Woodward, Hugh
2005-01-01
Remote meetings are best for updates and information sharing, but it is possible to effectively facilitate decisions with a little planning. Generally, the meeting leader needs to clearly state the proposed decision and then separately poll each participant for concurrence. Normally, there will be a range of responses, requiring the facilitator to restate the proposal and repeat the process. Several iterations may be required before a consensus is achieved. I usually confirm decisions by restating the conclusion as it will appear in the meeting notes and asking the participants to express any objections. Gaining commitment to follow-up actions is never easy, of course, but tends to be particularly tricky in remote meetings. The ideal solution is to use collaboration software with a whiteboard as a means of recording the follow-up actions and responsibilities. (A Word or Excel document viewed through NetMeeting works equally well.) But if the meeting is being conducted without collaboration software, the leader must review each follow-up action explicitly, even painstakingly. I generally note follow-up actions throughout the meeting and use the last few minutes to confirm and finalize. I read each action and name the person I think owns the responsibility. When the person accepts, I validate by asking for a completion date. All the normal rules for assigning follow-up actions apply, of course. One, and only one, person must be responsible for each action, and assigning an action to somebody not present is akin to assigning it to nobody.
García de Diego, Laura; Cuervo, Marta; Martínez, J. Alfredo
2015-01-01
Computer assisted instruction (CAI) is an effective tool for evaluating and training students and professionals. In this article we present a learning-oriented CAI, developed for students and health professionals to acquire and retain new knowledge through practice. A two-phase pilot evaluation was conducted, involving 8 nutrition experts and 30 postgraduate students, respectively. In each training session, the software guides users through the integral evaluation of a patient's nutritional status and helps them to implement actions. The program incorporates clinical tools that can be used to recognize a patient's possible needs, improve clinical reasoning, and develop professional skills. Among them are assessment questionnaires and evaluation criteria, cardiovascular risk charts, clinical guidelines, and photographs of various diseases. This CAI is a complete, easy-to-use, and versatile software package, aimed at clinical specialists, medical staff, scientists, educators, and clinical students, which can be used as a learning tool. The application constitutes an advanced method for students and health professionals to perform nutritional assessments combining theoretical and empirical issues, and it can be incorporated into their academic curriculum. PMID:25978456
Development of a remote control console for the HHIRF 25-MV tandem accelerator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasanul Basher, A.M.
1991-09-01
The CAMAC-based control system for the 25-MV Tandem Accelerator at HHIRF uses two Perkin-Elmer 32-bit minicomputers: a message-switching computer and a supervisory computer. Two operator consoles are located on one of the six serial highways. Operator control is provided by means of a console CRT, trackball, and assignable shaft encoders and meters. The message-switching computer transmits and receives control information on the serial highways. At present, the CRT pages with updated parameters can be displayed, and parameters can be controlled, only from the two existing consoles, one in the Tandem control room and the other in the ORIC control room. It has become necessary to expand the control capability to several other locations in the building. With the expansion of control and monitoring capability of accelerator parameters to other locations, operators will be able to control and observe the result of a control action at the same time. Since the new control console will be PC-based, the existing page format will be changed. The PC will communicate with the Perkin-Elmer through RS-232 and a communication software package. The hardware configuration has been established, and a communication software program that reads the pages from shared memory has been developed. In this paper, we present the implementation strategy, work completed, the existing and new page formats, future action plans, an explanation of pages and the use of related global variables, a sample session, and flowcharts.
NASA Technical Reports Server (NTRS)
2001-01-01
Qualtech Systems, Inc. developed a complete software system with capabilities of multisignal modeling, diagnostic analysis, run-time diagnostic operations, and intelligent interactive reasoners. Commercially available as the TEAMS (Testability Engineering and Maintenance System) tool set, the software can be used to reveal unanticipated system failures. The TEAMS software package is broken down into four companion tools: TEAMS-RT, TEAMATE, TEAMS-KB, and TEAMS-RDS. TEAMS-RT identifies good, bad, and suspect components in the system in real-time. It reports system health results from onboard tests, and detects and isolates failures within the system, allowing for rapid fault isolation. TEAMATE takes over from where TEAMS-RT left off by intelligently guiding the maintenance technician through the troubleshooting procedure, repair actions, and operational checkout. TEAMS-KB serves as a model management and collection tool. TEAMS-RDS (TEAMS-Remote Diagnostic Server) has the ability to continuously assess a system and isolate any failure in that system or its components, in real time. RDS incorporates TEAMS-RT, TEAMATE, and TEAMS-KB in a large-scale server architecture capable of providing advanced diagnostic and maintenance functions over a network, such as the Internet, with a web browser user interface.
An Investigation of Tool Mediation in the Research Activity of Eighth-Grade Students
ERIC Educational Resources Information Center
Henry, Nancy L.
2016-01-01
Technology and a variety of resources play an important role in students' educational lives. Vygotsky's (1987) theory of tool mediation suggests that cultural tools, such as computer software influence individuals' thinking and action. However, it is not completely understood how technology and other resources influence student action. Middle…
Measuring Cyber Operations Effectiveness
2014-11-01
are advanced firewalls capable of taking limited action to block malicious traffic or hacking attempts. Their capabilities vary widely and must be...using many automated tools, included in the defense hardware and software itself. These devices include hardware and software firewalls, Network...DoD networks are probed millions of times per day…the Air Force blocks roughly two billion threats and denies two million emails each week
Ren, Jun-Guo; Wang, Dong-Zhi; Lei, Lei; Kang, Li; Liu, Jian-Xun
2017-05-01
To find the relationship between the traditional efficacy of Chinese medicine and modern pharmacological action by using data mining, and to provide information and reference for further research and development in the pharmacology of traditional Chinese medicine. The information on 547 kinds of traditional Chinese medicines, 335 kinds of Chinese medicine effects, and 86 kinds of pharmacological actions was collected and processed from the Clinical Guide to the Chinese Pharmacopoeia published in 2010; Access and Excel software were used to analyze the frequence (count) and frequency of single effects, of pharmacological actions, and of both together. In addition, the relationship between efficacy and pharmacology was analyzed, with the clearing-heat and antibacterial effects as the example. The analysis showed that the 547 kinds of Chinese medicines involved 335 kinds of Chinese medicine effects and 86 kinds of pharmacological actions. Among them, the most frequent Chinese medicine effect was "clearing heat", with a frequence of 130 and a frequency of 0.24; the most frequent pharmacological action was "anti-inflammatory action", with a frequence of 191 and a frequency of 0.35. The most common efficacy-pharmacological action pair was "clearing heat" with "anti-bacterial action", with a frequence of 75 and a frequency of 0.26. The pair "purgation" and "cathartic effect" had the largest frequency, 0.30, but appeared together only 3 times. There were 52 kinds of pharmacological actions that occurred together with clearing heat, of which the top 10 were anti-bacterial, anti-inflammatory, antineoplastic, anti-hepatic injury, immunoregulation, antipyretic, antiviral, hypoglycemic, antioxidant, and analgesic actions.
There were 161 kinds of Chinese medicine effects that occurred together with anti-bacterial action, of which, the top 10 were clearing heat, detoxification, detumescence, analgesia, resolving dampness, pesticide, cooling blood, expelling wind, eliminating dampness and hemostasis. These results suggested that there was a certain relationship between traditional Chinese medicine effects and modern pharmacological actions. Copyright© by the Chinese Pharmaceutical Association.
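The frequence/frequency tallies described above amount to co-occurrence counting over effect-action pairs, which can be sketched as follows; the miniature dataset is invented for illustration and is not the Pharmacopoeia data.

```python
# Count how often each (effect, pharmacological action) pair co-occurs
# across a set of medicines, then report the most common pair.

from collections import Counter
from itertools import product

herbs = [
    {"effects": ["clearing heat"],
     "actions": ["anti-bacterial", "anti-inflammatory"]},
    {"effects": ["clearing heat", "detoxification"],
     "actions": ["anti-bacterial"]},
    {"effects": ["purgation"], "actions": ["cathartic"]},
]

pair_counts = Counter()
for herb in herbs:
    for effect, action in product(herb["effects"], herb["actions"]):
        pair_counts[(effect, action)] += 1

top_pair, top_count = pair_counts.most_common(1)[0]
```

The "frequency" reported in the study would then be the pair count divided by the number of medicines exhibiting the effect.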
Space Flight Software Development Software for Intelligent System Health Management
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Crumbley, Tim
2004-01-01
The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.
Beyond Wiki to Judgewiki for Transparent Climate Change Decisions
NASA Astrophysics Data System (ADS)
Capron, M. E.
2008-12-01
Climate Change is like the prisoner's dilemma, a zero-sum game, or cheating in sports. Everyone and every country is tempted to selfishly maintain or advance their standard of living. The tremendous difference between standards of living amplifies the desire to opt out of Climate Change solutions adverse to economic competitiveness. Climate Change is also exceedingly complex. No one person, one organization, one country, or partial collection of countries has the capacity and the global support needed to make decisions on Climate Change solutions. There are thousands of potential actions, tens of thousands of known and unknown environmental and economic impacts. Some actions are belatedly found to be unsustainable beyond token volumes, corn ethanol or soy-biodiesel for example. Mankind can address human nature and complexity with a globally transparent information and decision process available to all 7 billion of us. We need a process that builds trust and simplifies complexity. Fortunately, we have the Internet for trust building communication and computers to simplify complexity. Mankind can produce new software tailored to the challenge. We would combine group information collection software (a wiki) with a decision-matrix (a judge), market forecasting, and video games to produce the tool mankind needs for trust building transparent decisions on Climate Change actions. The resulting software would be a judgewiki.
Staudacher, Ingo; Nalpathamkalam, Asha Roy; Uhlmann, Lorenz; Illg, Claudius; Seehausen, Sebastian; Akhavanpoor, Mohammadreza; Buchauer, Anke; Geis, Nicolas; Lugenbiel, Patrick; Schweizer, Patrick A; Xynogalos, Panagiotis; Zylla, Maura M; Scholz, Eberhard; Zitron, Edgar; Katus, Hugo A; Thomas, Dierk
2017-10-11
Increasing numbers of patients with cardiovascular implantable electronic devices (CIEDs) and limited follow-up capacities highlight unmet challenges in clinical electrophysiology. Integrated software (MediConnect®) enabling fully digital processing of device interrogation data has been commercially developed to facilitate follow-up visits. We sought to assess the feasibility of fully digital data processing (FDDP) during ambulatory device follow-up in a high-volume tertiary hospital and to provide guidance for future users of FDDP software. A total of 391 patients (mean age, 70 years) presenting to the outpatient department for routine device follow-up were analyzed (pacemaker, 44%; implantable cardioverter defibrillator, 39%; cardiac resynchronization therapy device, 16%). Quality of data transfer and follow-up duration were compared between digital (n = 265) and manual processing of device data (n = 126). Digital data import was successful, complete, and correct in 82% of cases when early software versions were used. With the most recent software version, the rate of successful digital data import increased to 100%. Software-based import of interrogation data was complete and without failure in 97% of cases. The mean duration of a follow-up visit did not differ between the two groups (digital 18.7 min vs. manual data transfer 18.2 min). FDDP software was successfully implemented into the ambulatory follow-up of patients with implanted pacemakers and defibrillators. Digital data import into electronic patient management software was feasible and supported the physician's workflow. The total duration of follow-up visits, comprising technical device interrogation and clinical actions, was not affected in the present tertiary-center outpatient cohort.
Irreducible Tests for Space Mission Sequencing Software
NASA Technical Reports Server (NTRS)
Ferguson, Lisa
2012-01-01
As missions extend further into space, the modeling and simulation of their every action and instruction becomes critical. The greater the distance between Earth and the spacecraft, the smaller the window for communication becomes. Therefore, through modeling and simulating the planned operations, the most efficient sequence of commands can be sent to the spacecraft. The Space Mission Sequencing Software is being developed as the next generation of sequencing software to ensure the most efficient communication to interplanetary and deep space mission spacecraft. Aside from efficiency, the software also checks that communication during a specified time is even possible, meaning that there is no planet or moon preventing reception of a signal from Earth and that two opposing commands are not being given simultaneously. In this way, the software not only models the proposed instructions to the spacecraft, but validates the commands as well. To ensure that all spacecraft communications are sequenced properly, a timeline is used to structure the data. The created timelines are immutable: once data is assigned to a timeline, it shall never be deleted nor renamed. This prevents the need for storing and filing the timelines for use by other programs. Several types of timelines can be created to accommodate different types of communications (activities, measurements, commands, states, events). Each of these timeline types requires specific parameters, and all have options for additional parameters if needed. With so many combinations of parameters available, the robustness and stability of the software is a necessity. Therefore a baseline must be established to ensure the full functionality of the software, and it is here that the irreducible tests come into use.
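The immutability rule described above (once data is assigned to a timeline, it is never deleted or renamed) can be sketched as an append-only structure; the class is an illustration, not the mission software's API.

```python
# Append-only timeline: values may be assigned at new time points but
# never overwritten or removed, and the timeline's name never changes.

class Timeline:
    def __init__(self, name):
        self.name = name
        self._entries = {}   # time -> value, append-only by policy

    def assign(self, time, value):
        """Assign a value at a time point; re-assignment is forbidden."""
        if time in self._entries:
            raise ValueError(f"{self.name}: t={time} already assigned")
        self._entries[time] = value

    def at(self, time):
        return self._entries[time]

cmd = Timeline("commands")
cmd.assign(10.0, "POINT_ANTENNA")
try:
    cmd.assign(10.0, "SAFE_MODE")   # conflicting command is rejected
    overwritten = True
except ValueError:
    overwritten = False
```

Rejecting the second assignment at t = 10.0 is one way the sequencing software's "two opposing commands" check could surface.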
Novel fully integrated computer system for custom footwear: from 3D digitization to manufacturing
NASA Astrophysics Data System (ADS)
Houle, Pascal-Simon; Beaulieu, Eric; Liu, Zhaoheng
1998-03-01
This paper presents a recently developed custom footwear system that integrates 3D digitization technology, range-image fusion techniques, a 3D graphical environment for corrective actions, parametric curved-surface representation, and computer numerical control (CNC) machining. In this system, a support designed with the help of biomechanics experts stabilizes the foot in a correct, neutral position. The foot surface is then captured by a 3D camera using active ranging techniques. Software using a library of documented foot pathologies suggests corrective actions on the orthosis. Three kinds of deformations can be applied. The first method maps pad surfaces previously scanned by our 3D scanner onto the foot surface to locally modify its shape. The second is the construction of B-Spline surfaces, by manipulating control points and modifying knot vectors in a 3D graphical environment, to build the desired deformation. The last is a manual electronic 3D pen, which may take different shapes and sizes and has an adjustable 'pressure' setting. All applied deformations must respect G1 surface continuity, which ensures that the surface can accommodate a foot. Once the surface modification process is completed, the resulting data is sent to manufacturing software for CNC machining.
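The local effect of moving a B-Spline control point, which underlies the editor's second deformation method, is easy to see in one dimension with a uniform cubic B-spline segment; this generic sketch is not the system's surface code.

```python
# Evaluate one uniform cubic B-spline segment from four control values.
# Moving the last control point (p3) changes the curve only where its
# basis function t^3/6 is nonzero, so the segment start is unaffected.

def cubic_bspline_point(t, p0, p1, p2, p3):
    """Uniform cubic B-spline segment at parameter t in [0, 1]."""
    return (
        (1 - t) ** 3 * p0
        + (3 * t**3 - 6 * t**2 + 4) * p1
        + (-3 * t**3 + 3 * t**2 + 3 * t + 1) * p2
        + t**3 * p3
    ) / 6.0

before = cubic_bspline_point(0.0, 0.0, 1.0, 2.0, 3.0)
after = cubic_bspline_point(0.0, 0.0, 1.0, 2.0, 9.0)  # move p3 only
```

The same locality is what lets the orthosis designer deform one region of the foot surface without disturbing the rest, while the shared basis functions preserve smoothness across segment boundaries.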
Autonomic Computing for Spacecraft Ground Systems
NASA Technical Reports Server (NTRS)
Li, Zhenping; Savkli, Cetin; Jones, Lori
2007-01-01
Autonomic computing for spacecraft ground systems increases system reliability and reduces the cost of spacecraft operations and software maintenance. In this paper, we present an autonomic computing solution for spacecraft ground systems at NASA Goddard Space Flight Center (GSFC), which consists of an open standard for a message-oriented architecture, the GMSEC (Goddard Mission Services Evolution Center) architecture, and an autonomic computing tool, the Criteria Action Table (CAT). This solution has been used in many upgraded ground systems for NASA's missions, and provides a framework for developing solutions with higher autonomic maturity.
Automated Diagnosis Of Conditions In A Plant-Growth Chamber
NASA Technical Reports Server (NTRS)
Clinger, Barry R.; Damiano, Alfred L.
1995-01-01
Biomass Production Chamber Operations Assistant software and hardware constitute expert system that diagnoses mechanical failures in controlled-environment hydroponic plant-growth chamber and recommends corrective actions to be taken by technicians. Subjects of continuing research directed toward development of highly automated closed life-support systems aboard spacecraft to process animal (including human) and plant wastes into food and oxygen. Uses Microsoft Windows interface to give technicians intuitive, efficient access to critical data. In diagnostic mode, system prompts technician for information. When expert system has enough information, it generates recovery plan.
Assessment & Commitment Tracking System (ACTS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryant, Robert A.; Childs, Teresa A.; Miller, Michael A.
2004-12-20
The ACTS computer code provides a centralized tool for planning and scheduling assessments, tracking and managing actions associated with assessments or that result from an event or condition, and "mining" data for reporting and analyzing information for improving performance. The ACTS application is designed to work with the MS SQL database management system. All database interfaces are written in SQL. The following software is used to develop and support the ACTS application: ColdFusion, HTML, JavaScript, Quest TOAD, Microsoft Visual SourceSafe (VSS), HTML Mailer for sending email, Microsoft SQL, and Microsoft Internet Information Server.
A Teleo-Reactive Node for Implementing Internet of Things Systems
Sánchez, Pedro; Álvarez, Bárbara; Antolinos, Elías; Fernández, Diego; Iborra, Andrés
2018-01-01
The Internet of Things (IoT) is one of today’s main disruptive technologies and, although massive research has been carried out in recent years, there are still some open issues such as the consideration of software engineering methods and tools. We propose the adoption of the Teleo-Reactive approach in order to facilitate the development of Internet of Things systems as a set of communicating Teleo-Reactive nodes. The software behavior of the nodes is specified in terms of goals, perceptions and actions over the environment, achieving higher abstraction than using general-purpose programming languages and therefore, enhancing the involvement of non-technical users in the specification process. Throughout this paper, we describe the elements of a Teleo-Reactive node and a systematic procedure for translating Teleo-Reactive specifications into executable code for Internet of Things devices. The case study of a robotic agent is used in order to validate the whole approach. PMID:29614772
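A Teleo-Reactive program is an ordered list of condition-action rules: at every perception cycle, the first rule whose condition holds fires. A minimal sketch of that cycle (the robot percepts and actions are hypothetical, not taken from the paper's case study):

```python
def tr_step(rules, percepts):
    """One Teleo-Reactive cycle: fire the action of the FIRST rule whose
    condition holds for the current percepts (rules are ordered by priority,
    with the goal condition first and a catch-all default last)."""
    for condition, action in rules:
        if condition(percepts):
            return action(percepts)
    return None  # no rule applicable

# hypothetical node for a simple navigating robot
rules = [
    (lambda p: p["at_goal"],        lambda p: "stop"),
    (lambda p: p["obstacle_ahead"], lambda p: "turn"),
    (lambda p: True,                lambda p: "forward"),  # default rule
]
```

Because the goal condition sits first, reaching the goal always overrides the lower-priority reactive rules — the regress property that makes TR specifications easy to reason about.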
Counter Action Procedure Generation in an Emergency Situation of Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Gofuku, A.
2018-02-01
Lessons learned from the Fukushima Daiichi accident revealed various weak points in the design and operation of nuclear power plants at the time, although the plant staff carried out many resilient activities under a difficult work environment. To reinforce measures that make nuclear power plants more resilient, improvements to hardware and to the education and training of nuclear personnel are being considered. In addition, given the advancement of computer technology and artificial intelligence, developing software tools to support the activities of plant staff is a promising direction. This paper focuses on software tools that support operations by human operators and introduces the concept of an intelligent operator support system called the co-operator. It also describes a counter-operation generation technique the authors are studying as a core component of the co-operator.
Workflow in interventional radiology: uterine fibroid embolization (UFE)
NASA Astrophysics Data System (ADS)
Lindisch, David; Neumuth, Thomas; Burgert, Oliver; Spies, James; Cleary, Kevin
2008-03-01
Workflow analysis can be used to record the steps taken during clinical interventions, with the goal of identifying bottlenecks and streamlining procedure efficiency. In this study, we recorded the workflow for uterine fibroid embolization (UFE) procedures in the interventional radiology suite at Georgetown University Hospital in Washington, DC, USA. We employed a custom client/server software architecture developed by the Innovation Center for Computer Assisted Surgery (ICCAS) at the University of Leipzig, Germany. This software runs in a Java environment and enables an observer to record the actions taken by the physician and surgical team during these interventions. The recorded data are stored as an XML document, which can then be further processed. We recorded data from 30 patients and found a mean intervention time of 1:49:46 (± 16:04). The critical intervention step, the embolization, had a mean time of 15:42 (± 5:49), which was only 15% of the total intervention time.
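The timing statistics reported above reduce to simple arithmetic over the recorded step durations; a sketch using the standard library's timedelta (the function names are illustrative, not from the ICCAS software):

```python
from datetime import timedelta

def mean_duration(durations):
    """Mean of a list of recorded step durations (timedelta objects)."""
    return sum(durations, timedelta()) / len(durations)

def step_share(step_mean, intervention_mean):
    """Fraction of the mean intervention time spent in one workflow step;
    timedelta / timedelta yields a plain float ratio."""
    return step_mean / intervention_mean
```

Applied to the study's figures, a 15:42 embolization step against a 1:49:46 intervention gives a share of roughly 0.14–0.15, matching the reported 15%.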
Guzsvinecz, Tibor; Szucs, Veronika; Sik Lányi, Cecília
2015-01-01
Nowadays the development of virtual reality-based applications is one of the most dynamically growing areas. These applications have a wide user base, and more and more devices providing several kinds of user interaction are available on the market. In applications where handheld devices are not necessary, such non-handheld devices can be used for education, entertainment and rehabilitation. The purpose of this paper is to examine the precision and efficiency of user interaction with non-handheld devices in virtual reality-based applications. The first task of the developed application is to support the rehabilitation of stroke patients in their homes. A newly developed application is introduced in this paper, which uses two popular devices, the Shimmer sensor and the Microsoft Kinect sensor. To identify and validate the user's actions, these sensors work together in parallel. The application can record an educational pattern and then compare this pattern to the user's action. The goal of the current research is to examine how much the two sensors differ in gesture recognition, that is, how precisely each identifies the predefined actions. This can affect the rehabilitation process of stroke patients and influence its efficiency. The application was developed in the C# programming language and uses the original Shimmer connection application as a base. With it, five different movements can be taught with the Shimmer sensor and five with the Microsoft Kinect sensor, and the application can recognize these actions at any later time. It uses a file-based database and the application's runtime memory to store the saved data for easier access to the actions.
The conclusion is that much more precise data were collected from the Microsoft Kinect sensor than from the Shimmer sensors.
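Comparing a taught movement pattern against a live user action is, at its core, a distance computation between two time series. The paper does not disclose its matching algorithm; a common choice for such gesture comparison is dynamic time warping, sketched here in miniature over 1-D traces:

```python
def dtw_distance(a, b):
    """Minimal dynamic-time-warping distance between two 1-D gesture traces.
    The recorded teaching pattern and the live sensor stream rarely align
    sample-for-sample, so plain point-wise distance would be too strict."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest alignment: skip in a, skip in b, or match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def recognize(sample, templates, threshold):
    """Name of the closest stored movement, or None if nothing is close enough."""
    name, dist = min(((k, dtw_distance(sample, v)) for k, v in templates.items()),
                     key=lambda kv: kv[1])
    return name if dist <= threshold else None
```

Real Kinect/Shimmer traces are multi-dimensional (joint positions, accelerations), but the per-sample cost function is the only part that changes.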
ERIC Educational Resources Information Center
Lee, Jong Seok
2013-01-01
Escalation of commitment is manifested as a behavior in which an individual resists withdrawing from a failing course of action despite negative feedback, and it is an enduring problem that occurs in a variety of situations, including R&D investment decisions and software project overruns. To date, a variety of theoretical explanations have…
Software Test Handbook: Software Test Guidebook. Volume 2.
1984-03-01
system test phase for usually one or more of the following three reasons. a. To simulate stress and volume tests (e.g., simulating the actions of 100...peer reviews that differ in formality, participant roles and responsibilities, output produced, and input required. a. Information Input. The input to...form (containing review summary and group decision). - Inspection- Inspection schedule and memo (defining individual roles and respon- sibilities
NASA Technical Reports Server (NTRS)
1975-01-01
A system is presented which processes FORTRAN-based software systems to surface potential problems before they become execution malfunctions. The system complements the diagnostic capabilities of compilers, loaders, and execution monitors rather than duplicating these functions. It also emphasizes frequent sources of FORTRAN problems that require inordinate manual effort to identify. The principal value of the system is in extracting small sections of unusual code from the bulk of normal sequences. Code structures likely to cause immediate or future problems are brought to the user's attention. These messages stimulate timely correction of solid errors and promote identification of 'tricky' code. Corrective action may require recoding or simply extending software documentation to explain the unusual technique.
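The idea of flagging "unusual" code sections for human attention can be sketched as a pattern scan over source lines. The patterns below (computed GOTO, EQUIVALENCE, arithmetic IF) are illustrative guesses at FORTRAN constructs such a tool might flag, not the actual rule set of the 1975 system:

```python
import re

# hypothetical patterns for "unusual" FORTRAN constructs worth human review
SUSPECT_PATTERNS = {
    "computed GOTO": re.compile(r"^\s*GO\s*TO\s*\(", re.IGNORECASE),
    "EQUIVALENCE": re.compile(r"^\s*EQUIVALENCE\b", re.IGNORECASE),
    "arithmetic IF": re.compile(r"^\s*IF\s*\(.*\)\s*\d+\s*,\s*\d+\s*,\s*\d+",
                                re.IGNORECASE),
}

def audit(source):
    """Return (line_number, description) for each line matching a suspect
    pattern, so a reviewer sees only the unusual fraction of the code."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for desc, pat in SUSPECT_PATTERNS.items():
            if pat.search(line):
                findings.append((lineno, desc))
    return findings
```

A production tool would parse rather than grep, but the output shape — a short list of flagged spots extracted from bulk normal code — is the same.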
Self-organization of granular media in airborne ultrasonic fields
NASA Astrophysics Data System (ADS)
Bobrovskaya, A. I.; Stepanenko, D. A.; Minchenya, V. T.
2012-05-01
The article presents results of experimental and theoretical studies of the behaviour of granular media (powder materials) in an airborne ultrasonic field created by a flexurally vibrating ring-shaped waveguide with a resonant frequency in the range 20-40 kHz. Experiments show that the action of acoustic radiation forces results in the formation of ordered structures in the form of ultrathin walls (monolayers), their number corresponding to the number of ring nodal points. The action of secondary radiation forces (König forces) results in the formation of collateral (secondary) walls situated near the primary walls. Experimental observations are compared with results of modelling of the acoustic radiation force field inside the ring by means of the COMSOL Multiphysics and MathCad software. The results of these studies can be used in the development of devices for ultrasonic separation and concentration of particles, as well as for the formation of ordered monolayers from spherical particles.
Executable medical guidelines with Arden Syntax-Applications in dermatology and obstetrics.
Seitinger, Alexander; Rappelsberger, Andrea; Leitich, Harald; Binder, Michael; Adlassnig, Klaus-Peter
2016-08-12
Clinical decision support systems (CDSSs) are being developed to assist physicians in processing extensive data and new knowledge based on recent scientific advances. Structured medical knowledge in the form of clinical alerts or reminder rules, decision trees or tables, clinical protocols or practice guidelines, score algorithms, and others constitutes the core of CDSSs. Several medical knowledge representation and guideline languages have been developed for the formal computerized definition of such knowledge. One of these languages is Arden Syntax for Medical Logic Systems, an international Health Level Seven (HL7) standard whose development started in 1989; its latest version, 2.10, was presented in 2014. In the present report we discuss Arden Syntax as a modern medical knowledge representation and processing language, and show that this language is not only well suited to defining clinical alerts, reminders, and recommendations, but can also be used to implement and process computerized medical practice guidelines. This report describes how contemporary software, such as Java, server software, web services, and XML, is used to implement CDSSs based on Arden Syntax. Special emphasis is given to clinical decision support (CDS) that employs practice guidelines as its clinical knowledge base. Two guideline-based applications using Arden Syntax for medical knowledge representation and processing were developed. The first is a software platform for implementing practice guidelines from dermatology. This application employs fuzzy set theory and logic to represent linguistic and propositional uncertainty in medical data, knowledge, and conclusions. The second application implements a reminder system based on clinically published standard operating procedures in obstetrics to prevent deviations from state-of-the-art care. A to-do list with the necessary actions, specifically tailored to the gestational week/labor/delivery, is generated.
Today, with the latest versions of Arden Syntax and the application of contemporary software development methods, Arden Syntax has become a powerful and versatile medical knowledge representation and processing language, well suited to implement a large range of CDSSs, including clinical-practice-guideline-based CDSSs. Moreover, such CDS is provided and can be shared as a service by different medical institutions, redefining the sharing of medical knowledge. Arden Syntax is also highly flexible and provides developers the freedom to use up-to-date software design and programming patterns for external patient data access. Copyright © 2016. Published by Elsevier B.V.
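An Arden Syntax medical logic module couples data, logic, and action slots; the shape of such a guideline rule can be sketched in Python (the obstetrics reminder below, including its week range, is a made-up illustration, not one of the deployed rules):

```python
def run_mlm(mlm, patient):
    """Evaluate one medical-logic-module-like rule: read the data slots,
    run the logic slot, and return the action's message only when the
    logic concludes true (mirroring Arden's data/logic/action structure)."""
    data = {name: read(patient) for name, read in mlm["data"].items()}
    if mlm["logic"](data):
        return mlm["action"](data)
    return None

# hypothetical obstetrics reminder: the gestational week drives a to-do item
reminder = {
    "data": {"week": lambda p: p["gestational_week"]},
    "logic": lambda d: 24 <= d["week"] < 28,
    "action": lambda d: f"week {d['week']}: schedule glucose tolerance test",
}
```

Real Arden MLMs add maintenance and library metadata and an evoke slot that ties the rule to clinical events; the core evaluate-then-act cycle is what this sketch preserves.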
Kehoe, Helen
2017-01-01
Changes to the software used in general practice could improve the collection of the Aboriginal and Torres Strait Islander status of all patients, and boost access to healthcare measures specifically for Aboriginal and Torres Strait Islander peoples provided directly or indirectly by general practitioners (GPs). Despite longstanding calls for improvements to general practice software to better support Aboriginal and Torres Strait Islander health, little change has been made. The aim of this article is to promote software improvements by identifying desirable software attributes and encouraging GPs to promote their adoption. Establishing strong links between collecting Aboriginal and Torres Strait Islander status, clinical decision supports, and uptake of GP-mediated health measures specifically for Aboriginal and Torres Strait Islander peoples - and embedding these links in GP software - is a long overdue reform. In the absence of government initiatives in this area, GPs are best placed to advocate for software changes, using the model described here as a starting point for action.
Software platform for managing the classification of error- related potentials of observers
NASA Astrophysics Data System (ADS)
Asvestas, P.; Ventouras, E.-C.; Kostopoulos, S.; Sidiropoulos, K.; Korfiatis, V.; Korda, A.; Uzunolglu, A.; Karanasiou, I.; Kalatzis, I.; Matsopoulos, G.
2015-09-01
Human learning is partly based on observation. Electroencephalographic recordings of subjects who perform acts (actors) or observe actors (observers) contain a negative waveform in the Evoked Potentials (EPs) of actors who commit errors and of observers who observe the error-committing actors. This waveform is called the Error-Related Negativity (ERN). Its detection has applications in the context of Brain-Computer Interfaces. The present work describes a software system developed for managing the EPs of observers, with the aim of classifying them into observations of either correct or incorrect actions. It consists of an integrated platform for the storage, management, processing and classification of EPs recorded during error-observation experiments. The system was developed using C# and the following development tools and frameworks: MySQL, .NET Framework, Entity Framework and Emgu CV, for interfacing with the machine learning library of OpenCV. Up to six features can be computed per EP recording per electrode. The user can select among various feature selection algorithms and then proceed to train one of three types of classifiers: Artificial Neural Networks, Support Vector Machines, or k-nearest neighbours. The classifier can then be used to classify any EP curve that has been entered into the database.
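Of the three classifier types the platform offers, k-nearest neighbour is simple enough to sketch in a few lines over already-extracted feature vectors (the labels and data here are invented, and the real system works through OpenCV's ML module rather than hand-rolled code):

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Label a feature vector by majority vote among its k nearest training
    examples; `train` is a list of (feature_tuple, label) pairs."""
    dists = sorted(
        (math.dist(feats, query), label) for feats, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

With up to six features per electrode, the feature tuples stay short, which is exactly the regime where a plain kNN with Euclidean distance remains competitive.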
Formal verification of software-based medical devices considering medical guidelines.
Daw, Zamira; Cleaveland, Rance; Vetter, Marcus
2014-01-01
Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. 
The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.
Rolston, John D.; Gross, Robert E.; Potter, Steve M.
2009-01-01
Commercially available data acquisition systems for multielectrode recording from freely moving animals are expensive, often rely on proprietary software, and do not provide detailed, modifiable circuit schematics. When used in conjunction with electrical stimulation, they are prone to prolonged, saturating stimulation artifacts that prevent the recording of short-latency evoked responses. Yet electrical stimulation is integral to many experimental designs, and critical for emerging brain-computer interfacing and neuroprosthetic applications. To address these issues, we developed an easy-to-use, modifiable, and inexpensive system for multielectrode neural recording and stimulation. Setup costs are less than US$10,000 for 64 channels, an order of magnitude lower than comparable commercial systems. Unlike commercial equipment, the system recovers rapidly from stimulation and allows short-latency action potentials (<1 ms post-stimulus) to be detected, facilitating closed-loop applications and exposing neural activity that would otherwise remain hidden. To illustrate this capability, evoked activity from microstimulation of the rodent hippocampus is presented. System noise levels are similar to existing platforms, and extracellular action potentials and local field potentials can be recorded simultaneously. The system is modular, in banks of 16 channels, and flexible in usage: while primarily designed for in vivo use, it can be combined with commercial preamplifiers to record from in vitro multielectrode arrays. The system's open-source control software, NeuroRighter, is implemented in C#, with an easy-to-use graphical interface. As C# functions in a managed code environment, which may impact performance, analysis was conducted to ensure comparable speed to C++ for this application. Hardware schematics, layout files, and software are freely available. 
Since maintaining wired headstage connections with freely moving animals is difficult, we describe a new method of electrode-headstage coupling using neodymium magnets. PMID:19668698
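The rapid post-stimulus recovery described above amounts to ignoring a brief blanking window after each stimulus while still thresholding for spikes immediately afterwards. A toy sketch of that logic (the sample data and window length are invented; NeuroRighter's actual artifact handling and spike detection are more sophisticated):

```python
def detect_spikes(signal, stim_indices, threshold, blank=5):
    """Rising-threshold spike detection with post-stimulus blanking: samples
    within `blank` samples after a stimulus are ignored, so the brief
    artifact is not counted while spikes shortly afterwards are kept."""
    blanked = set()
    for s in stim_indices:
        blanked.update(range(s, s + blank))
    spikes = []
    prev = 0.0
    for i, v in enumerate(signal):
        if i not in blanked and prev < threshold <= v:  # rising crossing
            spikes.append(i)
        prev = v
    return spikes
```

At a 25 kHz sampling rate a 5-sample blanking window is only 0.2 ms, which is how an evoked action potential arriving under 1 ms post-stimulus can still be seen.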
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheu, R; Ghafar, R; Powers, A
Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring similar "real" events with our in-house software creates a safety net, as its propagation does not rely on individual user input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which allows the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notification of treatment-record inconsistencies and machine overrides has decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and nearly error-free working environment.
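The example event in the abstract — patients on treatment with a first fraction delivered but no initial chart check — is a single SQL query. A sketch against an in-memory SQLite stand-in (the schema below is hypothetical; the real Mosaiq database is proprietary and differs):

```python
import sqlite3

# hypothetical, heavily simplified stand-in for the treatment-record tables
SCHEMA = """
CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fractions (patient_id INTEGER, delivered_on TEXT);
CREATE TABLE chart_checks (patient_id INTEGER, kind TEXT);
"""

NEEDS_INITIAL_CHECK = """
SELECT p.id, p.name FROM patients p
WHERE EXISTS (SELECT 1 FROM fractions f WHERE f.patient_id = p.id)
  AND NOT EXISTS (SELECT 1 FROM chart_checks c
                  WHERE c.patient_id = p.id AND c.kind = 'initial')
ORDER BY p.id
"""

def patients_needing_initial_check(conn):
    """Patients with at least one delivered fraction but no initial physics
    chart check yet -- the dashboard's 'safety net' query."""
    return conn.execute(NEEDS_INITIAL_CHECK).fetchall()
```

Because the query reads delivered fractions directly, the result does not depend on anyone having remembered to complete (or having accidentally deleted) a QCL entry.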
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fako, Raluca; Sociu, Florin; Stan, Camelia
Romania is actively engaged in updating the Medium and Long Term National Strategy for Safe Management of Radioactive Waste and in approving the Road Map for Geological Repository Development. According to the relevant documents to be further updated, about 122,000 m³ of SL-LILW are to be disposed of in a near-surface facility that will also have room for quantities of VLLW. The planned date for commissioning is under revision. Taking into account that several actions have been initiated to improve the technical capability for LILW treatment and conditioning, several steps for the possible use of the SAFRAN software were considered. In view of specific data in the Romanian radioactive waste inventory, the authors try to highlight the expected limitations and unknown data related to the implementation of the SAFRAN software for the foreseen pre-disposal waste management activities. There are challenges to be faced in the near future related to the clear definition of the properties of each room, area and waste management activity. This work aims to address several LILW management issues in accordance with the national and international regulatory framework for the assurance of nuclear safety. The authors also intend to develop their institutional capability for the safety demonstration of existing and future radioactive waste management facilities and activities. (authors)
2007-09-01
Findings: Chinese hackers forced one of its bureaus to cut off Internet access and discard virus-infected...Vulnerability (IAV) Management (IAVM) process was created to prepare and rapidly disseminate mitigating actions for potentially critical software...vulnerabilities to DoD Components. IAVM notices have three criticality levels: • IAV Alert (IAVA) – most critical – a vulnerability posing an immediate
System Engineering Concept Demonstration, System Engineering Needs. Volume 2
1992-12-01
changeability, and invisibility. "Software entities are perhaps more complex for their size than any other human construct..." In addition, software is...human actions and interactions that often fail or are insufficient in large organizations. Specific needs in this area include the following: • Each...needed to accomplish incremental review and critique of information. • Automated metrics support is needed for measuring key quality aspects of
Synchronous monitoring of muscle dynamics and electromyogram
NASA Astrophysics Data System (ADS)
Zakir Hossain, M.; Grill, Wolfgang
2011-04-01
A novel non-intrusive detection scheme has been implemented to detect the lateral extension and force of skeletal muscle and the motor action potential (EMG) synchronously. This allows comparison of muscle dynamics and EMG signals as a basis for modeling and for further studies to determine which architectural parameters are most sensitive to changes in muscle activity. For this purpose, the transmission time of an ultrasonic chirp signal in the frequency range of 100 kHz to 2.5 MHz passing through the muscle under observation and the respective motor action potentials are recorded synchronously to monitor and quantify biomechanical parameters related to muscle performance. Additionally, an ultrasonic force sensor has been employed for monitoring. Ultrasonic transducers are placed on the skin to monitor muscle expansion. Surface electrodes are placed suitably to pick up the activation potential of the monitored muscle. Isometric contraction of the monitored muscle is ensured by restricting the joint motion with the ultrasonic force sensor. Synchronous monitoring was initiated by a software-activated audio beep starting at zero time of the subsequent data-acquisition interval. Computer-controlled electronics are used to generate and detect the ultrasonic signals and to monitor the EMG signals. Custom-developed software is employed to analyze and quantify the monitored data. Reaction time, nerve conduction speed, the latent period between the onset of EMG signals and the muscle response, the degree of muscle activation and muscle fatigue development, the rate of energy expenditure and the motor neuron recruitment rate in isometric contraction, and other relevant parameters relating to muscle performance have been quantified with high spatial and temporal resolution.
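The latent period between EMG onset and the mechanical response reduces to finding the onset sample in each synchronously recorded channel and dividing the index difference by the sampling rate. A toy threshold-based sketch (the authors' actual onset-detection method is not specified in the abstract):

```python
def onset_index(signal, threshold):
    """Index of the first sample whose magnitude exceeds the threshold."""
    for i, v in enumerate(signal):
        if abs(v) > threshold:
            return i
    return None

def latent_period_ms(emg, extension, emg_thresh, ext_thresh, fs_hz):
    """Latency (ms) between EMG onset and mechanical (lateral-extension)
    onset, assuming both channels were sampled synchronously at fs_hz."""
    i_emg = onset_index(emg, emg_thresh)
    i_ext = onset_index(extension, ext_thresh)
    return (i_ext - i_emg) / fs_hz * 1000.0
```

Synchronous acquisition is the whole point: any clock offset between the EMG and ultrasound channels would appear directly as a bias in the computed latency.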
Hypermedia and intelligent tutoring applications in a mission operations environment
NASA Technical Reports Server (NTRS)
Ames, Troy; Baker, Clifford
1990-01-01
Hypermedia, hypertext and Intelligent Tutoring System (ITS) applications to support all phases of mission operations are investigated. The application of hypermedia and ITS technology to improve system performance and safety in supervisory control is described, with an emphasis on modeling operators' intentions in the form of goals, plans, tasks, and actions. A review of hypermedia and ITS technology as it may be applied to the tutoring of command and control languages is presented. A hypertext-based ITS is developed to train flight operation teams in the System Test and Operation Language (STOL). Specific hypermedia and ITS application areas are highlighted, including computer-aided instruction of flight operation teams (STOL ITS) and control center software development tools (CHIMES and the STOL Certification Tool).
Building Energy Monitoring and Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Feng, Wei; Lu, Alison
This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; to apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and to compile offline statistical data and online real-time data in both countries to fully understand the current status of building energy use. This helps decode the driving forces behind the discrepancy in building energy use between the two countries; identify gaps and deficiencies in current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data that provide valuable and actionable information for key stakeholders.
Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study
NASA Astrophysics Data System (ADS)
Sarang, Nita; Sanglikar, Mukund A.
Project management decisions are the primary basis for project success (or failure). Often, such decisions are based on an intuitive understanding of the underlying software engineering and management process and risk being misjudged. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and helps project managers assess the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules for software evolution, planning and management.
Wauchope, R Don; Ahuja, Lajpat R; Arnold, Jeffrey G; Bingner, Ron; Lowrance, Richard; van Genuchten, Martinus T; Adams, Larry D
2003-01-01
We present an overview of USDA Agricultural Research Service (ARS) computer models and databases related to pest-management science, emphasizing current developments in environmental risk assessment and management simulation models. The ARS has a unique national interdisciplinary team of researchers in surface and sub-surface hydrology, soil and plant science, systems analysis and pesticide science, who have networked to develop empirical and mechanistic computer models describing the behavior of pests, pest responses to controls and the environmental impact of pest-control methods. Historically, much of this work has been in support of production agriculture and of the conservation programs of our 'action agency' sister, the Natural Resources Conservation Service (formerly the Soil Conservation Service). Because we are a public agency, our software/database products are generally offered without cost, unless they are developed in cooperation with a private-sector cooperator. Because ARS is a basic and applied research organization, with development of new science as our highest priority, these products tend to be offered on an 'as-is' basis with limited user support, except within cooperative R&D relationships with other scientists. However, rapid changes in the technology for information analysis and communication continually challenge our way of doing business.
Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa
2013-01-01
To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process times 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.
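The error-rate and relative risk reduction (RRR) definitions in the abstract can be sketched directly; the counts below are invented for illustration and are not the study's data:

```python
# Hedged sketch of the error metrics described in the abstract: an error
# rate is the number of errors in a process divided by the total
# opportunities for error in that process, times 100; RRR is the
# fractional reduction in that rate after the corrective actions.
# The counts here are illustrative placeholders, not the study's figures.

def error_rate(errors, opportunities):
    """Errors per 100 opportunities for error."""
    return 100.0 * errors / opportunities

def relative_risk_reduction(rate_before, rate_after):
    """Percentage reduction in the error rate after the intervention."""
    return 100.0 * (rate_before - rate_after) / rate_before

before = error_rate(120, 4000)   # e.g. 120 prescription errors in 4000 orders
after = error_rate(93, 3980)     # after CPOE training and procedure updates
rrr = relative_risk_reduction(before, after)
print(f"before={before:.2f}%  after={after:.2f}%  RRR={rrr:.1f}%")
```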
PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah
2009-12-01
In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes.
The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes to the new hardware and software, typified by hierarchical and disparate layers of compute, memory and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable, as the process becomes more developed and mature, and the process this year has identified clear and concrete steps to be taken. 
This report presents (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soubies, B.; Henry, J.Y.; Le Meur, M.
1300 MWe pressurised water reactors (PWRs), like the 1400 MWe reactors, operate with microprocessor-based safety systems. This is particularly the case for the Digital Integrated Protection System (SPIN), which trips the reactor in an emergency and sets the safeguard functions in action. The software used in these systems must therefore be highly dependable in the execution of its functions. In the case of SPIN, three players are working at different levels to achieve this goal: the protection system manufacturer, Merlin Gerin; the designer of the nuclear steam supply system, Framatome; and the operator of the nuclear power plants, Electricite de France (EDF), which is also responsible for the safety of its installations. Regulatory licenses are issued by the French safety authority, the Nuclear Installations Safety Directorate (French abbreviation DSIN), subsequent to a successful examination of the technical provisions adopted by the operator. This examination is carried out by the IPSN and the standing group on nuclear reactors. This communication sets out: the methods used by the manufacturer to develop SPIN software for the 1400 MWe PWRs (N4 series); and the approach adopted by the IPSN to evaluate the safety software of the protection system for the N4 series of reactors.
Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective
NASA Astrophysics Data System (ADS)
Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.
1997-01-01
Recent developments have made it possible to interoperate complex business applications at much lower costs. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that draws on information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions; one can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.
Information System Engineering Supporting Observation, Orientation, Decision, and Compliant Action
NASA Astrophysics Data System (ADS)
Georgakopoulos, Dimitrios
The majority of today's software systems and organizational/business structures have been built on the foundation of solving problems via long-term data collection, analysis, and solution design. This traditional approach of solving problems and building corresponding software systems and business processes falls short in providing the necessary solutions needed to deal with many problems that require agility as the main ingredient of their solution. For example, such agility is needed in responding to an emergency, in military command and control, physical security, price-based competition in business, investing in the stock market, video gaming, network monitoring and self-healing, diagnosis in emergency health care, and many other areas that are too numerous to list here. The concept of Observe, Orient, Decide, and Act (OODA) loops is a guiding principle that captures the fundamental issues and approach for engineering information systems that deal with many of these problem areas. However, there are currently few software systems that are capable of supporting OODA. In this talk, we provide a tour of the research issues and state of the art solutions for supporting OODA. In addition, we provide specific examples of OODA solutions we have developed for the video surveillance and emergency response domains.
Interaction design challenges and solutions for ALMA operations monitoring and control
NASA Astrophysics Data System (ADS)
Pietriga, Emmanuel; Cubaud, Pierre; Schwarz, Joseph; Primet, Romain; Schilling, Marcus; Barkats, Denis; Barrios, Emilio; Vila Vilaro, Baltasar
2012-09-01
The ALMA radio-telescope, currently under construction in northern Chile, is a very advanced instrument that presents numerous challenges. From a software perspective, one critical issue is the design of graphical user interfaces for operations monitoring and control that scale to the complexity of the system and to the massive amounts of data users are faced with. Early experience operating the telescope with only a few antennas has shown that conventional user interface technologies are not adequate in this context. They consume too much screen real-estate, require many unnecessary interactions to access relevant information, and fail to provide operators and astronomers with a clear mental map of the instrument. They increase extraneous cognitive load, impeding tasks that call for quick diagnosis and action. To address this challenge, the ALMA software division adopted a user-centered design approach. For the last two years, astronomers, operators, software engineers and human-computer interaction researchers have been involved in participatory design workshops, with the aim of designing better user interfaces based on state-of-the-art visualization techniques. This paper describes the process that led to the development of those interface components and to a proposal for the science and operations console setup: brainstorming sessions, rapid prototyping, joint implementation work involving software engineers and human-computer interaction researchers, feedback collection from a broader range of users, further iterations and testing.
NASA Astrophysics Data System (ADS)
Mertz, Sharon A.; Groothuis, Adam; Fellman, Philip Vos
The subject of technology succession and new technology adoption in a generalized sense has been addressed by numerous authors for over one hundred years. Models which accommodate macro-level events as well as micro-level actions are needed to gain insight into future market outcomes. In the ICT industry, macro-level factors affecting technology adoption include global events and shocks, economic factors, and global regulatory trends. Micro-level elements involve individual agent actions and interactions, such as the behaviors of buyers and suppliers in reaction to each other, and to macro events. Projecting technology adoption and software market composition and growth requires evaluating a special set of technology characteristics, buyer behaviors, and supplier issues and responses which make this effort particularly challenging.
Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan
2016-01-01
We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
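As a rough illustration of the core of recurrence-based analysis, the following sketch builds a recurrence matrix from a multidimensional time series and computes its recurrence rate. The Euclidean distance metric, the fixed radius, and the random test signal are assumptions chosen for illustration; the full set of MdRQA measures is provided by the paper's MATLAB implementation, not reproduced here.

```python
import numpy as np

# Hedged sketch of the core step behind recurrence quantification of a
# multidimensional signal: two time points "recur" when their state
# vectors are closer than a chosen radius. Radius and test data are
# illustrative assumptions.

def recurrence_matrix(X, radius):
    """X: (time, dimensions) array. R[i, j] = 1 if the Euclidean distance
    between states i and j is below `radius`."""
    diff = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return (dist < radius).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent point pairs, excluding the main diagonal."""
    n = R.shape[0]
    return (R.sum() - np.trace(R)) / (n * (n - 1))

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))      # a 3-dimensional signal, 200 samples
R = recurrence_matrix(X, radius=1.0)
print(f"recurrence rate: {recurrence_rate(R):.3f}")
```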
NASA Technical Reports Server (NTRS)
Voigt, S. (Editor); Beskenis, S. (Editor)
1985-01-01
Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.
A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects
ERIC Educational Resources Information Center
Parker, Linda L.
2016-01-01
The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
ERIC Educational Resources Information Center
Shih, Ching-Hsiang; Yeh, Jui-Chi; Shih, Ching-Tien; Chang, Man-Ling
2011-01-01
The latest studies have adopted software technology which turns the Wii Remote Controller into a high-performance limb action detector, we assessed whether two persons with multiple disabilities would be able to control an environmental stimulus through limb action. This study extends the functionality of the Wii Remote Controller to the…
The Warfighter Associate: Decision-Support and Metrics for Mission Command
2013-01-01
complex situations can be captured it makes sense to use software to provide this important adjunct to complex human cognitive problems. As a software...tasks that could distract the user from the important events occurring. An Associate System also observes the actions undertaken by a human operator...the Commander's Critical Information Requirements. It is important to note that the Warfighter Associate maintains a human-in-the-loop for decision
NASA Astrophysics Data System (ADS)
Peckham, S. D.
2017-12-01
Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous and rules-based schema to address this problem, called the Geoscience Standard Names ontology, will be presented that utilizes Semantic Web best practices and technologies. It has also been designed to work across science domains and to be readable by both humans and machines.
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art-form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
Strategic Environmental Education Plan for the State of Sinaloa (SEEPSIN), Mexico, 2011-2016
NASA Astrophysics Data System (ADS)
Torrecillas Nunez, C.; Miguel-rodriguez, A.
2012-12-01
SEEPSIN is based on the principles of action research (Kurt Lewin): comparative research on the conditions and effects of various forms of social action, and research leading to social action, that uses a spiral of steps, each of which is composed of a circle of planning, action, and fact-finding about the result of the action. It was designed and implemented by the Autonomous University of Sinaloa, Mexico, for the Human and Social Development Secretariat (SEDESHU) with funding from SEMARNAT and Sinaloa State. The objective of SEEPSIN is to foster an environmental culture in the population living in the catchment - subject to intervention - through a non-formal educational process, using the model of environmental education developed by Torrecillas et al 2008. Non-formal education and continuing education are factors that should be in constant development, evolving along with all the changes that are occurring in the context; thus they are a suitable instrument to promote change and improve the cultural, social, economic and environmental well-being of the population. In turn this contributes to the development of skills in children, youth and the general public, considering the watershed and community involvement as central to restoring the balance of man and nature, based on the implementation of sustainable development models. The tools and program for SEEPSIN include: dissemination of the project; acquisition of a mobile environmental education unit; developing and distributing educational materials including books, pamphlets, brochures, manuals, calendars, posters, guides and CDs; installation, on the State Government webpage, of specially designed software to provide access for all of Sinaloa; forming a network of trainers and promoters, including staff of the 18 municipalities and students at all levels; media intervention; creation of a State Environmental Education Forum and evaluation/analysis of the results.
Training is provided through interactive workshops based on the SEEPSIN book of environmental education, which is divided into three modules: Waters; Soil, Solid Waste, Biodiversity, Plantation, Reforestation and Agriculture; and Energy, Climate Change and Sustainable Construction. In addition, all residents can find all the information through the website of the State Government to implement best management practices of natural resources at home. The eventual aim of SEEPSIN is the involvement and participation of 51,480 inhabitants annually in Sinaloa State which, based on the multiplier effect, will within 6 years train a total of 308,880 inhabitants including: 259,200 students, 6,480 teachers and 43,200 environmental promoters to care for the environment. SEEPSIN was launched at the State Environmental Education Forum in May 2012, attended by 343 persons; 972 participated in interactive workshops, and the Network of Environmental Promoters was established. In May and July 2012, 20 workshops were conducted in four State zones with a total of 1450 attendees. The results are evaluated and reported by means of surveys, the online software and statistical methods, and have confirmed good acceptance of SEEPSIN, with more than 90% interested in contributing to its on-going implementation and achieving an environmental culture in Sinaloa.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
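A minimal sketch of the kind of discrete Bayesian network described, with team skill, process maturity, and problem complexity as parents of product suitability. All prior and conditional probabilities below are invented placeholders, not the learned values from the study:

```python
# Hedged toy Bayesian network in the spirit of the model described:
# skill, maturity, complexity -> suitability. All numbers are
# illustrative assumptions; inference is by exhaustive enumeration.

from itertools import product

# Priors over the three driving factors (True = "high").
P_skill = {True: 0.6, False: 0.4}
P_maturity = {True: 0.5, False: 0.5}
P_complexity = {True: 0.3, False: 0.7}

# CPT: P(suitable | skill, maturity, complexity) -- illustrative values.
P_suitable = {
    (True, True, True): 0.80, (True, True, False): 0.95,
    (True, False, True): 0.60, (True, False, False): 0.85,
    (False, True, True): 0.40, (False, True, False): 0.70,
    (False, False, True): 0.15, (False, False, False): 0.50,
}

def prob_suitable(evidence=None):
    """Marginal P(suitable), optionally conditioned on evidence for the
    parent nodes, computed by enumerating the joint distribution."""
    evidence = evidence or {}
    num = den = 0.0
    for s, m, c in product([True, False], repeat=3):
        assignment = {"skill": s, "maturity": m, "complexity": c}
        if any(k in evidence and evidence[k] != v
               for k, v in assignment.items()):
            continue  # inconsistent with the observed evidence
        w = P_skill[s] * P_maturity[m] * P_complexity[c]
        num += w * P_suitable[(s, m, c)]
        den += w
    return num / den

print(f"P(suitable)              = {prob_suitable():.3f}")
print(f"P(suitable | high skill) = {prob_suitable({'skill': True}):.3f}")
```

With these placeholder numbers, conditioning on high skill raises the predicted suitability, mirroring the paper's hypothesis that team skill drives product quality.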
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed; it was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
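The underlying analysis, correlating a team-skill score with a product-quality metric across projects, can be sketched with a plain Pearson correlation; the six data points below are hypothetical stand-ins, not the study's 26 projects:

```python
import math

# Hedged sketch: correlating a per-project team-skill score with a
# product-quality metric, as the study does with ISO/IEC 9126 metrics.
# The data points are invented for illustration.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

skill = [2.1, 3.4, 1.8, 4.0, 2.9, 3.7]   # hypothetical team-skill scores
quality = [55, 72, 49, 88, 66, 80]       # hypothetical quality metric
print(f"r = {pearson_r(skill, quality):.3f}")
```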
Shachak, Aviv; Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R; Lemieux-Charles, Louise
2013-06-04
Tutorials and user manuals are important forms of impersonal support for using software applications including electronic medical records (EMRs). Differences between user and vendor documentation may indicate support needs which are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research compared these elements between formal and informal documents. We conducted a mixed-methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows.
The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design.
Dufendach, Kevin R; Koch, Sabine; Unertl, Kim M; Lehmann, Christoph U
2017-10-26
Early involvement of stakeholders in the design of medical software is particularly important due to the need to incorporate complex knowledge and actions associated with clinical work. Standard user-centered design methods include focus groups and participatory design sessions with individual stakeholders, which generally limit user involvement to a small number of individuals due to the significant time investments from designers and end users. The goal of this project was to reduce the effort for end users to participate in co-design of a software user interface by developing an interactive web-based crowdsourcing platform. In a randomized trial, we compared a new web-based crowdsourcing platform to standard participatory design sessions. We developed an interactive, modular platform that allows responsive remote customization and design feedback on a visual user interface based on user preferences. The responsive canvas is a dynamic HTML template that responds in real time to user preference selections. Upon completion, the design team can view the user's interface creations through an administrator portal and download the structured selections through a REDCap interface. We have created a software platform that allows users to customize a user interface and see the results of that customization in real time, receiving immediate feedback on the impact of their design choices. Neonatal clinicians used the new platform to successfully design and customize a neonatal handoff tool. They received no specific instruction and yet were able to use the software easily and reported high usability. VandAID, a new web-based crowdsourcing platform, can involve multiple users in user-centered design simultaneously and provides means of obtaining design feedback remotely. The software can provide design feedback at any stage in the design process, but it will be of greatest utility for specifying user requirements and evaluating iterative designs with multiple options.
2012-10-01
[Fragmented technology listing recovered from the report: Java v5, Apache Struts v2, Hibernate v2, C3PO, SQL*Net client / JDBC, Oracle 10.0.2 database server, Internet Explorer desktop client; for mobile smartphones, a Java-based framework utilizing Apache Struts on the server, with a relational database to handle data storage requirements; a Java application provides the backend software to drive the PHR-A; BEA Web...]
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper traces the history of CASE technology and explores its development, particularly in the area of real-time/scientific/engineering software. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
NASA Astrophysics Data System (ADS)
Demuzere, Matthias; Kassomenos, P.; Philipp, A.
2011-08-01
In the framework of the COST733 Action "Harmonisation and Applications of Weather Types Classifications for European Regions", a new circulation type classification software (hereafter referred to as the cost733class software) was developed. The cost733class software contains a variety of (European) classification methods and is flexible with respect to the domain of interest, input variables, time step, number of circulation types, sequencing and (weighted) target variables. This work introduces the capabilities of the cost733class software, in which the resulting circulation types (CTs) from various circulation type classifications (CTCs) are applied to observed summer surface ozone concentrations in Central Europe. Firstly, the main characteristics of the CTCs in terms of circulation pattern frequencies are addressed using the baseline COST733 catalogue (cat 2.0), at present the latest product of the new cost733class software. In a second step, the probabilistic Brier skill score is used to quantify the explanatory power of all classifications in terms of the maximum 8-hourly mean ozone concentration exceeding the 120 μg/m3 threshold, based on ozone concentrations from 130 Central European measurement stations. Evaluation results averaged over all stations indicate generally higher performance for CTCs with a larger number of types. Within the subset of methodologies with a similar number of types, the results suggest that CTCs based on optimisation algorithms perform slightly better than those based on other approaches (predefined thresholds, principal component analysis and leader algorithms). The results are further elaborated by exploring additional capabilities of the cost733class software: sensitivity experiments are performed using different domain sizes, input variables, seasonally based classifications and multiple-day sequencing.
As an illustration, CTCs that are additionally conditioned on temperature with various weights are derived and tested in the same way. All results are given a physical interpretation by adopting the environment-to-circulation approach, providing more detailed information on the specific synoptic conditions prevailing on days with high surface ozone concentrations. This research does not intend to put forward a favourite classification methodology or to construct a statistical ozone forecasting tool, but should be seen as an introduction to the possibilities of the cost733class software. In this respect, the results presented here can provide basic user support for the cost733class software and for the development of more user- or application-specific CTC approaches.
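For readers unfamiliar with the evaluation metric used above, a minimal sketch of the probabilistic Brier skill score follows. The per-day exceedance probabilities and outcomes are invented toy values, not COST733 data; in the study, a day's forecast probability would come from the exceedance frequency within its circulation type:

```python
# Minimal Brier skill score: BSS = 1 - BS / BS_climatology, where the
# reference forecast always issues the climatological base rate.

def brier_skill_score(forecast_probs, outcomes):
    """BSS for binary outcomes (1 = ozone threshold exceeded)."""
    n = len(outcomes)
    bs = sum((p - o) ** 2 for p, o in zip(forecast_probs, outcomes)) / n
    clim = sum(outcomes) / n                      # climatological base rate
    bs_ref = sum((clim - o) ** 2 for o in outcomes) / n
    return 1.0 - bs / bs_ref

# Per-day probability that max 8-hourly ozone exceeds 120 ug/m3,
# given that day's circulation type (invented numbers):
probs    = [0.8, 0.8, 0.1, 0.1, 0.5, 0.5]
exceeded = [1,   1,   0,   0,   1,   0]
print(round(brier_skill_score(probs, exceeded), 3))  # prints 0.6
```

A BSS of 1 means perfect probabilistic forecasts; 0 means no improvement over climatology, which is why it is a natural way to rank classifications against each other.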
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively.
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
Computer systems and software engineering
NASA Technical Reports Server (NTRS)
Mckay, Charles W.
1988-01-01
The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.
Agent-based models of cellular systems.
Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Tesei, Luca
2013-01-01
Software agents are particularly suitable for engineering models and simulations of cellular systems. In a very natural and intuitive manner, individual software components are delegated to reproduce "in silico" the behavior of individual components of living systems at a given level of resolution. Individuals' actions and interactions among individuals allow complex collective behavior to emerge. In this chapter we first introduce the readers to software agents and multi-agent systems, reviewing the evolution of agent-based modeling of biomolecular systems in the last decade. We then describe the main tools, platforms, and methodologies available for programming societies of agents, including toolkits that do not require advanced programming skills.
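As a toy illustration of the paradigm the chapter reviews (the rules and parameters below are invented, not taken from any of the surveyed tools), a cell can be modeled as an agent whose simple local rule produces population-level dynamics:

```python
# Toy agent-based model: each cell agent gains or loses energy with
# nutrient availability and divides above a threshold. Collective
# population behavior emerges from these purely local rules.

import random

class CellAgent:
    def __init__(self, energy=5):
        self.energy = energy

    def step(self, nutrients):
        """Consume a nutrient if available; divide above a threshold."""
        if nutrients > 0:
            self.energy += 1
        else:
            self.energy -= 1
        if self.energy >= 10:            # division rule (invented)
            self.energy //= 2
            return CellAgent(self.energy)
        return None

rng = random.Random(0)
population = [CellAgent() for _ in range(10)]
for t in range(20):
    nutrients = rng.randint(0, 1)       # fluctuating environment
    offspring = [c.step(nutrients) for c in population]
    population += [c for c in offspring if c is not None]
    population = [c for c in population if c.energy > 0]
print(len(population))
```

The point is the delegation described in the abstract: no code computes the population curve directly; it emerges from many individual `step` calls.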
The Elements of an Effective Software Development Plan - Software Development Process Guidebook
2011-11-11
This guidebook defines the standards and practices required for all XMPL software development; the SDP implements the <corporate> Standard Software Process (SSP), as tailored. Recoverable topics include: developing and integrating reusable software products; the approach to managing COTS/reuse software implementation; COTS/reuse software selection, with the final selection submitted to the change board for approval; and maintenance activities, such as monitoring current products for obsolescence or end of support and tracking new…
Artificial intelligence approaches to software engineering
NASA Technical Reports Server (NTRS)
Johannes, James D.; Macdonald, James R.
1988-01-01
Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues, and software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity: it requires clear thinking, work, and rework to be successful. The artificial intelligence approaches presented support the software development life cycle by changing current practices and methods, replacing them with better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported, and checked by automated procedures that provide error detection, produce documentation, and ultimately support the actual design of complex programs.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into the development of advanced technologies for Computer Aided Software Engineering (CASE).
ARTIE: An Integrated Environment for the Development of Affective Robot Tutors
Imbernón Cuadrado, Luis-Eduardo; Manjarrés Riesco, Ángeles; De La Paz López, Félix
2016-01-01
Over the last decade robotics has attracted a great deal of interest from teachers and researchers as a valuable educational tool from preschool to high school levels. The implementation of social-support behaviors in robot tutors, in particular in the emotional dimension, can make a significant contribution to learning efficiency. With the aim of contributing to the emerging field of affective robot tutors we have developed ARTIE (Affective Robot Tutor Integrated Environment). We offer an architectural pattern which integrates any given educational software for primary school children with a component whose function is to identify the emotional state of the students who are interacting with the software, and with the driver of a robot tutor which provides personalized emotional pedagogical support to the students. In order to support the development of affective robot tutors according to the proposed architecture, we also provide a methodology which incorporates a technique for eliciting pedagogical knowledge from teachers, and a generic development platform. This platform contains a component for identifying emotional states by analysing keyboard and mouse interaction data, and a generic affective pedagogical support component which specifies the affective educational interventions (including facial expressions, body language, tone of voice,…) in terms of BML (the Behavior Markup Language for virtual agent specification) files which are translated into actions of a robot tutor. The platform and the methodology are both adapted to primary school students. Finally, we illustrate the use of this platform to build a prototype implementation of the architecture, in which the educational software is instantiated with Scratch and the robot tutor with NAO. We also report on a user experiment we carried out to orient the development of the platform and of the prototype.
We conclude from our work that, in the case of primary school students, it is possible to identify, without using intrusive and expensive identification methods, the emotions which most affect the character of educational interventions. Our work also demonstrates the feasibility of a general-purpose architecture of decoupled components, in which a wide range of educational software and robot tutors can be integrated and then used according to different educational criteria. PMID:27536230
ERIC Educational Resources Information Center
Biju, Soly Mathew
2008-01-01
Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…
Imaging-based logics for ornamental stone quality chart definition
NASA Astrophysics Data System (ADS)
Bonifazi, Giuseppe; Gargiulo, Aldo; Serranti, Silvia; Raspi, Costantino
2007-02-01
Ornamental stone products are commercially classified according to several factors related both to intrinsic lithologic characteristics and to their visible pictorial attributes. Sometimes the latter prevail in the definition and assessment of quality criteria. Pictorial attributes are in any case also influenced by the working actions performed and the tools selected to realize the final manufactured stone product. Stone surface finishing is a critical task because it can enhance certain aesthetic features of the stone itself. The study aimed to develop an innovative set of methodologies and techniques able to quantify the aesthetic quality level of stone products, taking into account both the physical and the aesthetic characteristics of the stones. In particular, the degree of polishing of the stone surfaces and the presence of defects were evaluated by applying digital image processing strategies, with morphological and color parameters extracted by purpose-built software architectures. Results showed that the proposed approaches make it possible to quantify the degree of polishing and to identify surface defects related to the intrinsic characteristics of the stone and/or the working actions performed.
Overview of Intelligent Systems and Operations Development
NASA Technical Reports Server (NTRS)
Pallix, Joan; Dorais, Greg; Penix, John
2004-01-01
To achieve NASA's ambitious mission objectives for the future, aircraft and spacecraft will need intelligence to take the correct action in a variety of circumstances. Vehicle intelligence can be defined as the ability to "do the right thing" when faced with a complex decision-making situation. It will be necessary to implement integrated autonomous operations and low-level adaptive flight control technologies to direct actions that enhance the safety and success of complex missions despite component failures, degraded performance, operator errors, and environment uncertainty. This paper will describe the array of technologies required to meet these complex objectives, including the integration of high-level reasoning and autonomous capabilities with multiple subsystem controllers for robust performance. Future intelligent systems will use models of the system, its environment, and the other intelligent agents with which they interact. They will also require planners, reasoning engines, and adaptive controllers that can recommend or execute commands enabling the system to respond intelligently. The presentation will also address the development of highly dependable software, a key component in ensuring the reliability of intelligent systems.
Development of a comprehensive software engineering environment
NASA Technical Reports Server (NTRS)
Hartrum, Thomas C.; Lamont, Gary B.
1987-01-01
The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial development of a global approach was initiated in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts emphasize natural language interfaces, expert system software development associates, and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.
Reducing Risk in DoD Software-Intensive Systems Development
2016-03-01
This research addresses software-intensive systems development risk through the use of the Technical Readiness Assessment (TRA), based on the nine-level software Technology Readiness Levels (TRLs). Recoverable conclusions: the software TRLs are ineffective in reducing technical risk for the software component development, yet without the software TRLs there is no effective method to perform a software TRA or reduce the technical development risk; the software component will behave as a new, untried technology in nearly…
A Reliable Service-Oriented Architecture for NASA's Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Mak, Ronald; Walton, Joan; Keely, Leslie; Hehner, Dennis; Chan, Louise
2005-01-01
The Collaborative Information Portal (CIP) was enterprise software developed jointly by the NASA Ames Research Center and the Jet Propulsion Laboratory (JPL) for NASA's highly successful Mars Exploration Rover (MER) mission. Both MER and CIP have performed far beyond their original expectations. Mission managers and engineers ran CIP inside the mission control room at JPL, and the scientists ran CIP in their laboratories, homes, and offices. All the users connected securely over the Internet. Since the mission ran on Mars time, CIP displayed the current time in various Mars and Earth time zones, and it presented staffing and event schedules with Martian time scales. Users could send and receive broadcast messages, and they could view and download data and image files generated by the rovers' instruments. CIP had a three-tiered, service-oriented architecture (SOA) based on industry standards, including J2EE and web services, and it integrated commercial off-the-shelf software. A user's interactions with the graphical interface of the CIP client application generated web services requests to the CIP middleware. The middleware accessed the back-end data repositories if necessary and returned results for these requests. The client application could make multiple service requests for a single user action and then present a composition of the results. This happened transparently, and many users did not even realize that they were connecting to a server. CIP performed well and was extremely reliable; it attained better than 99% uptime during the course of the mission. In this paper, we present overviews of the MER mission and of CIP. We show how CIP helped to fulfill some of the mission needs and how people used it. We discuss the criteria for choosing its architecture, and we describe how the developers made the software so reliable. CIP's reliability did not come about by chance, but was the result of several key design decisions. 
We conclude with some of the important lessons we learned from developing, deploying, and supporting the software.
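The request-composition pattern described above — one user action fanning out to several middleware service requests whose results are merged before display — can be sketched as follows. The service names and payloads are invented stand-ins, not CIP's actual web-service API:

```python
# Sketch of a client-side composition layer: a single UI action issues
# multiple service requests and presents one composed result, so the
# user never sees the individual server round trips.

def fetch(service, query):
    """Stand-in for a web-service call to the middleware tier."""
    fake_backend = {
        "schedule": {"shift": "sol 89, downlink 14:00 LST"},
        "images":   {"count": 12},
    }
    # A real client would serialize `query` into a SOAP/HTTP request here.
    return fake_backend[service]

def on_user_action(sol):
    """One UI action -> multiple service requests -> composed view."""
    schedule = fetch("schedule", {"sol": sol})
    images = fetch("images", {"sol": sol})
    return {**schedule, **images}

view = on_user_action(89)
print(view["count"])  # prints 12
```

Keeping the fan-out behind one function is what makes the composition transparent to the user, as the abstract notes.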
Building Software Agents for Planning, Monitoring, and Optimizing Travel
2004-01-01
…defined as plans in the Theseus Agent Execution language (Barish et al. 2002). In the Web environment, sources can be quite slow and the latencies of… Because the executor is based on a dataflow paradigm, actions are executed as soon as the data becomes available. Second, Theseus performs the actions in a… Theseus provides an expressive language for defining information gathering and monitoring plans; the Theseus language supports capabilities…
Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore
2017-02-05
The first step in developing an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but uniformity is lacking among countries and among the different jurisdictions within the same country. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based, platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software, as well as on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, and showed significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase.
The time saving was more than one hour per crash, i.e., a 36% reduction. The on-site data collection did not produce time savings, but this is a temporary weakness that should disappear once officers become more familiar with the software. The phase of evaluation, processing and analysis carried out in the office was dramatically shortened, i.e., a 69% reduction. Another benefit was standardization, which allowed fast and consistent data analysis and evaluation. Even if all these benefits are remarkable, the most valuable benefit of the new procedure was the reduction of police officers' mistakes during the manual operations of survey and data evaluation. Because of these benefits, the satisfaction questionnaires administered to the police officers after the testing phase showed very good acceptance of the procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
Evolution of Secondary Software Businesses: Understanding Industry Dynamics
NASA Astrophysics Data System (ADS)
Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko
The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. Outsourcing an enterprise's internal software development activity in this way is a common means of starting a new software business serving a vertical software market, combining knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business that makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to the development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Radiation breakage of DNA: a model based on random-walk chromatin structure
NASA Technical Reports Server (NTRS)
Ponomarev, A. L.; Sachs, R. K.
2001-01-01
Monte Carlo computer software, called DNAbreak, has recently been developed to analyze observed non-random clustering of DNA double strand breaks in chromatin after exposure to densely ionizing radiation. The software models coarse-grained configurations of chromatin and radiation tracks, small-scale details being suppressed in order to obtain statistical results for larger scales, up to the size of a whole chromosome. We here give an analytic counterpart of the numerical model, useful for benchmarks, for elucidating the numerical results, for analyzing the assumptions of a more general but less mechanistic "randomly-located-clusters" formalism, and, potentially, for speeding up the calculations. The equations characterize multi-track DNA fragment-size distributions in terms of one-track action, an important step in extrapolating high-dose laboratory results to the much lower doses of main interest in environmental or occupational risk estimation. The approach can utilize the experimental information on DNA fragment-size distributions to draw inferences about large-scale chromatin geometry during cell-cycle interphase.
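A minimal Monte Carlo sketch in the spirit of the model is shown below, under the simplifying assumption of uniformly random break positions; this deliberately ignores the chromatin-structure clustering that DNAbreak actually models, and the length and break count are illustrative:

```python
# Random-breakage approximation: place double strand breaks uniformly
# at random along a chromosome and collect the fragment-size
# distribution (n breaks always yield n + 1 fragments).

import random

def fragment_sizes(chrom_length, n_breaks, rng):
    """Return fragment lengths after n random double strand breaks."""
    breaks = sorted(rng.uniform(0, chrom_length) for _ in range(n_breaks))
    edges = [0.0] + breaks + [chrom_length]
    return [b - a for a, b in zip(edges, edges[1:])]

rng = random.Random(42)
sizes = fragment_sizes(100.0, 5, rng)
assert abs(sum(sizes) - 100.0) < 1e-9   # fragments tile the chromosome
print(len(sizes))  # prints 6: 5 breaks -> 6 fragments
```

Comparing a measured fragment-size distribution against this uniform baseline is one way to expose the non-random clustering the paper analyzes.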
Advantages of Brahms for Specifying and Implementing a Multiagent Human-Robotic Exploration System
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron
2003-01-01
We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, all-terrain vehicles, robotic assistant, crew in a local habitat, and mission support team. Software processes ('agents'), implemented in the Brahms language, run on multiple mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so that the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a runtime system. Thus, Brahms provides a language, engine, and system builder's toolkit for specifying and implementing multiagent systems.
Runway Safety Monitor Algorithm for Runway Incursion Detection and Alerting
NASA Technical Reports Server (NTRS)
Green, David F., Jr.; Jones, Denise R. (Technical Monitor)
2002-01-01
The Runway Safety Monitor (RSM) is an algorithm for runway incursion detection and alerting that was developed in support of NASA's Runway Incursion Prevention System (RIPS) research conducted under the NASA Aviation Safety Program's Synthetic Vision System element. The RSM algorithm provides pilots with enhanced situational awareness and warnings of runway incursions in sufficient time to take evasive action and avoid accidents during landings, takeoffs, or taxiing on the runway. The RSM currently runs as a component of the NASA Integrated Display System, an experimental avionics software system for terminal area and surface operations. However, the RSM algorithm can be implemented as a separate program to run on any aircraft with traffic data link capability. The report documents the RSM software and describes in detail how RSM performs runway incursion detection and alerting functions for NASA RIPS. The report also describes the RIPS flight tests conducted at the Dallas-Ft Worth International Airport (DFW) during September and October of 2000, and the RSM performance results and lessons learned from those flight tests.
NASA Astrophysics Data System (ADS)
Robinson, Alexandra R.
An updated global survey of radioisotope production and distribution was completed and subjected to a revised "down-selection methodology" to determine those radioisotopes that should be classified as potential national security risks based on availability and key physical characteristics that could be exploited in a hypothetical radiological dispersion device. The potential at-risk radioisotopes then were used in a modeling software suite known as Turbo FRMAC, developed by Sandia National Laboratories, to characterize plausible contamination maps known as Protective Action Guideline Zone Maps. This software also was used to calculate the whole body dose equivalent for exposed individuals based on various dispersion parameters and scenarios. Derived Response Levels then were determined for each radioisotope using: 1) target doses to members of the public provided by the U.S. EPA, and 2) occupational dose limits provided by the U.S. Nuclear Regulatory Commission. The limiting Derived Response Level for each radioisotope also was determined.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...
Generic domain models in software engineering
NASA Technical Reports Server (NTRS)
Maiden, Neil
1992-01-01
This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.
Software for Automation of Real-Time Agents, Version 2
NASA Technical Reports Server (NTRS)
Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steve; Chouinard, Caroline; Engelhardt, Barbara; Wilklow, Colette; Mutz, Darren; Knight, Russell; Rabideau, Gregg
2005-01-01
Version 2 of Closed Loop Execution and Recovery (CLEaR) has been developed. CLEaR is an artificial intelligence computer program for use in planning and execution of actions of autonomous agents, including, for example, Deep Space Network (DSN) antenna ground stations, robotic exploratory ground vehicles (rovers), robotic aircraft (UAVs), and robotic spacecraft. CLEaR automates the generation and execution of command sequences, monitoring the sequence execution, and modifying the command sequence in response to execution deviations and failures as well as new goals for the agent to achieve. The development of CLEaR has focused on the unification of planning and execution to increase the ability of the autonomous agent to perform under tight resource and time constraints coupled with uncertainty in how much of resources and time will be required to perform a task. This unification is realized by extending the traditional three-tier robotic control architecture by increasing the interaction between the software components that perform deliberation and reactive functions. The increase in interaction reduces the need to replan, enables earlier detection of the need to replan, and enables replanning to occur before an agent enters a state of failure.
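The tight coupling of deliberation and reaction described above can be caricatured as a single loop in which a monitor checks each step's outcome and triggers replanning before a failure state is reached. This is an illustrative sketch, not CLEaR's actual architecture; all names are hypothetical:

```python
def run_agent(goals, plan, execute_step, monitor, replan):
    """Toy unified planning/execution loop: execute one step at a time and
    replan as soon as the monitor flags a deviation, instead of waiting
    for the plan to fail outright."""
    steps = list(plan(goals))
    while steps:
        state = execute_step(steps.pop(0))
        if monitor(state):                      # deviation detected early
            steps = list(replan(goals, state))  # repair before failure
    return "goals achieved"

# Toy agent: step "b" deviates once and is repaired by replanning.
log = []
result = run_agent(
    goals=["done"],
    plan=lambda g: ["a", "b"],
    execute_step=lambda s: (log.append(s), s)[1],
    monitor=lambda state: state == "b" and log.count("b") == 1,
    replan=lambda g, state: ["b2"],
)
```

The point of the sketch is structural: monitoring happens inside the execution loop, so replanning starts as soon as a deviation is seen rather than after a failure.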
Automated Diagnosis and Control of Complex Systems
NASA Technical Reports Server (NTRS)
Kurien, James; Plaunt, Christian; Cannon, Howard; Shirley, Mark; Taylor, Will; Nayak, P.; Hudson, Benoit; Bachmann, Andrew; Brownston, Lee; Hayden, Sandra
2007-01-01
Livingstone2 is a reusable, artificial intelligence (AI) software system designed to assist spacecraft, life support systems, chemical plants, or other complex systems by operating with minimal human supervision, even in the face of hardware failures or unexpected events. The software diagnoses the current state of the spacecraft or other system, and recommends commands or repair actions that will allow the system to continue operation. Livingstone2 is an enhancement of the Livingstone diagnosis system that was flight-tested onboard the Deep Space One spacecraft in 1999. This version tracks multiple diagnostic hypotheses, rather than just a single hypothesis as in the previous version. It is also able to revise diagnostic decisions made in the past when additional observations become available; the earlier single-hypothesis version could commit to an incorrect hypothesis in such cases and had no way to recover. Re-architecting and re-implementing the system in C++ has increased performance. Usability has been improved by creating a set of development tools that is closely integrated with the Livingstone2 engine. In addition to the core diagnosis engine, Livingstone2 includes a compiler that translates diagnostic models written in a Java-like language into Livingstone2's language, and a broad set of graphical tools for model development.
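The benefit of tracking multiple hypotheses can be illustrated very loosely (this is not the Livingstone2 engine; the fault model below is invented) by pruning a set of candidate diagnoses against each new observation, so later evidence can eliminate what was previously the leading candidate without stranding the diagnoser:

```python
def update_hypotheses(hypotheses, observation, consistent):
    """Keep every candidate diagnosis that remains consistent with the new
    observation. Because a set is tracked rather than a single best guess,
    an observation that rules out the current favourite still leaves the
    surviving alternatives available."""
    return [h for h in hypotheses if consistent(h, observation)]

# Hypothetical model: "valve_ok" is inconsistent with observing no flow.
def consistent(hypothesis, obs):
    return not (hypothesis == "valve_ok" and obs == "no_flow")

candidates = ["valve_ok", "valve_stuck", "sensor_fault"]
candidates = update_hypotheses(candidates, "no_flow", consistent)
```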
Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong
2015-09-01
This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems.
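The cutting-and-merging idea behind stepwise cluster analysis can be sketched as a recursive split of the sample set (rSCA itself is an R package on CRAN; the variance-gap criterion below is a crude stand-in for the MANOVA-based test rSCA actually uses, and all thresholds are illustrative):

```python
from statistics import mean

def stepwise_split(samples, min_size=4, threshold=1.0):
    """Recursively cut a list of (x, y) samples at the median of the
    predictor x whenever the two halves differ enough in mean response y;
    otherwise merge (keep the group whole). Returns the leaf subclusters
    of the resulting cluster tree."""
    if len(samples) < 2 * min_size:
        return [samples]
    samples = sorted(samples)            # order by predictor x
    mid = len(samples) // 2
    left, right = samples[:mid], samples[mid:]
    gap = abs(mean(y for _, y in left) - mean(y for _, y in right))
    if gap < threshold:
        return [samples]                 # merge: no significant split
    return (stepwise_split(left, min_size, threshold)
            + stepwise_split(right, min_size, threshold))

# Two clearly separated response regimes yield two leaf subclusters.
data = [(i, 0.0) for i in range(4)] + [(i, 10.0) for i in range(4, 8)]
leaves = stepwise_split(data)
```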
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2017-12-01
Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for, software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and the need to reduce duplication of effort and to sustain community codes. CIG curates software across 6 domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many are also accessible through the CIG community repository on github. CIG has now developed abc (attribution builder for citation) to enable software users to give credit to software developers. abc uses zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then asked the primary developers to verify it. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered are based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual, as developers tend to be forward-looking and are rarely willing to go back and archive prior releases in zenodo. Going forward, all actively developed packages will use the zenodo and github integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and assigning roles to software remain open issues.
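The zenodo/github release archiving mentioned above is typically steered by a metadata file kept in the repository. A minimal, hypothetical example of such deposit metadata (the field names follow Zenodo's deposit schema; all values are invented for illustration) might look like:

```json
{
  "title": "ExampleGeoCode: a hypothetical geodynamics package",
  "upload_type": "software",
  "creators": [
    {"name": "Doe, Jane", "affiliation": "Example University"}
  ],
  "license": "GPL-3.0",
  "keywords": ["geodynamics", "finite elements"]
}
```

With a file like this in place, each tagged release can be archived automatically and assigned its own DOI, which is what makes the release citable.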
SEI Software Engineering Education Directory.
1987-02-01
Software Design and Development Gilbert, Philip Systems: CDC Cyber 170/750 CDC Cyber 170/760 DEC PDP 11/44 PRIME AT&T 3B5 IBM PC IBM XT IBM RT...Macintosh VAX 8300 Software System Development and Laboratory CS 480/480L U P X T Textbooks: Software Design and Development Gilbert, Philip Systems: CDC...Acting Chair (618) 692-2386 Courses: Software Design and Development CS 424 U P E Y Textbooks: Software Design and Development, Gilbert, Philip Topics
Health Physics Positions Data Base: Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, G.D.; Borges, T.; Stafford, R.S.
1994-02-01
The Health Physics Positions (HPPOS) Data Base of the Nuclear Regulatory Commission (NRC) is a collection of NRC staff positions on a wide range of topics involving radiation protection (health physics). It consists of 328 documents in the form of letters, memoranda, and excerpts from technical reports. The HPPOS Data Base was developed by the NRC Headquarters and Regional Offices to help ensure uniformity in inspections, enforcement, and licensing actions. Staff members of the Oak Ridge National Laboratory (ORNL) have assisted the NRC staff in summarizing the documents during the preparation of this NUREG report. These summaries are also being made available as a "stand alone" software package for IBM and IBM-compatible personal computers. The software package for this report is called HPPOS Version 2.0. A variety of indexing schemes were used to increase the usefulness of the NUREG report and its associated software. The software package and the summaries in the report are written in the context of the "new" 10 CFR Part 20 (§§ 20.1001-20.2401). The purpose of this NUREG report is to allow interested individuals to familiarize themselves with the contents of the HPPOS Data Base and with the basis of many NRC decisions and regulations. The HPPOS summaries and original documents are intended to serve as a source of information for radiation protection programs at nuclear research and power reactors, nuclear medicine, and other industries that either process or use nuclear materials.
Advanced software development workstation project: Engineering scripting language. Graphical editor
NASA Technical Reports Server (NTRS)
1992-01-01
Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.
Ground control station software design for micro aerial vehicles
NASA Astrophysics Data System (ADS)
Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej
2017-08-01
This article describes the process of designing the hardware and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the work was conducted on a quadrocopter model, a commonly accessible commercial construction. The article characterizes the research object, covers the basics of operating micro aerial vehicles (MAV), and presents the components of the ground control station model. It also describes the communication standards used in building a model of the station. The remainder of the work concerns the software of the product, the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the UAV's actions and its communication and control processes. The process of creating the software and the field tests of a station model are also presented in the article.
DOE Office of Scientific and Technical Information (OSTI.GOV)
VOLTTRON is an agent execution platform providing services to its agents that allow them to easily communicate with physical devices and other resources. VOLTTRON delivers an innovative distributed control and sensing software platform that supports modern control strategies, including agent-based and transaction-based controls. It enables mobile and stationary software agents to perform information gathering, processing, and control actions. VOLTTRON can independently manage a wide range of applications, such as HVAC systems, electric vehicles, distributed energy or entire building loads, leading to improved operational efficiency.
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach to assist software developers and safety analysts with cost-effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.
pyam: Python Implementation of YaM
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of a strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
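The link-versus-work module distinction above is what keeps sandbox setup cheap: only modules under active change are checked out as source, while everything else points at a pre-built release. A toy sketch of that resolution step (the path layout and names here are hypothetical, not pyam's actual conventions):

```python
def resolve_module(name, work_modules, releases):
    """Resolve a module to either a source ('work') checkout, for modules
    the developer is changing, or a 'link' to a shared pre-built release,
    for everything else. Fewer work modules means faster sandbox setup."""
    if name in work_modules:
        return ("work", f"sandbox/src/{name}")
    return ("link", f"releases/{name}/{releases[name]}")

# Hypothetical sandbox: only "gui" is being modified locally.
sandbox = {n: resolve_module(n, {"gui"}, {"mathlib": "r12", "io": "r7"})
           for n in ["gui", "mathlib", "io"]}
```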
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
Chopda, Viki R; Gomes, James; Rathore, Anurag S
2016-01-01
Bioreactor control significantly impacts both the amount and quality of the product being manufactured. The complexity of the control strategy that is implemented increases with reactor size, which may vary from thousands to tens of thousands of litres in commercial manufacturing. The Process Analytical Technology (PAT) initiative has highlighted the need for having robust monitoring tools and effective control schemes that are capable of taking real time information about the critical quality attributes (CQA) and the critical process parameters (CPP) and executing an immediate response as soon as a deviation occurs. However, the limited flexibility of present commercial software packages creates a hurdle. Visual programming environments have gradually emerged as potential alternatives to the available text based languages. This paper showcases development of an integrated programme using a visual programming environment for a Sartorius BIOSTAT® B Plus 5L bioreactor through which various peripheral devices are interfaced. The proposed programme facilitates real-time access to data and allows for execution of control actions to follow the desired trajectory. Major benefits of such an integrated software system include: (i) improved real time monitoring and control; (ii) reduced variability; (iii) improved performance; (iv) reduced operator-training time; (v) enhanced knowledge management; and (vi) easier PAT implementation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Microscopic transport model animation visualisation on KML base
NASA Astrophysics Data System (ADS)
Yatskiv, I.; Savrasovs, M.
2012-10-01
Classical literature on simulation theory notes that one of the greatest strengths of simulation is the ability to present the processes inside a system through animation. This gives the simulation model additional value when presenting simulation results to the public and to authorities who are not familiar with simulation. That is why most universal and specialised simulation tools are able to construct 2D and 3D representations of the model. Developing such a representation can take much time, however, and creating an adequate 3D representation of the model demands substantial effort. For many years, such well-known microscopic traffic flow simulation software tools as VISSIM, AIMSUN and PARAMICS have been able to produce 2D and 3D animation. But creating a realistic 3D model of the place where traffic flows are simulated remains a hard and time-consuming task, even in these professional software tools. The goal of this paper is to describe the concept of using existing on-line geographical information systems for visualisation of animation produced by simulation software. For demonstration purposes the following technologies and tools have been used: PTV VISION VISSIM, KML and Google Earth.
Web-Based Real-Time Emergency Monitoring
NASA Technical Reports Server (NTRS)
Harvey, Craig A.; Lawhead, Joel
2007-01-01
The Web-based Real-Time Asset Monitoring (RAM) module for emergency operations and facility management enables emergency personnel in federal agencies and local and state governments to monitor and analyze data in the event of a natural disaster or other crisis that threatens a large number of people and property. The software can manage many disparate sources of data within a facility, city, or county. It was developed on industry-standard geospatial software and is compliant with open GIS standards. RAM View can function as a standalone system, or as an integrated plugin module to Emergency Operations Center (EOC) software suites such as REACT (Real-time Emergency Action Coordination Tool), thus ensuring the widest possible distribution among potential users. RAM has the ability to monitor various data sources, including streaming data. Many disparate systems are included in the initial suite of supported hardware systems, such as mobile GPS units, ambient measurements of temperature, moisture and chemical agents, flow meters, air quality, asset location, and meteorological conditions. RAM View displays real-time data streams such as gauge heights from the U.S. Geological Survey gauging stations, flood crests from the National Weather Service, and meteorological data from numerous sources. Data points are clearly visible on the map interface, and attributes as specified in the user requirements can be viewed and queried.
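Monitoring gauge heights against flood thresholds, as described above, reduces to a simple per-station comparison of the latest reading with a configured stage. A minimal sketch (station IDs and stage values are hypothetical; real RAM feeds would come from USGS/NWS data streams):

```python
def stations_to_alert(readings, flood_stage):
    """Return the stations whose latest gauge height meets or exceeds the
    configured flood stage; stations with no configured stage never alert."""
    return [station for station, height in readings.items()
            if height >= flood_stage.get(station, float("inf"))]

# Hypothetical readings (feet) against hypothetical flood stages.
alerts = stations_to_alert(
    {"USGS-0123": 12.4, "USGS-0456": 3.1},
    {"USGS-0123": 10.0, "USGS-0456": 8.0},
)
```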
Ford, P J; Hughes, C
2012-02-01
This project has investigated student and staff perceptions and experience of plagiarism in a large Australian dental school to develop a response to an external audit report. Workshops designed to enhance participants' understanding of plagiarism and to assist with practical ways to promote academic integrity within the school were provided to all students and staff. Anonymous surveys were used to investigate perceptions and experience of plagiarism and to assess the usefulness of the workshops. Most participants felt that plagiarism was not a problem in the school, but a significant number were undecided. The majority of participants reported that the guidelines for dealing with plagiarism were inadequate and most supported the mandatory use of text-matching software in all courses. High proportions of participants indicated that the workshops were useful and that they would consider improving their practice as a result. The study provided data that enhanced understanding of aspects of plagiarism highlighted in the report at the school level and identified areas in need of attention, such as refining and raising awareness of the guidelines and incorporation of text-matching software into courses, as well as cautions to be considered (how text-matching software is used) in planning responsive action. © 2011 John Wiley & Sons A/S.
Santos Vital Alves Coelho, Luana; Roner Vilanova Novais, Felipe; Armaneli Macedo, Giulia; Nunes Neves Dos Santos, Júlia; Lara Sousa, Vinícius; Mattos Mendes, Luis Augusto; Morais Dos Reis, Daniel; Caetano Romano, Márcia Christina
2016-06-01
To evaluate the effects of educational software to improve first grade school students' knowledge about prevention of overweight and obesity. This non-controlled trial with a before-and-after evaluation was carried out in a school located in the municipality of Divinópolis (Brazil) among 71 students aged 6 to 10 years. The educational software about prevention of overweight and obesity was designed and then validated. The educational intervention comprised the use of the software. Before and after the intervention we applied a questionnaire based on the Ten Steps to Healthy Eating for Children, proposed by the Brazilian Ministry of Health. Comparing the periods before and after application of the educational software, we observed statistically significant differences in the proportion of questions answered correctly by first grade school students, mainly concerning daily eating of healthy and unhealthy food, adequate preparation of food, and the importance of exercise. This study highlights the importance of educational actions using software to build first grade school students' knowledge about prevention of overweight and obesity.
User systems guidelines for software projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrahamson, L.
1986-04-01
This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)
Nacul, L C; Stewart, A; Alberg, C; Chowdhury, S; Darlison, M W; Grollman, C; Hall, A; Modell, B; Moorthie, S; Sagoo, G S; Burton, H
2014-06-01
In 2010 the World Health Assembly called for action to improve the care and prevention of congenital disorders, noting that technical guidance would be required for this task, especially in low- and middle-income countries. Responding to this call, we have developed a freely available web-accessible Toolkit for assessing health needs for congenital disorders. Materials for the Toolkit website (http://toolkit.phgfoundation.org) were prepared by an iterative process of writing, discussion and modification by the project team, with advice from external experts. A customized database was developed using epidemiological, demographic, socio-economic and health-services data from a range of validated sources. Document-processing and data integration software combines data from the database with a template to generate topic- and country-specific Calculator documents for quantitative analysis. The Toolkit guides users through selection of topics (including both clinical conditions and relevant health services), assembly and evaluation of qualitative and quantitative information, assessment of the potential effects of selected interventions, and planning and prioritization of actions to reduce the risk or prevalence of congenital disorders. The Toolkit enables users without epidemiological or public health expertise to undertake health needs assessment as a prerequisite for strategic planning in relation to congenital disorders in their country or region. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health.
Developing accreditation for community based surgery: the Irish experience.
Ní Riain, Ailís; Collins, Claire; O'Sullivan, Tony
2018-02-05
Purpose Carrying out minor surgery procedures in the primary care setting is popular with patients, cost-effective, and delivers at least as good outcomes as those performed in the hospital setting. This paper aims to describe the central role of clinical leadership in developing an accreditation system for general practitioners (GPs) undertaking community-based surgery in the Irish national setting, where no mandatory accreditation process currently exists. Design/methodology/approach In all, 24 GPs were recruited to the GP network. Ten pilot standards were developed, addressing GPs' experience and training, clinical activity, and practice-supporting infrastructure, and were tested using information and document review, prospective collection of clinical data, and a practice inspection visit. Two additional components were incorporated into the project (patient satisfaction survey and self-audit). A multi-modal evaluation was undertaken. A majority of GPs were included at all stages of the project, in line with the principles of action learning. The steering group had a majority of GPs with relevant expertise and representation of all other actors in the minor surgery arena. The GP research network contributed to each stage of the project. The project lead was a GP with minor surgery experience. Quantitative data collected were analysed using Predictive Analytic SoftWare. Krueger's framework analysis approach was used to analyse the qualitative data. Findings A total of 9 GPs achieved all standards at initial review, 14 successfully completed corrective actions, and 1 GP did not achieve the required standard. Standards were then amended to reflect findings and a supporting framework was developed. Originality/value The flexibility of the action-learning approach and the clinical leadership design allowed for the development of robust quality standards in a short timeframe.
New technology continues to invade healthcare. What are the strategic implications/outcomes?
Smith, Coy
2004-01-01
Healthcare technology continues to advance and be implemented in healthcare organizations. Nurse executives must strategically evaluate the effectiveness of each proposed system or device using a strategic planning process. Clinical information systems, computer-chip-based clinical monitoring devices, advanced Web-based applications with remote, wireless communication devices, clinical decision support software--all compete for capital and registered nurse salary dollars. The concept of clinical transformation is developed, with new models of care delivery being supported by technology rather than driven by it. Senior nursing leadership's role in clinical transformation and healthcare technology implementation is developed. Proposed standards, expert group action, business and consumer groups, and legislation are reviewed as strategic drivers in the development of an electronic health record and healthcare technology. A matrix of advancing technology and strategic decision-making parameters is outlined.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.
Using multi-attribute decision-making approaches in the selection of a hospital management system.
Arasteh, Mohammad Ali; Shamshirband, Shahaboddin; Yee, Por Lip
2018-01-01
Selecting the most appropriate organizational software is a persistent challenge for managers, especially IT directors. The term "enterprise software selection" refers to purchasing, creating, or commissioning software that, first, is best adapted to the requirements of the organization and, second, has a suitable price and technical support. Specifying selection criteria and ranking them is the primary prerequisite for this action. This article provides a method to evaluate, rank, and compare the available enterprise software in order to choose the most suitable one. The method consists of a three-stage process. First, it identifies the organizational requirements and assesses them. Second, it selects the best approach from among three possibilities: in-house production, buying software, and commissioning custom software. Third, it evaluates, compares, and ranks the alternative software. The third stage uses different methods of multi-attribute decision making (MADM) and compares the resulting rankings. Based on different characteristics of the problem, several methods were tested, namely the Analytic Hierarchy Process (AHP), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Elimination and Choice Expressing Reality (ELECTRE), and a simple weighting method. Finally, we propose the most practical method for similar problems.
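Of the MADM methods named here, TOPSIS is the most mechanical to implement: vector-normalize the decision matrix, weight it, locate the ideal and anti-ideal alternatives, and rank by relative closeness. The sketch below is a generic textbook TOPSIS, not the paper's own implementation; the example decision matrix, weights, and criterion directions are hypothetical.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS; returns closeness scores in [0, 1].

    matrix  : (alternatives x criteria) decision matrix
    weights : criterion weights (summing to 1)
    benefit : per-criterion flag, True = higher is better, False = cost
    """
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = (m / np.linalg.norm(m, axis=0)) * w
    # Ideal best/worst per criterion; direction depends on criterion type.
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    # Euclidean distances to the ideal and anti-ideal alternatives.
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    # Relative closeness to the ideal solution: higher is better.
    return d_worst / (d_best + d_worst)
```

For instance, three candidate packages scored on price (cost criterion), vendor support, and functional fit could be ranked with `topsis([[100, 9, 9], [400, 5, 5], [200, 7, 7]], [0.4, 0.3, 0.3], [False, True, True])`; the first alternative, which dominates the others, receives the highest closeness score.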
Ethical education in software engineering: responsibility in the production of complex systems.
Génova, Gonzalo; González, M Rosario; Fraga, Anabel
2007-12-01
Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate for a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays an adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
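The abstract describes the shape of these cost and quality models (error rate driven by size, criticality, environment, and team competence; IV&V labor share driven by criticality) without giving their equations. The sketch below is purely illustrative of models with that shape; every coefficient, factor table, and functional form is an invented assumption, not taken from the paper.

```python
def expected_defects(size_ksloc, criticality, env_maturity, team_experience):
    """Toy quality model: expected latent defects before IV&V.

    The 6 defects/KSLOC base rate and all multipliers are hypothetical;
    a real model would be calibrated from project history.
    """
    base_rate = 6.0  # defects per KSLOC (illustrative assumption)
    crit = {"low": 0.8, "medium": 1.0, "high": 1.3}[criticality]
    # Better environments and more experienced teams reduce the estimate.
    return size_ksloc * base_rate * crit / (env_maturity * team_experience)

def ivv_labor_percent(target_error_rate, criticality):
    """Toy cost model: IV&V labor share grows as the allowed delivered
    error rate shrinks and as criticality rises (hypothetical form,
    capped at 40% of project labor)."""
    crit = {"low": 1.0, "medium": 1.5, "high": 2.5}[criticality]
    return min(40.0, crit * 4.0 / target_error_rate)
```

The point of such models is the explicit trade-off the paper describes: tightening the target error rate for critical software directly raises the labor (and cost) the estimate allocates to verification.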
The dynamics of software development project management: An integrative systems dynamic perspective
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.; Abdel-Hamid, T.
1984-01-01
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.
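A system dynamics model of such a department reduces to stocks and flows: staff levels as stocks, hiring, assimilation, and turnover as flows. The following is a minimal, hypothetical sketch of a department-level workforce model in that spirit; the initial staffing, rates, and relative productivities are all invented for illustration.

```python
def simulate_department(months=24, hire_rate=2.0, turnover=0.02,
                        assimilation_months=6.0):
    """Minimal stock-and-flow sketch of a software department.

    Two stocks (rookies, veterans): rookies assimilate into veterans
    over assimilation_months on average; both stocks lose a turnover
    fraction each month. Returns monthly productive output.
    """
    rookies, veterans = 4.0, 16.0  # illustrative starting headcounts
    output = []
    for _ in range(months):
        assimilated = rookies / assimilation_months
        rookies += hire_rate - assimilated - turnover * rookies
        veterans += assimilated - turnover * veterans
        # Rookies are assumed to produce at half a veteran's rate.
        output.append(0.5 * rookies + 1.0 * veterans)
    return output
```

Even this toy model exhibits the kind of question listed above: raising the turnover fraction visibly depresses the department's long-run output, because veterans drain faster than assimilation replaces them.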
Virtual reality for intelligent and interactive operating, training, and visualization systems
NASA Astrophysics Data System (ADS)
Freund, Eckhard; Rossmann, Juergen; Schluse, Michael
2000-10-01
Virtual Reality methods allow a new and intuitive way of communication between man and machine. The basic idea of Virtual Reality (VR) is the generation of artificial, computer-simulated worlds which the user can not only look at but also actively interact with, using a data glove and a data helmet. The main emphasis for the use of such techniques at the IRF is the development of a new generation of operator interfaces for the control of robots and other automation components, and of intelligent training systems for complex tasks. The basic idea of the methods developed at the IRF for the realization of Projective Virtual Reality is to let the user work in the virtual world as he would act in reality. The user's actions are recognized by the Virtual Reality system and, by means of new and intelligent control software, projected onto the automation components, such as robots, which then perform the actions necessary to execute the user's task in reality. In this operation mode the user no longer has to be a robot expert to generate tasks for robots or to program them, because the intelligent control software recognizes the user's intention and automatically generates the commands for nearly every automation component. Virtual Reality methods are thus ideally suited as universal man-machine interfaces for the control and supervision of a broad class of automation components, and for interactive training and visualization systems. The Virtual Reality system of the IRF, COSIMIR/VR, forms the basis for several projects, starting with the control of space automation systems in the projects CIROS, VITAL and GETEX, the realization of a comprehensive development tool for the International Space Station, and, last but not least, the realistic simulation of fire extinguishing, forest machines, and excavators, which will be presented in the final paper in addition to the key ideas of this Virtual Reality system.
Developing sustainable software solutions for bioinformatics by the “Butterfly” paradigm
Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas
2014-01-01
Software design and sustainable software engineering are essential for the long-term development of bioinformatics software. Typical challenges in an academic environment are short-term contracts, island solutions, pragmatic approaches, and loose documentation. Upcoming new challenges are big data, complex data sets, software compatibility, and rapid changes in data representation. Our approach to coping with these challenges consists of iterative intertwined cycles of development (the “Butterfly” paradigm) for key steps in scientific software engineering. User feedback is valued, as is software planning in a sustainable and interoperable way. Tool usage should be easy and intuitive. A middleware layer supports a user-friendly Graphical User Interface (GUI) as well as database/tool development, each independently. We validated this approach in our own software development and compared the different design paradigms across various software solutions. PMID:25383181
Microcomputer software development facilities
NASA Technical Reports Server (NTRS)
Gorman, J. S.; Mathiasen, C.
1980-01-01
A more efficient and cost-effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is a requirement. The host computer is configured to operate in a time-share mode for multiple users. The remote terminals, printers, and downloading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to the development system or emulator in a test area where hardware/software integration functions can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, C.; Crook, J.
1998-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.
Four simple recommendations to encourage best practices in research software
Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965
A Legal Guide for the Software Developer.
ERIC Educational Resources Information Center
Minnesota Small Business Assistance Office, St. Paul.
This booklet has been prepared to familiarize the inventor, creator, or developer of a new computer software product or software invention with the basic legal issues involved in developing, protecting, and distributing the software in the United States. Basic types of software protection and related legal matters are discussed in detail,…
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
Current Practice in Software Development for Computational Neuroscience and How to Improve It
Gewaltig, Marc-Oliver; Cannon, Robert
2014-01-01
Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research. PMID:24465191
Using WYSIWYG GUI Tools with UML
2009-06-01
diagram that shows the ports through which it sends and receives messages to and from other active classes. The behavior is described using a state...software that talks to its environment through ports (specified in the structure diagram), and performs actions as it transitions through a sequence of...used to trigger actions on the state diagram. As a result, the stimulus to these events is provided by the system itself in the case of the timer port
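The fragment above describes the standard UML active-class pattern: behavior given as a state machine, stimuli arriving as messages on ports, with a timer port letting the system stimulate itself. A minimal sketch of that pattern follows; the class, ports, messages, and transition table are all hypothetical illustrations, not taken from the report.

```python
class ActiveClass:
    """Toy active class: a state machine driven by messages on ports.

    The "timer" port models the self-stimulus mentioned in the text,
    where the system itself provides the triggering event.
    """
    def __init__(self):
        self.state = "idle"
        # Transition table: (state, port, message) -> (action, next state)
        self.transitions = {
            ("idle", "control", "start"): ("starting", "running"),
            ("running", "timer", "timeout"): ("tick", "running"),
            ("running", "control", "stop"): ("stopping", "idle"),
        }
        self.actions = []  # record of actions performed, in order

    def receive(self, port, message):
        """Dispatch one incoming message; unknown events are ignored."""
        key = (self.state, port, message)
        if key in self.transitions:
            action, self.state = self.transitions[key]
            self.actions.append(action)  # perform the transition's action
```

Driving the instance with `start`, two `timeout` messages on the timer port, and `stop` walks it through the transitions and back to the idle state.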
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
Software technology insertion: A study of success factors
NASA Technical Reports Server (NTRS)
Lydon, Tom
1990-01-01
Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.
NASA Technical Reports Server (NTRS)
Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo
1992-01-01
A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software are described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.
ERIC Educational Resources Information Center
Ichu, Emmanuel A.
2010-01-01
Software quality is perhaps one of the most sought-after attributes in product development; however, this goal often goes unattained. Problem factors in software development, and how they affect the maintainability of delivered software systems, require thorough investigation. It was, therefore, very important to understand software…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younkin, James R.; Garner, James R.
2017-04-01
Over the last five years, OLEM has been a collaborative development effort involving the IAEA, LANL, ORNL, URENCO, and the NNSA. The collective team has completed the following: design and modelling, software development, hardware integration, testing with the ORNL UF6 Flow Loop, a field trial at the Urenco facility in Almelo, the Netherlands, and a demonstration at the Urenco USA facility in Eunice, New Mexico. This combined effort culminated in the deployment of several OLEM collection nodes in Iran. These OLEM units are one unattended monitoring system component of the Joint Comprehensive Plan of Action, allowing the International Atomic Energy Agency to verify Iran's compliance with the enrichment production aspects of the agreement.
Countermeasures for Time-Cheat Detection in Multiplayer Online Games
NASA Astrophysics Data System (ADS)
Ferretti, Stefano
Cheating is an important issue in games. Depending on the system over which the game is deployed, several types of malicious actions may be accomplished so as to take an unfair and unexpected advantage over the game and over the (digital, human) adversaries. When the game is a standalone application, cheats typically just relate to the specific software code developed to build the application. It is not a surprise to find (on the Web and in specialized magazines) people who explain cheats for specific games, stating, for instance, which configuration files can be altered (and how to do it) to automatically gain some bonus during the game. To prevent this, game developers are therefore motivated to build stable code, with related data that is securely managed and difficult to alter.
Automated Software Development Workstation (ASDW)
NASA Technical Reports Server (NTRS)
Fridge, Ernie
1990-01-01
Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.
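The parts-composition idea described above can be sketched in miniature: a catalog of parts with declared interfaces, and a composition step that checks interface compatibility before emitting a specification. The names, types, and the `|>` specification syntax here are hypothetical illustrations, not the actual ASDW parts description language.

```python
# Minimal, hypothetical sketch of a software parts composition system:
# parts declare typed interfaces; composition is allowed only when the
# producer's outputs satisfy the consumer's inputs.
from dataclasses import dataclass


@dataclass
class Part:
    name: str
    inputs: tuple   # interface types the part consumes
    outputs: tuple  # interface types the part produces


def compose(producer, consumer):
    """Return a stub 'specification' string if interfaces are compatible."""
    missing = set(consumer.inputs) - set(producer.outputs)
    if missing:
        raise TypeError(f"{consumer.name} inputs not satisfied by {producer.name}: {missing}")
    return f"{producer.name} |> {consumer.name}"


# A toy catalog of two compatible parts:
catalog = {
    "sensor_reader": Part("sensor_reader", (), ("telemetry",)),
    "filter": Part("filter", ("telemetry",), ("clean_telemetry",)),
}
print(compose(catalog["sensor_reader"], catalog["filter"]))  # -> sensor_reader |> filter
```

A real composition editor would, in addition, walk the catalog to suggest candidate parts and hand the checked specification to a code generator for the target language.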
Promoting Science Software Best Practices: A Scientist's Perspective (Invited)
NASA Astrophysics Data System (ADS)
Blanton, B. O.
2013-12-01
Software is at the core of most modern scientific activities, and as societal awareness of, and impacts from, extreme weather, disasters, and climate and global change continue to increase, the roles that scientific software plays in analyses and decision-making are brought more to the forefront. Reproducibility of research results (particularly those that enter into the decision-making arena) and open access to the software are essential for scientific and scientists' credibility. This has been highlighted in a recent article by Joppa et al. (Troubling Trends in Scientific Software Use, Science Magazine, May 2013) that describes reasons for particular software being chosen by scientists, including that the "developer is well-respected" and on "recommendation from a close colleague". This reliance on recommendation, Joppa et al. conclude, is fraught with risks to both science and scientists. Scientists must frequently take software for granted, assuming that it performs as expected and advertised and that the software itself has been validated and its results verified. This is largely due to the manner in which much software is written and developed: in an ad hoc manner, with an inconsistent funding stream, and with little application of core software engineering best practices. Insufficient documentation, limited test cases, and code unavailability are significant barriers to informed and intelligent science software usage. This situation is exacerbated when the scientist becomes the software developer out of necessity due to resource constraints. Adoption of, and adherence to, best practices in scientific software development will substantially increase intelligent software usage and promote a sustainable evolution of the science as encoded in the software. We describe a typical scientist's perspective on using and developing scientific software in the context of storm surge research and forecasting applications that have real-time objectives and regulatory constraints.
These include perspectives on what scientists/users of software can contribute back to the software development process, examples of successful scientist/developer interactions, and the tension between "getting it done" and "getting it done right".
Reuse at the Software Productivity Consortium
NASA Technical Reports Server (NTRS)
Weiss, David M.
1989-01-01
The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
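The core SME function of comparing an ongoing effort against a model of the "typical" project can be illustrated with a small sketch. The metric names, baseline values, and tolerance below are invented for illustration; they are not taken from the actual SME tool.

```python
# Hypothetical sketch of experience-based project assessment: flag metrics
# of an ongoing project that deviate substantially from a baseline model
# built from past development efforts in the same environment.

def assess(project, baseline, tolerance=0.25):
    """Return metrics deviating more than `tolerance` (fractional) from baseline."""
    findings = {}
    for metric, value in project.items():
        expected = baseline[metric]
        deviation = (value - expected) / expected
        if abs(deviation) > tolerance:
            findings[metric] = "high" if deviation > 0 else "low"
    return findings


# Invented baseline (the "typical" project) and current project status:
baseline = {"errors_per_kloc": 4.0, "effort_hours": 2000, "lines_changed": 500}
current = {"errors_per_kloc": 6.0, "effort_hours": 2100, "lines_changed": 300}
print(assess(current, baseline))
# -> {'errors_per_kloc': 'high', 'lines_changed': 'low'}
```

A tool like SME would additionally apply rules capturing managers' experience to turn such deviations into an interpretation (e.g., whether a high error rate signals unstable requirements or thorough testing).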
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants investigating the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. This study makes the following contributions: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical problem-solving skills in empirical software engineering, and (3) raising the need to study the human factors of software engineering from a multidisciplinary viewpoint.
Software Quality Assurance Metrics
NASA Technical Reports Server (NTRS)
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of the defects that have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see whether any could be implemented in their software assurance life cycle process.
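One of the simplest metrics of the kind surveyed above is defect density, i.e., confirmed defects per thousand lines of code (KLOC). The numbers below are invented purely for illustration.

```python
# Hypothetical example of a common SQA metric: defect density (defects
# per KLOC), often used to compare modules or track quality over releases.

def defect_density(defects, lines_of_code):
    """Return confirmed defects per thousand lines of code (KLOC)."""
    return defects / (lines_of_code / 1000.0)


# A module with 7 confirmed defects in 3,500 lines of code:
print(defect_density(7, 3500))  # -> 2.0
```

In practice such a number is only meaningful against a baseline: a module whose density is far above the project norm is a candidate for inspection or rework.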
Stimfit: quantifying electrophysiological data with Python
Guzman, Segundo J.; Schlögl, Alois; Schmidt-Hieber, Christoph
2013-01-01
Intracellular electrophysiological recordings provide crucial insights into elementary neuronal signals such as action potentials and synaptic currents. Analyzing and interpreting these signals is essential for a quantitative understanding of neuronal information processing, and requires both fast data visualization and ready access to complex analysis routines. To achieve this goal, we have developed Stimfit, a free software package for cellular neurophysiology with a Python scripting interface and a built-in Python shell. The program supports most standard file formats for cellular neurophysiology and other biomedical signals through the Biosig library. To quantify and interpret the activity of single neurons and communication between neurons, the program includes algorithms to characterize the kinetics of presynaptic action potentials and postsynaptic currents, estimate latencies between pre- and postsynaptic events, and detect spontaneously occurring events. We validate and benchmark these algorithms, give estimation errors, and provide sample use cases, showing that Stimfit represents an efficient, accessible and extensible way to accurately analyze and interpret neuronal signals. PMID:24600389
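The kind of event-detection analysis that Stimfit automates can be illustrated with a minimal sketch: finding upward threshold crossings in a voltage trace, with a refractory window so each event is counted once. This is not Stimfit's actual algorithm or Python API, just an illustration of the analysis category; the values are invented.

```python
# Hypothetical sketch: detect action-potential-like events as upward
# threshold crossings, skipping a refractory window after each detection.

def detect_events(trace, threshold, refractory=5):
    """Return sample indices where `trace` crosses `threshold` upward,
    ignoring `refractory` samples after each detection."""
    events = []
    i = 1
    while i < len(trace):
        if trace[i - 1] < threshold <= trace[i]:
            events.append(i)
            i += refractory  # skip samples within the refractory window
        else:
            i += 1
    return events


# A toy trace (in "mV") with two clear depolarizations above -20 mV:
trace = [-70, -65, -30, 10, -40, -70, -68, -25, 15, -60, -70]
print(detect_events(trace, threshold=-20))  # -> [3, 8]
```

Real tools refine this with template matching or deconvolution to separate overlapping events from noise, which is where a scriptable package with built-in, validated algorithms pays off.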
Evaluation of Cable Harness Post-Installation Testing. Part B
NASA Technical Reports Server (NTRS)
King, M. S.; Iannello, C. J.
2011-01-01
The Cable Harness Post-Installation Testing Report was written in response to an action issued by the Ares Project Control Board (PCB). The action, assigned to the Ares I Avionics & Software Chief Engineer and the Avionics Integration and Vehicle Systems Test Work Breakdown Structure (WBS) Manager in the Vehicle Integration Office, was to develop a set of guidelines for electrical cable harnesses. Research showed that post-installation tests have been performed since the Apollo era. For Ares I-X, the requirement for post-installation testing was removed to make it consistent with the avionics processes used on the Atlas V expendable launch vehicle. Further research for the report involved surveying government and private-sector launch vehicle developers, military and commercial aircraft programs, spacecraft developers, and harness vendors. Responses indicated that crewed launch vehicles and military aircraft undergo post-installation tests. Key findings of the report were as follows: test requirements identify damage; human-rated vehicles should be tested even though statistically few failures have been identified; data do not support the claim that post-installation testing damages the harness insulation system; and proper planning can reduce the overhead associated with testing. The primary recommendation of the report is for the Ares projects to retain the practice of post-fabrication and post-installation cable harness testing.
Cyberwar XXI: quantifying the unquantifiable: adaptive AI for next-generation conflict simulations
NASA Astrophysics Data System (ADS)
Miranda, Joseph; von Kleinsmid, Peter; Zalewski, Tony
2004-08-01
The era of the "Revolution in Military Affairs," "4th Generation Warfare," and "Asymmetric War" requires novel approaches to modeling warfare at the operational and strategic levels of modern conflict. For example: "What if, in response to our planned actions, the adversary reacts in such-and-such a manner? What will our response be? What are the possible unintended consequences?" Next-generation conflict simulation tools are required to help create and test novel courses of action (COAs) in support of real-world operations. Conflict simulations allow non-lethal and cost-effective exploration of the "what-ifs" of COA development. The challenge has been to develop an automated decision-support software tool that allows competing COAs to be compared in simulated dynamic environments. Principal Investigator Joseph Miranda's research is based on modeling an integrated political, military, economic, social, infrastructure, and information (PMESII) environment. The main effort was to develop an adaptive AI engine that models agents operating within an operational-strategic conflict environment. This was implemented in Cyberwar XXI, a simulation that models COA selection in a PMESII environment. Within this framework, agents simulate decision-making processes and provide predictive capability regarding the potential behavior of command entities. The 2003 Iraq conflict is the first scenario ready for V&V testing.
Software Configuration Management Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. 
The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems require larger and more complex software for support, and as this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.
Impact of Agile Software Development Model on Software Maintainability
ERIC Educational Resources Information Center
Gawali, Ajay R.
2012-01-01
Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…
ERIC Educational Resources Information Center
Boyd, David W.
1993-01-01
Asserts that a new generation of software authoring applications has led to improvements in the development of economics education software. Describes new software development applications and discusses how to use them. Concludes that object-oriented programming helps economists develop their own courseware. (CFR)
ERIC Educational Resources Information Center
Kramer, Aleksey
2013-01-01
The topic of software security has become paramount in information technology (IT) related scholarly research. Researchers have addressed numerous software security topics touching on all phases of the Software Development Life Cycle (SDLC): requirements gathering phase, design phase, development phase, testing phase, and maintenance phase.…
Application of industry-standard guidelines for the validation of avionics software
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Shagnea, Anita M.
1990-01-01
The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
NASA Technical Reports Server (NTRS)
1982-01-01
An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.
Final Documentation: Incident Management And Probabilities Courses of action Tool (IMPACT).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, Donna M.; Ray, Jaideep; Tucker, Mark D.
This report pulls together the documentation produced for the IMPACT tool, a software-based decision support tool that provides situational awareness, incident characterization, and guidance on public health and environmental response strategies for an unfolding bio-terrorism incident.
NASA Astrophysics Data System (ADS)
Venkateswarlu, P.; Reddy, M. A.; Prasad, A. T.
2003-12-01
Application of Remote Sensing and Geographic Information Systems for the development of a land and water resources action plan at the micro level, for appropriate management of the land and water resources of a watershed in the rain-fed region of Prakasam District in Andhra Pradesh, India, forms the focal theme of this paper. The quantitative description of drainage basin geometry can be effectively determined using Remote Sensing and GIS techniques. Each of the sixty-two sub-watersheds of the study area has been studied in terms of the morphometric parameters (stream length, bifurcation ratio, length ratio, drainage density, stream frequency, texture ratio, form factor, area, perimeter, circularity ratio, and elongation ratio), and all sub-watersheds under study have been prioritized. The morphometry-based prioritization of the sub-watersheds was compared with sediment-yield-based prioritization and found to be nearly the same for the study area. The information obtained from the thematic maps is integrated, and action plans are suggested for land and water resources development on a sustainable basis. Landuse/Landcover, Hydrogeomorphology, and Soil thematic maps were generated; in addition, Slope and Drainage maps were prepared from Survey of India toposheets. Based on the computerized database created using ARC/INFO software, information derived in terms of natural resources and their spatial distribution was integrated with socio-economic data to formulate an action plan, which includes suggestions for alternative Landuse/Landcover practices. Such a plan is useful for natural resources management and for improving the socio-economic status of the rural population on a sustainable basis. Keywords: Natural Resources, Remote Sensing, Morphometry, Sustainable Development.
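Several of the morphometric parameters listed above are simple ratios, which can be sketched directly from their standard definitions. The input values below are invented for illustration and are not data from the study.

```python
# Sketch of a few standard morphometric parameters (standard definitions;
# input values are hypothetical, not from the Prakasam District study).
import math


def drainage_density(total_stream_length_km, basin_area_km2):
    """Dd = total stream length / basin area (km per km^2)."""
    return total_stream_length_km / basin_area_km2


def bifurcation_ratio(n_order_u, n_order_u_plus_1):
    """Rb = streams of order u / streams of order u + 1."""
    return n_order_u / n_order_u_plus_1


def circularity_ratio(basin_area_km2, perimeter_km):
    """Rc = 4 * pi * A / P^2 (dimensionless)."""
    return 4 * math.pi * basin_area_km2 / perimeter_km ** 2


def elongation_ratio(basin_area_km2, basin_length_km):
    """Re = diameter of the circle with the basin's area / basin length."""
    return (2.0 / basin_length_km) * math.sqrt(basin_area_km2 / math.pi)


print(round(drainage_density(42.0, 18.0), 2))   # -> 2.33
print(round(bifurcation_ratio(34, 9), 2))       # -> 3.78
print(round(circularity_ratio(18.0, 21.0), 2))  # -> 0.51
```

Computed per sub-watershed over a GIS-derived stream network, such parameters can then be ranked to produce the kind of prioritization described above.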
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R.; Lemieux-Charles, Louise
2015-01-01
Research problem: Tutorials and user manuals are important forms of impersonal support for using software applications, including electronic medical records (EMRs). Differences between user and vendor documentation may indicate support needs that are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. Research question: What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Literature review: Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research has compared these elements between formal and informal documents. Methodology: We conducted a mixed methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. Results and discussion: The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural rather than declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements.
These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design. PMID:26190888
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected costs, and actual development time that exceeds expected time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Blank, Antje; Prytherch, Helen; Kaltschmidt, Jens; Krings, Andreas; Sukums, Felix; Mensah, Nathan; Zakane, Alphonse; Loukanova, Svetla; Gustafsson, Lars L; Sauerborn, Rainer; Haefeli, Walter E
2013-04-10
Despite strong efforts to improve maternal care, its quality remains deficient in many countries of Sub-Saharan Africa, as persistently high maternal mortality rates testify. The QUALMAT study seeks to improve the performance and motivation of rural health workers and, ultimately, the quality of primary maternal health care services in three African countries: Burkina Faso, Ghana, and Tanzania. One major intervention is the introduction of a computerized Clinical Decision Support System (CDSS) for rural primary health care centers, to be used by health care workers of different educational levels. A stand-alone, Java-based software application, able to run on any standard hardware, was developed based on an assessment of the health care situation in the countries involved. The software scope was defined, and the final software was programmed taking test experiences into account. Knowledge for the decision support was derived from the World Health Organization (WHO) guideline "Pregnancy, Childbirth, Postpartum and Newborn Care: A Guide for Essential Practice". The QUALMAT CDSS provides computerized guidance and clinical decision support for antenatal care, and for care during delivery and up to 24 hours post delivery. The decision support is based on WHO guidelines and is designed around three principles: (1) guidance through routine actions in maternal and perinatal care, (2) integration of clinical data to detect situations of concern by algorithms, and (3) electronic tracking of peri- and postnatal activities. In addition, the tool facilitates patient management and serves as a source of training material. The implementation of the software, which is embedded in a set of interventions comprising the QUALMAT study, is the subject of various research projects assessing and quantifying the impact of the CDSS on quality of care, the motivation of health care staff (users), and its health economic aspects.
The software will also be assessed for its usability and acceptance, as well as for its influence on workflows in the rural setting of primary health care in the three countries involved. The development and implementation of a CDSS in rural primary health care centres presents challenges, which may be overcome with careful planning and involvement of future users at an early stage. A tailored software with stable functionality should offer perspectives to improve maternal care in resource-poor settings.
Third-Party Software's Trust Quagmire.
Voas, J; Hurlburt, G
2015-12-01
Current software development has trended toward integrating independent software sub-functions to create more complete software systems. These sub-functions are often not homegrown; instead, they are developed by unknown third-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions raise legitimate concerns about quality, origin, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.
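One basic mitigation for the provenance and integrity concerns raised above is to verify acquired components against pinned cryptographic digests before integration. The sketch below uses Python's standard `hashlib`; the component file name and the pinned entry are hypothetical, not from the article:

```python
import hashlib

# Pinned digests for acquired third-party components (hypothetical entry;
# the value shown happens to be the SHA-256 of an empty file)
PINNED = {
    "vendor_math_lib-1.4.2.zip":
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def verify_component(path, expected_sha256):
    """Recompute the artifact's SHA-256 in chunks and compare to the pinned digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha256
```

A rejected digest does not tell you what changed, only that the artifact is not the one you vetted; real supply-chain controls layer signatures and provenance metadata on top of this check.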
Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. 
PMID:24688866
Whole earth modeling: developing and disseminating scientific software for computational geophysics.
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address this need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.
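One concrete mechanism for the credit problem described above (not mentioned in the abstract itself) is shipping a machine-readable citation file with the source, such as the Citation File Format (`CITATION.cff`), which repository hosts like GitHub and archives like Zenodo recognize. All field values below are hypothetical:

```yaml
cff-version: 1.2.0
message: "If you use this software, please cite it as below."
title: "Example Geodynamics Solver"   # hypothetical project
version: "2.1.0"
date-released: "2016-11-01"
doi: "10.5281/zenodo.0000000"         # hypothetical DOI
authors:
  - family-names: "Doe"
    given-names: "Jane"
    affiliation: "Example University"
```

Because the file lives in the repository and under version control, the credited author list can evolve with the code itself, which is exactly the integration with development workflow that the abstract calls for.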
Software Quality Perceptions of Stakeholders Involved in the Software Development Process
ERIC Educational Resources Information Center
Padmanabhan, Priya
2013-01-01
Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and on which of its attributes are more important than others. Although software quality…
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as the modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
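The flavor of such a system-dynamics life-cycle model can be caricatured in a few lines: work flows from a "remaining" stock to a "done" stock, and a fixed fraction of each week's output is later found defective and flows back into the backlog. This is a toy sketch under assumed parameters, not SLICS itself:

```python
def simulate_lifecycle(total_tasks=100.0, productivity=4.0,
                       error_fraction=0.2, weeks=200):
    """Tiny system-dynamics sketch: a rework loop stretches the schedule.

    Each week up to `productivity` tasks are developed; `error_fraction`
    of them is defective and returns to the remaining-work stock.
    Returns (weeks_elapsed, tasks_completed)."""
    remaining, done = total_tasks, 0.0
    for week in range(1, weeks + 1):
        developed = min(productivity, remaining)
        defective = error_fraction * developed   # flows back into the backlog
        remaining -= developed - defective
        done += developed - defective
        if remaining < 1e-9:
            return week, done
    return weeks, done
```

Even this caricature shows why schedule estimates that ignore the rework loop are optimistic: net progress per week is `productivity * (1 - error_fraction)`, and the tail of the project decays geometrically rather than finishing on a fixed date.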
A high order approach to flight software development and testing
NASA Technical Reports Server (NTRS)
Steinbacher, J.
1981-01-01
The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested on four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGMs); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the underlying causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. They exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Our analysis thus provides critical insights and inputs to management for improving the software development process. As NASA has moved toward product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered a baseline for future flight software missions. PMID:29278255
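The delayed S-shaped NHPP model mentioned as a best fit has the mean value function m(t) = a(1 − (1 + bt)e^(−bt)), where a is the expected total defect count and b a shape parameter. A minimal illustration of fitting it to cumulative defect counts follows; it uses synthetic data and a coarse grid search, not the authors' actual estimation procedure:

```python
import math

def s_shaped_mean(t, a, b):
    """Delayed S-shaped NHPP mean value function: expected cumulative defects by time t."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

def fit_s_shaped(times, counts, a_grid, b_grid):
    """Least-squares fit of (a, b) by exhaustive grid search over candidate values."""
    best_sse, best_a, best_b = float("inf"), None, None
    for a in a_grid:
        for b in b_grid:
            sse = sum((s_shaped_mean(t, a, b) - c) ** 2
                      for t, c in zip(times, counts))
            if sse < best_sse:
                best_sse, best_a, best_b = sse, a, b
    return best_a, best_b

# Synthetic defect data generated from known parameters (a=55 total defects, b=0.3)
times = [1, 2, 4, 6, 8, 10, 12]
counts = [s_shaped_mean(t, 55.0, 0.3) for t in times]
a_fit, b_fit = fit_s_shaped(times, counts,
                            a_grid=[50.0 + i for i in range(21)],
                            b_grid=[0.1 + 0.02 * i for i in range(26)])
```

In practice one would use maximum-likelihood estimation or a proper nonlinear least-squares routine; the grid search is only to keep the sketch dependency-free.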
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes Used in... revised regulatory guide (RG), revision 1 of RG 1.173, ``Developing Software Life Cycle Processes for... Developing a Software Project Life Cycle Process,'' issued 2006, with the clarifications and exceptions as...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... software developers can provide input on these technical specifications for the Common Formats Version 1.1... specifications, which provide direction to software developers that plan to implement the Common Formats...
IT Software Development and IT Operations Strategic Alignment: An Agile DevOps Model
ERIC Educational Resources Information Center
Hart, Michael
2017-01-01
Information Technology (IT) departments that include development and operations are essential to develop software that meet customer needs. DevOps is a term originally constructed from software development and IT operations. DevOps includes the collaboration of all stakeholders such as software engineers and systems administrators involved in the…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... designed as an interactive forum where PSOs and software developers can provide input on these technical... updated event descriptions, forms, and technical specifications for software developers. As an update to...
NASA Astrophysics Data System (ADS)
Pajewski, Lara; Benedetto, Andrea; Loizos, Andreas; Slob, Evert; Tosti, Fabio
2013-04-01
Ground Penetrating Radar (GPR) is a safe, non-destructive and non-invasive imaging technique that can be effectively used for advanced inspection of composite structures and for diagnostics affecting the whole life-cycle of civil engineering works. GPR provides high resolution images of structures and subsurface through wide-band electromagnetic waves. It can be employed for the surveying of roads, pavements, bridges, tunnels, for detecting underground cavities and voids, for utility sensing, for the inspection of buildings, reinforced concrete and pre-cast concrete structures, for geotechnical investigation, in foundation design, as well as for several other purposes. Penetration and resolution of GPR depend primarily on the transmitting frequency of the equipment, the antenna characteristics, the electrical properties of the ground or of the surveyed material, and the contrasting electrical properties of the targets with respect to the surrounding medium. Generally there is a direct relationship between the transmitter frequency and the resolution that can be obtained; conversely there is an inverse relationship between frequency and penetration depth. GPR works best in dry ground environments, but can also give good results in wet, saturated materials; it does not work well in saline conditions, in high-conductivity media and through dense clays which limit signal penetration. Different approaches can be employed in the processing of collected GPR data. Once data have been processed, they still have to be analysed. This is a challenging problem, since interpretation of GPR radargrams is typically non-intuitive and considerable expertise is needed. In the presence of a complex scenario, an accurate electromagnetic forward solver is a fundamental tool for the validation of data interpretation. It can be employed for the characterization of scenarios, as a preliminary step that precedes a survey, or to gain a posteriori a better understanding of measured data. 
It can be used by GPR operators to identify the signatures generated by uncommon targets or by composite structures. Repeated evaluations of the electromagnetic field scattered by known targets can be performed by a forward solver, in order to estimate - through comparison with measured data - the physics and geometry of the region investigated by the GPR. Three main areas can be identified in the GPR field that have to be addressed in order to promote the use of this technology in civil engineering: a) increasing system sensitivity, to enable usability in a wider range of conditions; b) researching novel data-processing algorithms and analysis tools for the interpretation of GPR results; c) contributing to the development of new standards and guidelines and to the training of end users, which will also help to increase the awareness of operators. In this framework, the COST Action TU1208 "Civil Engineering Applications of Ground Penetrating Radar", proposed by Lara Pajewski, "Roma Tre" University, Rome, Italy, was approved in November 2012 and is going to start in April 2013. It is an ambitious 4-year project already involving 17 European countries (AT, BE, CH, CZ, DE, EL, ES, FI, FR, HR, IT, NL, NO, PL, PT, TR, UK), as well as Australia and the U.S.A. The project will be developed within the frame of a unique approach based on the integrated contribution of university researchers, software developers, geophysics experts, Non-Destructive Testing equipment designers and producers, and end users from private companies and public agencies. The main objective of the COST Action TU1208 is to exchange and increase scientific-technical knowledge and experience of GPR techniques in civil engineering, whilst promoting the effective use of this safe and non-destructive technique in the monitoring of systems. 
In this interdisciplinary Action, advantages and limitations of GPR will be highlighted, leading to the identification of gaps in knowledge and technology. Protocols and guidelines for European Standards will be developed, for an effective application of GPR in civil engineering. A novel GPR will be designed and realized: a multi-static system, with dedicated software and calibration procedures, able to construct real-time lane three-dimensional high resolution images of investigated areas. Advanced electromagnetic-scattering and data-processing techniques will be developed. The understanding of relationships between geophysical parameters and civil-engineering needs will be improved. Freeware software will be released, for inspection and monitoring of structures and infrastructures, buried-object localization, shape reconstruction and estimation of useful parameters. A high level training program will be organized. Mobility of early career researchers will be encouraged. The scientific work-plan of the Action is open, to ensure that experts all over the world, who did not participate in the preparation of the proposal but are interested in the project, may join the Action and participate in its activities. More information about the project can be found at http://www.cost.eu/domains_actions/tud/Actions/TU1208.
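The frequency trade-off described above can be made concrete. In a low-loss medium the pulse wavelength is λ = c / (f √εr), and one common rule of thumb places the vertical resolution limit near λ/4; actual resolution also depends on pulse bandwidth and material losses, so treat this as a back-of-the-envelope sketch:

```python
def wavelength_in_medium(freq_hz, rel_permittivity):
    """Wavelength of a GPR pulse in a low-loss medium: lambda = c / (f * sqrt(eps_r))."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    return c / (freq_hz * rel_permittivity ** 0.5)

def vertical_resolution(freq_hz, rel_permittivity):
    """Rule-of-thumb vertical resolution: targets closer than ~lambda/4 merge."""
    return wavelength_in_medium(freq_hz, rel_permittivity) / 4.0

# Example: a 1 GHz antenna over dry sand (eps_r ~ 4) resolves layers ~3.7 cm apart
res_1ghz = vertical_resolution(1e9, 4.0)
```

Doubling the frequency halves both the wavelength and the resolvable layer spacing, which is the direct frequency-resolution relationship the abstract describes; the price, as it also notes, is reduced penetration depth.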
Hou, Jennifer H.; Kralj, Joel M.; Douglass, Adam D.; Engert, Florian; Cohen, Adam E.
2014-01-01
The cardiac action potential (AP) and the consequent cytosolic Ca2+ transient are key indicators of cardiac function. Natural developmental processes, as well as many drugs and pathologies change the waveform, propagation, or variability (between cells or over time) of these parameters. Here we apply a genetically encoded dual-function calcium and voltage reporter (CaViar) to study the development of the zebrafish heart in vivo between 1.5 and 4 days post fertilization (dpf). We developed a high-sensitivity spinning disk confocal microscope and associated software for simultaneous three-dimensional optical mapping of voltage and calcium. We produced a transgenic zebrafish line expressing CaViar under control of the heart-specific cmlc2 promoter, and applied ion channel blockers at a series of developmental stages to map the maturation of the action potential in vivo. Early in development, the AP initiated via a calcium current through L-type calcium channels. Between 90 and 102 h post fertilization (hpf), the ventricular AP switched to a sodium-driven upswing, while the atrial AP remained calcium driven. In the adult zebrafish heart, a sodium current drives the AP in both the atrium and ventricle. Simultaneous voltage and calcium imaging with genetically encoded reporters provides a new approach for monitoring cardiac development, and the effects of drugs on cardiac function. PMID:25309445
Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges
NASA Astrophysics Data System (ADS)
Maruping, Likoebe M.
Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
NASA Technical Reports Server (NTRS)
Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen
2015-01-01
The engineering development of the new Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these spacecraft systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal missions and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in specialized Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through Flight Software certification are an important focus of this development effort to further ensure reliable detection of and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. NASA formed a dedicated M&FM team to address fault management early in the development lifecycle of the SLS initiative. 
As part of the development of the M&FM capabilities, this team has developed a dedicated testbed that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. 
VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW processor scheduling constraints due to their target platform - ARINC 653 partitioned OS, resource limitations, and other factors related to integration with other subsystems not directly involved with M&FM such as telemetry packing and processing. The baseline plan for use of VMET encompasses testing the original M&FM algorithms coded in the same C++ language and state machine architectural concepts as that used by Flight Software. This enables the development of performance standards and test cases to characterize the M&FM algorithms and sets a benchmark from which to measure the effectiveness of M&FM algorithms performance in the FSW development and test processes.
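The detect-then-respond split at the heart of such fault management algorithms can be illustrated with a toy persistence-filter state machine: a limit violation must persist for several consecutive samples before a latched safing response is commanded, which suppresses transient sensor noise. This is a Python sketch with hypothetical names; the actual SLS M&FM algorithms are C++ and far more involved:

```python
from enum import Enum

class EngineState(Enum):
    NOMINAL = "nominal"
    FAULT_SUSPECTED = "fault_suspected"
    SAFING = "safing"

class FaultMonitor:
    """Toy persistence-counter monitor: `persist` consecutive limit
    violations are required before the latched SAFING response fires."""
    def __init__(self, limit, persist=3):
        self.limit = limit
        self.persist = persist
        self.violations = 0
        self.state = EngineState.NOMINAL

    def sample(self, value):
        if value > self.limit:
            self.violations += 1
        else:
            self.violations = 0
            if self.state is EngineState.FAULT_SUSPECTED:
                self.state = EngineState.NOMINAL   # transient cleared
        if self.violations >= self.persist:
            self.state = EngineState.SAFING        # latched: never auto-reverts
        elif self.violations > 0 and self.state is EngineState.NOMINAL:
            self.state = EngineState.FAULT_SUSPECTED
        return self.state
```

Latching the SAFING state mirrors the concern in the abstract that responses must not themselves create undesired states: once a safing action is commanded, a momentarily healthy-looking sensor reading should not silently cancel it.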
Deindividuation and Internet software piracy.
Hinduja, Sameer
2008-08-01
Computer crime has increased exponentially in recent years as hardware, software, and network resources become more affordable and available to individuals from all walks of life. Software piracy is one prevalent type of cybercrime and has detrimentally affected the economic health of the software industry. Moreover, piracy arguably represents a tear in the moral fabric associated with respect for intellectual property and reduces the financial incentive for product creation and innovation. Deindividuation theory, originating from the field of social psychology, argues that individuals are extricated from responsibility for their actions simply because they no longer have an acute awareness of the identity of self and of others. That is, external and internal constraints that would typically regulate questionable behavior are rendered less effective via certain anonymizing and disinhibiting conditions of the social and environmental context. This exploratory piece seeks to establish the role of deindividuation in liberating individuals to commit software piracy by testing the hypothesis that persons who prefer the anonymity and pseudonymity associated with interaction on the Internet are more likely to pirate software. Through this research, it is hoped that the empirical identification of such a social psychological determinant will help further illuminate the phenomenon.
University Approaches to Software Copyright and Licensure Policies.
ERIC Educational Resources Information Center
Hawkins, Brian L.
Issues of copyright policy and software licensure at Drexel University that were developed during the introduction of a new microcomputing program are discussed. Channels for software distribution include: individual purchase of externally-produced software, distribution of internally-developed software, institutional licensure, and "read…
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, Charlie; Crook, Jerry
1997-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy-to-use web-based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations and continues to be used. The knowledge gained from the collaboration and hosting of this repository in conjunction with PNNL software development and systems engineering capabilities were used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI.
The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams and revised versions distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.
As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...
Precise Documentation: The Key to Better Software
NASA Astrophysics Data System (ADS)
Parnas, David Lorge
The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering, much of the documentation is written before and during the development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.
Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Mittman, David S.; Shams, Khawaja, S.; Bachmann, Andrew G.; Ludowise, Melissa
2013-01-01
This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically includes the source code repositories and other vital information and settings. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error-prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.
Shuttle avionics software trials, tribulations and success
NASA Technical Reports Server (NTRS)
Henderson, O. L.
1985-01-01
The early problems and the solutions developed to provide the required quality software needed to support the space shuttle engine development program are described. The decision to use a programmable digital control system on the space shuttle engine was primarily based upon the need for a flexible control system capable of supporting the total engine mission on a large complex pump fed engine. The mission definition included all control phases from ground checkout through post shutdown propellant dumping. The flexibility of the controller through reprogrammable software allowed the system to respond to the technical challenges and innovation required to develop both the engine and controller hardware. This same flexibility, however, placed a severe strain on the capability of the software development and verification organization. The overall development program required that the software facility accommodate significant growth in both the software requirements and the number of software packages delivered. This challenge was met by reorganization and evolution in the process of developing and verifying software.
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model is described which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
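The feedback flavor of such a system-dynamics model can be suggested with a deliberately tiny discrete-time loop; the structure and coefficients below are invented for illustration and bear no resemblance to SEPS's actual equations:

```python
def simulate_project(size_tasks, staff, productivity=1.0, steps=100):
    """Toy feedback loop in the spirit of system dynamics:
    schedule pressure increases error generation, and the
    resulting rework feeds back into the remaining workload.

    All coefficients here are invented for illustration; the
    real SEPS model is far richer.
    """
    remaining, t = float(size_tasks), 0
    while remaining > 0 and t < steps:
        done = staff * productivity
        pressure = min(remaining / size_tasks, 1.0)
        rework = 0.2 * done * pressure   # errors create rework
        remaining -= done - rework       # net progress this step
        t += 1
    return t  # elapsed time steps to completion
```

Even this caricature exhibits the qualitative behavior such models are built to explore: adding staff shortens the schedule less than linearly, because the rework feedback absorbs part of the added capacity.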
Knowledge focus via software agents
NASA Astrophysics Data System (ADS)
Henager, Donald E.
2001-09-01
The essence of military Command and Control (C2) is making knowledge-intensive decisions in a limited amount of time using uncertain, incorrect, or outdated information. It is essential to provide tools to decision-makers that provide: * Management of friendly forces by treating the "friendly resources as a system". * Rapid assessment of effects of military actions against the "enemy as a system". * Assessment of how an enemy should, can, and could react to friendly military activities. Software agents in the form of mission agents, target agents, maintenance agents, and logistics agents can meet this information challenge. The role of each agent is to know all the details about its assigned mission, target, maintenance, or logistics entity. The Mission Agent would fight for mission resources based on the mission priority and analyze the effect that a proposed mission's results would have on the enemy. The Target Agent (TA) communicates with other targets to determine its role in the system of targets. A system of TAs would be able to inform a planner or analyst of the status of a system of targets, the effect of that status, and the effect of attacks on that system. The system of TAs would also be able to analyze possible enemy reactions to attack by determining ways to minimize the effect of attack, such as rerouting traffic or using deception. The Maintenance Agent would schedule maintenance events and notify the maintenance unit. The Logistics Agent would manage shipment and delivery of supplies to maintain appropriate levels of weapons, fuel, and spare parts. The central idea underlying this use of software agents is knowledge focus. Software agents are created automatically to focus their attention on individual real-world entities (e.g., missions, targets) and view the world from that entity's perspective. The agent autonomously monitors the entity, identifies problems/opportunities, formulates solutions, and informs the decision-maker.
The agent must be able to communicate to receive and disseminate information and provide the decision-maker with assistance via focused knowledge. The agent must also be able to monitor the state of its own environment and make decisions necessary to carry out its delegated tasks. Agents bring three elements to the C2 domain that offer to improve decision-making. First, they provide higher-quality feedback and provide it more often. In doing so, the feedback loop becomes nearly continuous, reducing or eliminating delays in situation updates to decision-makers. Working with the most current information possible improves the control process, thus enabling effects based operations. Second, the agents accept delegation of actions and perform those actions following an established process. Agents' consistent actions reduce the variability of human input and stabilize the control process. Third, through the delegation of actions, agents ensure 100 percent consideration of plan details.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-29
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-724] Investigations: Terminations, Modifications and Rulings: Certain Electronic Devices With Image Processing Systems, Components Thereof, and Associated Software AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby...
PREP: Portal for Readiness Exercises & Planning v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noel, Todd; Le, Tam; McNeil, Carrie
2016-10-28
The software includes a web-based template for recording actions taken during emergency preparedness exercises and planning workshops. In addition, a virtual outbreak prevention simulation exercise is also included. Both tools interact with a server which records user decisions and communications.
Quantitative Measures for Software Independent Verification and Validation
NASA Technical Reports Server (NTRS)
Lee, Alice
1996-01-01
As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. These experts' project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effects relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take, for each complexity domain. Shuttle software verifiers can now monitor the changes in the software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.
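For a hint of what a code-complexity attribute looks like in practice, the sketch below approximates McCabe's cyclomatic complexity for Python source by counting branching constructs; the study itself worked on HAL/S flight code with a richer set of orthogonal metrics:

```python
import ast

def cyclomatic_complexity(source):
    """Approximate McCabe cyclomatic complexity of Python source:
    1 plus the number of branching constructs.

    A simple stand-in for the complexity attributes the abstract
    discusses; real metric suites are considerably more detailed.
    """
    tree = ast.parse(source)
    branches = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)
    return 1 + sum(isinstance(node, branches)
                   for node in ast.walk(tree))
```

Tracking such a metric over successive revisions is the kind of signal that can then be correlated with observed fault density.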
Insights into software development in Japan
NASA Technical Reports Server (NTRS)
Duvall, Lorraine M.
1992-01-01
The interdependence of the U.S.-Japanese economies makes it imperative that we in the United States understand how business and technology developments take place in Japan. We can gain insight into these developments in software engineering by studying the context in which Japanese software is developed, the practices that are used, the problems encountered, the setting surrounding these problems, and the resolution of these problems. Context includes the technological and sociological characteristics of the software development environment, the software processes applied, personnel involved in the development process, and the corporate and social culture surrounding the development. Presented in this paper is a summary of results of a study that addresses these issues. Data for this study was collected during a three month visit to Japan where the author interviewed 20 software managers representing nine companies involved in developing software in Japan. These data are compared to similar data from the United States in which 12 managers from five companies were interviewed.
A strategy for electronic dissemination of NASA Langley technical publications
NASA Technical Reports Server (NTRS)
Roper, Donna G.; Mccaskill, Mary K.; Holland, Scott D.; Walsh, Joanne L.; Nelson, Michael L.; Adkins, Susan L.; Ambur, Manjula Y.; Campbell, Bryan A.
1994-01-01
To demonstrate NASA Langley Research Center's relevance and to transfer technology to external customers in a timely and efficient manner, Langley has formed a working group to study and recommend a course of action for the electronic dissemination of technical reports (EDTR). The working group identified electronic report requirements (e.g., accessibility, file format, search requirements) of customers in U.S. industry through numerous site visits and personal contacts. Internal surveys were also used to determine commonalities in document preparation methods. From these surveys, a set of requirements for an electronic dissemination system was developed. Two candidate systems were identified and evaluated against the set of requirements: the Full-Text Electronic Documents System (FEDS), which is a full-text retrieval system based on the commercial document management package Interleaf, and the Langley Technical Report Server (LTRS), which is a Langley-developed system based on the publicly available World Wide Web (WWW) software system. Factors that led to the selection of LTRS as the vehicle for electronic dissemination included searching and viewing capability, current system operability, and client software availability for multiple platforms at no cost to industry. This report includes the survey results, evaluations, a description of the LTRS architecture, recommended policy statement, and suggestions for future implementations.
Technical Support for Contaminated Sites | Science Inventory ...
In 1987, the U.S. Environmental Protection Agency’s (EPA) Office of Research and Development (ORD), Office of Land and Emergency Management, and EPA Regional waste management offices established the Technical Support Project. The creation of the Technical Support Project enabled ORD to provide effective technical assistance by ensuring ORD scientists and engineers were accessible to the Agency’s Office and Regional decision makers, including Remedial Project Managers, On-Scene Coordinators, and corrective action staff. Five ORD Technical Support Centers (TSCs) were created to facilitate this technical assistance. Three of the five TSCs are supported by the Sustainable and Healthy Communities Research Program, and are summarized in the poster being presented:• Engineering Technical Support Center (ETSC) in Cincinnati, Ohio• Ground Water Technical Support Center (GWTSC) in Ada, Oklahoma• Site Characterization and Monitoring Technical Support Center (SCMTSC) in Atlanta, GeorgiaOver the past 29 years, the Technical Support Centers have provided numerous influential products to its internal Agency clients and to those at the State level (through the EPA Regions). These products include, but are not limited to the following: Annual TSC reports from the three Centers, a hard-rock mining conference every other year, PRO-UCL software development for site characterization statistics, groundwater modeling using state-of-the-art modeling software, numerical mo
Software for Automated Image-to-Image Co-registration
NASA Technical Reports Server (NTRS)
Benkelman, Cody A.; Hughes, Heidi
2007-01-01
The project objectives are: a) Develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) Create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) Provide automated testing for quantitative analysis; and d) Develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.
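The core of co-registration, finding the offset that best aligns two signals, can be suggested with a minimal one-dimensional cross-correlation search; production tools work on 2-D orthorectified images and refine the correlation peak to subpixel precision, neither of which this hypothetical sketch attempts:

```python
def estimate_shift(ref, moved, max_shift=5):
    """Estimate the integer offset between two 1-D signals by
    maximizing their cross-correlation.

    A minimal, hypothetical stand-in for image-to-image
    co-registration; real tools operate on 2-D imagery and
    interpolate around the correlation peak for subpixel accuracy.
    """
    best_shift, best_score = 0, float("-inf")
    n = len(ref)
    for s in range(-max_shift, max_shift + 1):
        # Correlate ref against moved displaced by s samples.
        score = sum(ref[i] * moved[i + s]
                    for i in range(n)
                    if 0 <= i + s < n)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift
```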
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for quantum improvement in software development methods and for increased awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
NASA Astrophysics Data System (ADS)
Mallari, Lawrence Anthony Castro
This project proposes a manual specifically for remedying an ineffective Corrective Action Request System for Company ABC by providing dispositions within the company's quality procedure. A Corrective Action Request System is a corrective action tool that provides a means for employees to engage in the process improvement, problem elimination cycle. At Company ABC, Corrective Action Recommendations (CARs) are not provided with timely dispositions; CARs are being ignored due to a lack of training and awareness of Company ABC's personnel and quality procedures. In this project, Company ABC's quality management software database is scrutinized to identify the number of delinquent, non-dispositioned CARs in 2014. These CARs are correlated with the number of nonconformances generated for the same issue while the CAR is still open. Using secondary data, the primary investigator finds that nonconformances are being remediated at the operational level. However, at the administrative level, CARs are being ignored and forgotten.
Designing the user interface: strategies for effective human-computer interaction
NASA Astrophysics Data System (ADS)
Shneiderman, B.
1998-03-01
In revising this popular book, Ben Shneiderman again provides a complete, current and authoritative introduction to user-interface design. The user interface is the part of every computer system that determines how people control and operate that system. When the interface is well designed, it is comprehensible, predictable, and controllable; users feel competent, satisfied, and responsible for their actions. Shneiderman discusses the principles and practices needed to design such effective interaction. Based on 20 years' experience, Shneiderman offers readers practical techniques and guidelines for interface design. He also takes great care to discuss underlying issues and to support conclusions with empirical results. Interface designers, software engineers, and product managers will all find this book an invaluable resource for creating systems that facilitate rapid learning and performance, yield low error rates, and generate high user satisfaction. Coverage includes the human factors of interactive software (with a new discussion of diverse user communities), tested methods to develop and assess interfaces, interaction styles such as direct manipulation for graphical user interfaces, and design considerations such as effective messages, consistent screen design, and appropriate color.
NASA Technical Reports Server (NTRS)
Trujillo, Anna C.; Gregory, Irene M.
2014-01-01
Control-theoretic modeling of human operator's dynamic behavior in manual control tasks has a long, rich history. There has been significant work on techniques used to identify the pilot model of a given structure. This research attempts to go beyond pilot identification based on experimental data to develop a predictor of pilot behavior. Two methods for predicting pilot stick input during changing aircraft dynamics and deducing changes in pilot behavior are presented. This approach may also have the capability to detect a change in a subject due to workload, engagement, etc., or the effects of changes in vehicle dynamics on the pilot. With this ability to detect changes in piloting behavior, the possibility now exists to mediate human adverse behaviors, hardware failures, and software anomalies with autonomy that may ameliorate these undesirable effects. However, appropriate timing of when autonomy should assume control is dependent on criticality of actions to safety, sensitivity of methods to accurately detect these adverse changes, and effects of changes in levels of automation of the system as a whole.
RNAstructure: software for RNA secondary structure prediction and analysis.
Reuter, Jessica S; Mathews, David H
2010-03-15
To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.
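For a flavor of the dynamic programming behind secondary structure prediction, here is the classic Nussinov base-pair maximization algorithm; note that RNAstructure itself uses a thermodynamic nearest-neighbor energy model, not this simpler pair-counting objective:

```python
def nussinov(seq, min_loop=3):
    """Classic Nussinov dynamic program: maximize the number of
    complementary base pairs in an RNA sequence, enforcing a
    minimum hairpin loop of `min_loop` unpaired bases.

    This is the textbook precursor of the thermodynamic
    (nearest-neighbor) folding that RNAstructure implements.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"),
             ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                  # base j left unpaired
            for k in range(i, j - min_loop):
                if (seq[k], seq[j]) in pairs:    # pair base k with base j
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0
```

A traceback over the same table recovers the pairing itself; energy-based methods replace the "+1 per pair" score with nearest-neighbor free-energy terms.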
Brahms Mobile Agents: Architecture and Field Tests
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron
2002-01-01
We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, rover/All-Terrain Vehicle (ATV), robotic assistant, other personnel in a local habitat, and a remote mission support team (with time delay). Software processes, called agents, implemented in the Brahms language, run on multiple, mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system (e.g., "return here later" and "bring this back to the habitat"). This combination of agents, rover, and model-based spoken dialogue interface constitutes a personal assistant. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a run-time system.
Learning Human Aspects of Collaborative Software Development
ERIC Educational Resources Information Center
Hadar, Irit; Sherman, Sofia; Hazzan, Orit
2008-01-01
Collaboration has become increasingly widespread in the software industry as systems have become larger and more complex, adding human complexity to the technological complexity already involved in developing software systems. To deal with this complexity, human-centric software development methods, such as Extreme Programming and other agile…
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development- effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
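COCOMOST's internal models are not spelled out in the abstract, but the model family it builds on can be illustrated with Boehm's Basic COCOMO; the coefficients below are the published textbook values, not COCOMOST's calibrated ones, and real use would recalibrate them against historical project data:

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO effort estimate in person-months:
    effort = a * KLOC ** b.

    Coefficients are Boehm's published Basic COCOMO values; tools
    like COCOMOST layer data mining and error estimation on top of
    calibrated variants of models in this family.
    """
    coeffs = {"organic":      (2.4, 1.05),
              "semidetached": (3.0, 1.12),
              "embedded":     (3.6, 1.20)}
    a, b = coeffs[mode]
    return a * kloc ** b
```

Calibration in the COCOMOST sense amounts to refitting `a` and `b` (and cost-driver multipliers) to an organization's own completed projects, which is what allows the tool to report an estimation error alongside the estimate.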
Workstation-Based Avionics Simulator to Support Mars Science Laboratory Flight Software Development
NASA Technical Reports Server (NTRS)
Henriquez, David; Canham, Timothy; Chang, Johnny T.; McMahon, Elihu
2008-01-01
The Mars Science Laboratory developed the WorkStation TestSet (WSTS) to support flight software development. The WSTS is the non-real-time flight avionics simulator that is designed to be completely software-based and run on a workstation class Linux PC. This provides flight software developers with their own virtual avionics testbed and allows device-level and functional software testing when hardware testbeds are either not yet available or have limited availability. The WSTS has successfully off-loaded many flight software development activities from the project testbeds. At the writing of this paper, the WSTS has averaged an order of magnitude more usage than the project's hardware testbeds.
NASA Technical Reports Server (NTRS)
Gaffney, J. E., Jr.; Judge, R. W.
1981-01-01
A model of a software development process is described. The process is seen to consist of a sequence of activities, such as 'program design' and 'module development' (coding). A manpower estimate is made by multiplying code size by the rates (man-months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values was assessed for each of nine principal software development activities. Together, these four factors were found to account for 39% of the observed productivity variation.
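The estimation rule summarized above (code size times per-activity rates, summed over the relevant activities) can be written down directly. The rates below are illustrative placeholders, not the paper's measured values:

```python
# Minimal sketch of the estimation rule: multiply code size by a
# per-activity rate (person-months per KLOC) and sum over the activities
# relevant to the case at hand. Rates are illustrative placeholders.

RATES = {            # person-months per thousand lines of code (invented)
    "program design":     1.2,
    "module development": 2.0,
    "integration":        0.8,
    "test":               1.5,
}

def manpower_estimate(kloc, activities):
    return kloc * sum(RATES[a] for a in activities)

# e.g. a 20 KLOC product going through design, coding, and test:
effort = manpower_estimate(20, ["program design", "module development", "test"])
```

The model's four productivity factors would enter as adjustments to the per-activity rates before the sum is taken.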
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend continues, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
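The recalibration capability described above can be illustrated with a minimal sketch: assuming a model of the form effort = a * KLOC^b with the exponent held fixed, the multiplier that best fits an organization's own history in log space is the geometric mean of actual/size^b. The data and exponent below are illustrative assumptions, not COSTMODL's equations:

```python
# Hedged sketch of recalibrating a cost-estimation equation to local
# productivity data. With effort = a * KLOC^b and a fixed exponent b, the
# multiplier a minimizing squared error in log space is the geometric mean
# of actual/size^b over completed projects. Data values are invented.
import math

def recalibrate_a(projects, b=1.05):
    """projects: list of (kloc, actual_person_months) from local history."""
    logs = [math.log(e / k ** b) for k, e in projects]
    return math.exp(sum(logs) / len(logs))

local_history = [(12, 40), (30, 95), (60, 210)]  # illustrative organization data
a = recalibrate_a(local_history)
```

A tool like the one described would repeat this kind of fit for each of its estimation equations and cost-driver factors.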
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.
NASA Software Assurance's Roles in Research and Technology
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2010-01-01
This slide presentation reviews the interactions between the scientists and engineers doing research and technology work and the software developers and others who are doing software assurance. It discusses the role of Safety and Mission Assurance (SMA) in developing software to be used for research and technology, and the growing importance of this role as the technology moves up through the technology readiness levels (TRLs). There is also a call to change the way software is developed.
Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J
2004-09-24
Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.
Statistical modeling of software reliability
NASA Technical Reports Server (NTRS)
Miller, Douglas R.
1992-01-01
This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
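Reliability-growth simulation of the kind described can be illustrated with a generic Jelinski-Moranda-style sketch (not the GCS experiment's actual model): the software starts with N equal-hazard faults, each failure removes one fault, and the overall failure rate therefore decreases step by step:

```python
# Illustrative reliability-growth simulation (Jelinski-Moranda style,
# an assumption for this sketch): N faults each contribute hazard phi,
# each observed failure removes one fault, so the rate drops over time.
import random

def simulate_failure_times(n_faults=20, phi=0.01, seed=1):
    random.seed(seed)
    times, t = [], 0.0
    for remaining in range(n_faults, 0, -1):
        rate = remaining * phi              # current overall failure rate
        t += random.expovariate(rate)       # time to next failure
        times.append(t)                     # fault found and fixed
    return times

times = simulate_failure_times()
gaps = [b - a for a, b in zip([0.0] + times[:-1], times)]  # inter-failure times
```

Repeating such runs yields simulated inter-failure data from which reliability and reliability-growth inferences of the kind described can be made.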
Muscle Motion Solenoid Actuator
NASA Astrophysics Data System (ADS)
Obata, Shuji
Mechanically restoring lost body functions for injured humans is a long-held goal. Realistic humanoid robots built from such machines require muscle-motion actuators whose control is based entirely on pulling actions. In particular, antagonistic pairs of bi-articular muscles are very important in animal motion. A system of actuators is proposed that uses the electromagnetic force of solenoids, achieving stroke lengths over 10 cm and forces of about 20 N, as needed to move a real human arm. The devised actuators build on recent advances in electromagnetic materials; older materials could not offer such performance. The composite actuators are controlled by a high-performance computer and software to produce lifelike motions.
ERIC Educational Resources Information Center
Marson, Guilherme A.; Torres, Bayardo B.
2011-01-01
This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…
Educational Affordances and Learning Design in Music Software Development
ERIC Educational Resources Information Center
Cheng, Lee; Leong, Samuel
2017-01-01
Although music software has become increasingly affordable and widely adopted in today's classrooms, concerns have been raised about a lack of consideration for users' needs during the software development process. This paper examines intra- and inter-sectoral communication pertaining to software development and music education to shed light on…
Recording and assessment of evoked potentials with electrode arrays.
Miljković, N; Malešević, N; Kojić, V; Bijelić, G; Keller, T; Popović, D B
2015-09-01
In order to optimize the procedure for the assessment of evoked potentials and to provide visualization of the flow of action potentials along the motor systems, we introduced array electrodes for stimulation and recording and developed software for the analysis of the recordings. The system uses a stimulator connected to an electrode array for the generation of evoked potentials; an electrode array connected to the amplifier, A/D converter, and computer for the recording of evoked potentials; and a dedicated software application. The method has been tested for the assessment of the H-reflex on the triceps surae muscle in six healthy humans. The stimulation electrode array with 16 pads was positioned over the posterior aspect of the thigh, while the recording electrode array with 16 pads was positioned over the triceps surae muscle. The stimulator activated all the pads of the stimulation electrode array asynchronously, while the signals were recorded continuously at all the recording sites. The results are topography maps (spatial distribution of evoked potentials) and matrices (spatial visualization of nerve excitability). The software allows the automatic selection of the lowest stimulation intensity that achieves maximal H-reflex amplitude and selection of the recording/stimulation pads according to predefined criteria. The analysis of results shows that the method provides richer information on spatial distribution than conventional H-reflex recording.
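The automatic selection of the lowest stimulation intensity achieving maximal H-reflex amplitude can be sketched as a simple rule; the amplitudes and the 5% tolerance below are illustrative assumptions, not the published criteria:

```python
# Sketch of the intensity-selection rule: among recorded H-reflex
# amplitudes per stimulation intensity, pick the lowest intensity whose
# amplitude reaches the maximum observed, within a tolerance.
# The data and the 5% tolerance are illustrative assumptions.

def lowest_intensity_with_max_response(amplitudes, tol=0.05):
    """amplitudes: dict mapping stimulation intensity (mA) -> H-reflex amplitude (mV)."""
    peak = max(amplitudes.values())
    near_peak = [i for i, a in amplitudes.items() if a >= (1 - tol) * peak]
    return min(near_peak)

# invented recording: H-reflex amplitude falls off again at high intensity
recorded = {10: 0.1, 15: 0.6, 20: 1.95, 25: 2.0, 30: 1.4}
best = lowest_intensity_with_max_response(recorded)
```

The same rule would be applied per recording pad, with the pad selection then following the predefined criteria mentioned in the abstract.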
Modification of infant hypothyroidism and phenylketonuria screening program using electronic tools.
Taheri, Behjat; Haddadpoor, Asefeh; Mirkhalafzadeh, Mahmood; Mazroei, Fariba; Aghdak, Pezhman; Nasri, Mehran; Bahrami, Gholamreza
2017-01-01
Congenital hypothyroidism and phenylketonuria (PKU) are the most common causes of preventable mental retardation in infants worldwide. Timely diagnosis and treatment of these disorders can have lasting effects on the mental development of newborns. However, there are several problems at different stages of screening programs that, along with imposing heavy costs, can reduce the precision of the screening, increasing the chance of undiagnosed cases, which in turn can have damaging consequences for society. Given these problems, and the importance of information systems in facilitating management and improving the quality of health care, the aim of this study was to improve the screening process for hypothyroidism and PKU in infants with the help of electronic resources. The current study is qualitative action research designed to improve the quality of screening, services, performance, implementation effectiveness, and management of the hypothyroidism and PKU screening program in Isfahan province. To this end, web-based software was designed. Programming was carried out using Delphi.net, with SQL Server 2008 used for database management. Given the weaknesses, problems, and limitations of the hypothyroidism and PKU screening program, and the national importance of these diseases, this study resulted in the design of hypothyroidism and PKU screening software for infants in Isfahan province. The inputs and outputs of the software were designed for three levels: the health care centers in charge of the screening program, the provincial reference lab, and the health and treatment network of Isfahan province.
Features of the software include immediate registration of sample data at the time and location of sampling; the ability of the provincial reference laboratory and the health centers of different eparchies to instantly observe, monitor, and follow up on samples at any moment; online verification of samples by the reference lab; creation of a daily schedule for the reference lab; and automatic receipt of results from analysis equipment and their entry into the database without the need for user input. The implementation of the hypothyroidism screening software increased the quality and efficiency of the screening program, minimized the risk of human error in the process, and solved many of the previous limitations of the screening program, which were the main goals of its implementation. It also improved the precision and quality of services provided for these two diseases and the accuracy of data entry by making it possible to enter sample data at the place and time of sampling, which in turn enabled management based on precise data, helped develop a comprehensive database, and improved the satisfaction of service recipients.
An information model for use in software management estimation and prediction
NASA Technical Reports Server (NTRS)
Li, Ningda R.; Zelkowitz, Marvin V.
1993-01-01
This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.
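The cluster-analysis idea behind such prediction can be illustrated with a toy sketch (not the SME's actual algorithm): group completed projects by an early-phase measure, then predict a development attribute for a new project from its nearest cluster:

```python
# Toy illustration of cluster-based prediction: cluster past projects on
# one early-phase feature with a tiny k-means, then predict an attribute
# for a new project as the mean over its nearest cluster's members.
# The data and the single-feature clustering are illustrative simplifications.

def one_d_clusters(points, k=2, iters=20):
    """Tiny k-means on one feature: returns list of (centroid, members)."""
    cents = [min(points), max(points)][:k]
    for _ in range(iters):
        groups = [[] for _ in cents]
        for p in points:
            i = min(range(len(cents)), key=lambda j: abs(p - cents[j]))
            groups[i].append(p)
        cents = [sum(g) / len(g) if g else c for g, c in zip(groups, cents)]
    return list(zip(cents, groups))

# past projects: (early reported effort hours, final error count) -- invented
past = [(100, 12), (120, 15), (400, 60), (450, 70)]
clusters = one_d_clusters([e for e, _ in past])

def predict_errors(early_effort):
    cent, members = min(clusters, key=lambda c: abs(early_effort - c[0]))
    return sum(err for e, err in past if e in members) / len(members)

pred = predict_errors(110)  # predict final error count for a new project
```

In a real environment, many early-phase attributes would be clustered jointly and the clusters refreshed as the project database grows.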
Space and Missile Systems Center Standard: Software Development
2015-01-16
maintenance, or any other activity or combination of activities resulting in products. Within this standard, requirements to "develop," "define...integration, reuse, reengineering, maintenance, or any other activity that results in products). The term "developer" encompasses all software team...activities that results in software products. Software development includes new development, modification, reuse, reengineering, maintenance, and any other
Software Engineering for Human Spaceflight
NASA Technical Reports Server (NTRS)
Fredrickson, Steven E.
2014-01-01
The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.
LG based decision aid for naval tactical action officer's (TAO) workstation
NASA Astrophysics Data System (ADS)
Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg; Boyd, Ron
2005-05-01
In the increasingly NetCentric battlespace of the 21st century, Stilman Advanced Strategies Linguistic Geometry software has the potential to revolutionize the way that the Navy fights in two key areas: as a Tactical Decision Aid and for creating a relevant Common Operating Picture. Incorporating STILMAN's software into a prototype Tactical Action Officers (TAO) workstation as a Tactical Decision Aid (TDA) will allow warfighters to manage their assets more intelligently and effectively. This prototype workstation will be developed using human-centered design principles and will be an open, component-based architecture for combat control systems for future small surface combatants. It will integrate both uninhabited vehicles and onboard sensors and weapon systems across a squadron of small surface combatants. In addition, the hypergame representation of complex operations provides a paradigm for the presentation of a common operating picture to operators and personnel throughout the command hierarchy. In the hypergame technology there are game levels that span the range from the tactical to the global strategy level, with each level informing the others. This same principle will be applied to presenting the relevant common operating picture to operators. Each operator will receive a common operating picture that is appropriate for their level in the command hierarchy. The area covered by this operating picture and the level of detail contained within it will be dependent upon the specific tasks the operator is performing (supervisory vice tactical control) and the level of the operator (or command personnel) within the command hierarchy. Each level will inform the others to keep the picture concurrent and up-to-date.
Architecture independent environment for developing engineering software on MIMD computers
NASA Technical Reports Server (NTRS)
Valimohamed, Karim A.; Lopez, L. A.
1990-01-01
Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.
NASA Technical Reports Server (NTRS)
Hall, Drew P.; Ly, William; Howard, Richard T.; Weir, John; Rakoczy, John; Roe, Fred (Technical Monitor)
2002-01-01
The software development for an upgrade to the Hobby-Eberly Telescope (HET) was done in LabVIEW. In order to improve the performance of the HET at the McDonald Observatory, a closed-loop system had to be implemented to keep the mirror segments aligned during periods of observation. The control system, called the Segment Alignment Maintenance System (SAMS), utilized inductive sensors to measure the relative motions of the mirror segments. Software was developed in LabVIEW to tie the sensors, operator interface, and mirror-control motors together. Developing the software in LabVIEW allowed the system to be flexible, understandable, and modifiable by the end users. Since LabVIEW is built on block diagrams, the software naturally followed the designed control system's block and flow diagrams, and individual software blocks could be easily verified. LabVIEW's many built-in display routines allowed easy visualization of diagnostic and health-monitoring data during testing. Also, since LabVIEW is a multi-platform software package, different programmers could develop the code remotely on various types of machines. LabVIEW's ease of use facilitated rapid prototyping and field testing. There were some unanticipated difficulties in the software development, but the use of LabVIEW as the software "language" for the development of SAMS contributed to the overall success of the project.
Technology-driven dietary assessment: a software developer’s perspective
Buday, Richard; Tapia, Ramsey; Maze, Gary R.
2015-01-01
Dietary researchers need new software to improve nutrition data collection and analysis, but creating information technology is difficult. Software development projects may be unsuccessful due to inadequate understanding of needs, management problems, technology barriers or legal hurdles. Cost overruns and schedule delays are common. Barriers facing scientific researchers developing software include workflow, cost, schedule, and team issues. Different methods of software development and the role that intellectual property rights play are discussed. A dietary researcher must carefully consider multiple issues to maximize the likelihood of success when creating new software. PMID:22591224
NASA Astrophysics Data System (ADS)
Kumlander, Deniss
The globalization of company operations and competition between software vendors demand higher-quality delivered software at lower overall cost. At the same time, globalization introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position that increases its productivity, bridging the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.
NASA Technical Reports Server (NTRS)
Simmons, D. B.; Marchbanks, M. P., Jr.; Quick, M. J.
1982-01-01
The results of an effort to thoroughly and objectively analyze the statistical and historical information gathered during the development of the Shuttle Orbiter Primary Flight Software are given. The particular areas of interest include the cost of the software, its reliability, its requirements, and how the requirements changed during development of the system. Data related to the current version of the software system produced some interesting results. Suggestions are made for saving additional data that would allow further investigation.
Ground Operations Autonomous Control and Integrated Health Management
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Walker, Mark; Wilkins, Kim; Johnson, Robert; Sass, Jared; Youney, Justin
2014-01-01
An intelligent autonomous control capability has been developed and is currently being validated in ground cryogenic fluid management operations. The capability embodies a physical architecture consistent with typical launch infrastructure and control systems, augmented by a higher-level autonomous control (AC) system enabled to make knowledge-based decisions. The AC system is supported by an integrated system health management (ISHM) capability that detects anomalies, diagnoses causes, determines effects, and could predict future anomalies. AC is implemented using the concept of programmed sequences, which can be considered building blocks of more generic mission plans. A sequence is a series of steps, each of which executes its actions once the conditions for the step are met (e.g., desired temperatures or fluid states are achieved). For autonomous capability, conditions must also consider health management outcomes, as these determine whether an action is executed, how it is executed, or whether an alternative action is executed instead. Aside from health, higher-level objectives can also drive how a mission is carried out. The capability was developed using the G2 software environment (www.gensym.com) augmented by a NASA toolkit that significantly shortens time to deployment. G2 is a commercial product for developing intelligent applications and is fully object oriented. The core of the capability is a domain model of the system in which all elements are represented as objects (sensors, instruments, components, pipes, etc.). Reasoning and decision making can be done with all elements in the domain model. The toolkit also enables implementation of failure modes and effects analysis (FMEA), represented as root cause trees. FMEAs are programmed graphically and are reusable, since they address generic failure modes and effects referring to classes of subsystems or objects and their functional relationships.
User interfaces for integrated awareness by operators have been created.
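The programmed-sequence concept described above can be sketched as a small step executor in which health outcomes gate or redirect actions; the step names, conditions, and dict-based state are illustrative assumptions, not the G2/toolkit implementation:

```python
# Sketch of a programmed sequence: each step runs its action only once its
# conditions are satisfied, and an unhealthy verdict from health management
# redirects to a fallback action when one is defined. All names and the
# dict-based state are illustrative, not the actual implementation.

def run_sequence(steps, state):
    log = []
    for step in steps:
        if not step["condition"](state):
            log.append(("skipped", step["name"]))
            continue
        # health management outcome decides which action runs
        action = step["action"] if state.get("healthy", True) else step.get("fallback", step["action"])
        action(state)
        log.append(("ran", step["name"]))
    return log

state = {"temp_k": 95, "healthy": True}
steps = [
    {"name": "chill",  "condition": lambda s: s["temp_k"] > 90,
     "action": lambda s: s.update(temp_k=80)},
    {"name": "load",   "condition": lambda s: s["temp_k"] <= 90,
     "action": lambda s: s.update(loaded=True),
     "fallback": lambda s: s.update(loaded=False)},
]
log = run_sequence(steps, state)
```

In the described system the conditions would query ISHM verdicts and sensor objects in the domain model rather than a plain dictionary.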
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert
1985-01-01
The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers which are presented.
The Relevance of Software Development Education for Students
ERIC Educational Resources Information Center
Liebenberg, Janet; Huisman, Magda; Mentz, Elsa
2015-01-01
Despite a widely-acknowledged shortage of software developers, and reports of a gap between industry needs and software education, the possible gap between students' needs and software development education has not been explored in detail. In their university education, students want to take courses and carry out projects that clearly relate to…
Space Station Software Recommendations
NASA Technical Reports Server (NTRS)
Voigt, S. (Editor)
1985-01-01
Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.
Software Management Environment (SME) concepts and architecture, revision 1
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1992-01-01
This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.
Manager's handbook for software development, revision 1
NASA Technical Reports Server (NTRS)
1990-01-01
Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences of the Software Engineering Laboratory (SEL) with flight dynamics software development. The management aspects of the following subjects are described: organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
Scott Samson, Center for Ocean Technology... The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
ERIC Educational Resources Information Center
Eisen, Daniel
2013-01-01
This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…
The Particle-in-Cell and Kinetic Simulation Software Center
NASA Astrophysics Data System (ADS)
Mori, W. B.; Decyk, V. K.; Tableman, A.; Fonseca, R. A.; Tsung, F. S.; Hu, Q.; Winjum, B. J.; An, W.; Dalichaouch, T. N.; Davidson, A.; Hildebrand, L.; Joglekar, A.; May, J.; Miller, K.; Touati, M.; Xu, X. L.
2017-10-01
The UCLA Particle-in-Cell and Kinetic Simulation Software Center (PICKSC) aims to support an international community of PIC and plasma kinetic software developers, users, and educators; to increase the use of this software for accelerating the rate of scientific discovery; and to be a repository of knowledge and history for PIC. We discuss progress towards making available and documenting illustrative open-source software programs and distinct production programs; developing and comparing different PIC algorithms; coordinating the development of resources for the educational use of kinetic software; and the outcomes of our first sponsored OSIRIS users workshop. We also welcome input and discussion from anyone interested in using or developing kinetic software, in obtaining access to our codes, in collaborating, in sharing their own software, or in commenting on how PICKSC can better serve the DPP community. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
[Example of product development by industry and research solidarity].
Seki, Masayoshi
2014-01-01
When industrial firms develop a product, using results from research institutions and reflecting users' ideas in the product are significant means of improving it. This paper takes a jointly developed software product as an example, describes the development technique adopted and its results, and considers industry-research collaboration and joint development from the company's perspective. Each software development method has merits and demerits, and the optimal technique must be chosen for the system being developed. We jointly developed dose distribution browsing software, adopting the prototype model as the development method. To display the dose distribution information, four objects (CT Image, Structure Set, RT-Plan, and RT-Dose) must be loaded and displayed in a composite manner. The prototype model adopted for this joint development was especially well suited to developing the dose distribution browsing software. In a prototype model, the detailed design is created from the program source code after the program is completed, which shortens the documentation period and keeps design and implementation consistent. This software was eventually released as open source. Based on the developed prototype, the release version of the dose distribution browsing software was built. Developing this type of novel software normally takes two to three years, but joint development shortened the period to one year. Shortening the development period kept the company's development cost to a minimum, which is reflected in the product price.
Requests on the product from specialists representing the user's point of view are important, and increasing the number of specialists devoted to product development raises expectations that the product will meet users' demands.
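The abstract names four objects (CT Image, Structure Set, RT-Plan, RT-Dose) that must all be loaded before composite display. As a minimal sketch of that validation-and-overlay step, using plain Python stand-ins (the actual software's API is not described in the abstract; a real viewer would parse DICOM files, e.g. with pydicom):

```python
# Sketch: confirm all four DICOM-RT objects are loaded, then
# alpha-blend normalized dose values over a CT slice row.
REQUIRED = {"CT Image", "Structure Set", "RT-Plan", "RT-Dose"}

def ready_to_display(loaded_objects):
    """Return True only when every required object has been loaded."""
    return REQUIRED <= set(loaded_objects)

def overlay(ct_row, dose_row, alpha=0.4):
    """Blend a row of normalized dose values over a row of CT values."""
    return [(1 - alpha) * ct + alpha * dose
            for ct, dose in zip(ct_row, dose_row)]

loaded = {"CT Image": ..., "Structure Set": ..., "RT-Plan": ..., "RT-Dose": ...}
assert ready_to_display(loaded)
print(overlay([0.2, 0.5], [1.0, 0.0]))
```

The check-before-display guard mirrors the requirement that all four objects be present for a meaningful composite view.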
A Comparison of Learning Technologies for Teaching Spacecraft Software Development
ERIC Educational Resources Information Center
Straub, Jeremy
2014-01-01
The development of software for spacecraft represents a particular challenge and is, in many ways, a worst case scenario from a design perspective. Spacecraft software must be "bulletproof" and operate for extended periods of time without user intervention. If the software fails, it cannot be manually serviced. Software failure may…
Software Development Life Cycle Security Issues
NASA Astrophysics Data System (ADS)
Kaur, Daljit; Kaur, Parminder
2011-12-01
Security is nowadays one of the major problems, for many reasons. The main cause is that software cannot withstand security attacks because of vulnerabilities caused by defective specification, design, and implementation. We conducted a survey asking software developers, project managers, and other people in software development about their security awareness and its implementation in the Software Development Life Cycle (SDLC). The survey was open for participation for three weeks, and this paper explains the survey results.
Development of a support software system for real-time HAL/S applications
NASA Technical Reports Server (NTRS)
Smith, R. S.
1984-01-01
Methodologies employed in defining and implementing a software support system for the HAL/S computer language for real-time operations on the Shuttle are detailed. Attention is also given to the management and validation techniques used during software development and software maintenance. Utilities developed to support the real-time operating conditions are described. With the support system being produced on Cyber computers and executable code then processed through Cyber or PDP machines, the support system has a production level status and can serve as a model for other software development projects.
The Many Faces of a Software Engineer in a Research Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marinovici, Maria C.; Kirkham, Harold
2013-10-14
The ability to gather, analyze, and make decisions based on real-world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), software engineers should possess 'adaptive' skills. In the science and engineering research community, software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks at the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on power-systems scientific software development are analyzed and extended to describe more generic software development projects.
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well-researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.
Agile hardware and software systems engineering for critical military space applications
NASA Astrophysics Data System (ADS)
Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.
2012-06-01
The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and systems engineering in a high-risk technology application where implementing new, untraditional development strategies was necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced cost. This paper discloses the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware), creating a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem because of long build times, the desire to use heritage hardware, and the lack of an effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and to mitigate risk interactions in a disciplined manner. Examples of how these concepts were used during the MBD program are delineated. Maintaining project momentum was essential for assessing the constant non-recurring technological challenges, which needed to be retired rapidly from the engineering risk list. Development never slowed, thanks to tactical assessment of the hardware with the adoption of the SCRUM technique, which we adapted as a means of mitigating technical risk while allowing design freeze later in the program's development cycle. By using agile systems engineering and management techniques that enabled decisive action, product development momentum was effectively used to produce two novel space vehicles in a fraction of the usual time and at dramatically reduced cost.
NASA Astrophysics Data System (ADS)
Ames, D.; Kadlec, J.; Horsburgh, J. S.; Maidment, D. R.
2009-12-01
The Consortium of Universities for the Advancement of Hydrologic Sciences (CUAHSI) Hydrologic Information System (HIS) project includes extensive development of data storage and delivery tools and standards, including WaterML (a language for sharing hydrologic data sets via web services) and HIS Server (a software tool set for delivering WaterML from a server). These and other CUAHSI HIS tools have been under development and deployment for several years and together present a relatively complete software "stack" to support the consistent storage and delivery of hydrologic and other environmental observation data. This presentation describes the development of a new HIS software tool called "HydroDesktop" and of an online open source software development community to update and maintain the software. HydroDesktop is a local (i.e., not server-based) client-side software tool that will ultimately run on multiple operating systems and provide a highly usable level of access to HIS services. The software provides many key capabilities, including data query, map-based visualization, data download, local data maintenance, editing, graphing, data export to selected model-specific data formats, linkage with integrated modeling systems such as OpenMI, and ultimately upload to HIS servers from the local desktop software. As the software is presently in the early stages of development, this presentation will focus on the design approach and paradigm and is viewed as an opportunity to encourage participation in the open development community. Indeed, recognizing the value of community-based code development as a means of ensuring end-user adoption, this project has adopted an "iterative" or "spiral" software development approach, which will be described in this presentation.
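WaterML delivers hydrologic time series as XML over web services. As a minimal sketch of a client-side parsing step of the kind HydroDesktop performs, using a simplified, WaterML-like fragment (the element names below are illustrative; real WaterML uses namespaced elements and much richer metadata):

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a WaterML time-series response.
doc = """
<timeSeries siteCode="USGS:01646500" variable="discharge">
  <value dateTime="2009-12-01T00:00:00">120.5</value>
  <value dateTime="2009-12-01T01:00:00">118.2</value>
</timeSeries>
"""

root = ET.fromstring(doc)
# Extract (timestamp, value) pairs for local storage or graphing.
series = [(v.get("dateTime"), float(v.text)) for v in root.iter("value")]
print(root.get("siteCode"), len(series))
```

Once parsed into (timestamp, value) pairs, the data can feed the local maintenance, graphing, and model-export capabilities the abstract describes.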
New technologies for supporting real-time on-board software development
NASA Astrophysics Data System (ADS)
Kerridge, D.
1995-03-01
The next generation of on-board data management systems will be significantly more complex than current designs and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation-hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly support the process of hard real-time analysis for software targeting the MA31750 processor. With further development, the HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.
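Hard real-time analysis of the kind described typically includes a schedulability test. A minimal sketch of the classic Liu-Layland utilization bound for rate-monotonic scheduling (a standard result from hard real-time theory; the abstract does not say which specific tests the HESSE implements):

```python
# Liu & Layland (1973): n periodic tasks are schedulable under
# rate-monotonic priorities if total utilization <= n * (2**(1/n) - 1).
def utilization(tasks):
    """tasks: list of (worst_case_exec_time, period) pairs."""
    return sum(c / t for c, t in tasks)

def rm_schedulable(tasks):
    """Sufficient (not necessary) schedulability test."""
    n = len(tasks)
    return utilization(tasks) <= n * (2 ** (1 / n) - 1)

tasks = [(1, 4), (1, 5), (2, 10)]   # (C, T) per task
print(round(utilization(tasks), 2), rm_schedulable(tasks))
```

Here utilization is 0.65, below the three-task bound of about 0.78, so the task set passes; tests like this are how non-functional timing requirements are checked systematically.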
Simple solution to the medical instrumentation software problem
NASA Astrophysics Data System (ADS)
Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.
1995-04-01
Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to Good Manufacturing Practices (GMP). Good Manufacturing Practices as specified by the FDA and ISO require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionalities, from embedded real-time to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
The integration of the risk management process with the lifecycle of medical device software.
Pecoraro, F; Luzi, D
2014-01-01
The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MD, require complex procedures to make software compliant with safety requirements, introducing thereby new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle has been proposed too. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer already at the initial stages of the software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software being an important component of MDs as stated in regulations and standards. 
This implies highly iterative processes that integrate risk management into the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.
Software Development in the Water Sciences: a view from the divide (Invited)
NASA Astrophysics Data System (ADS)
Miles, B.; Band, L. E.
2013-12-01
While training in statistical methods is an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.
Software engineering standards and practices
NASA Technical Reports Server (NTRS)
Durachka, R. W.
1981-01-01
Guidelines are presented for the preparation of a software development plan. The various phases of a software development project are discussed throughout its life cycle including a general description of the software engineering standards and practices to be followed during each phase.
NASA Astrophysics Data System (ADS)
Hancock, Tira K.
A qualitative descriptive case study explored courses of action for educators and leaders of math and science educators to implement to help students achieve state assessment standards and postsecondary success. The problem focused on two demographically similar rural high schools in Southwest Washington that demonstrated inadequate rates of student achievement in mathematics and science. The research question investigated courses of action that may assist educators and leaders of secondary math and science educators to help students achieve WASL standards and postsecondary success in compliance with the No Child Left Behind (NCLB) Act of 2001. Senge's learning organization theory (1990, 2006) and Fullan's (2001) contributions to leading and learning in times of change provided the theoretical framework for the study. Responses from twenty study participants, analyzed with the qualitative analysis software QSR NVivo 7, revealed six themes. Triangulation of responses with secondary data from WASL assessment scores and case study school assessment data identified 14 courses of action and three recommendations for educators and leaders of math and science educators to help students meet state standards and postsecondary success. Critical factors identified in the study as needed to assist educators to help students succeed included professional development, collaboration, teaching practices, funding, student accountability, and parental involvement.
Glowinski, Donald; Mancini, Maurizio; Cowie, Roddy; Camurri, Antonio; Chiorri, Carlo; Doherty, Cian
2013-01-01
When people perform a task as part of a joint action, their behavior is not the same as it would be if they were performing the same task alone, since it has to be adapted to facilitate shared understanding (or sometimes to prevent it). Joint performance of music offers a test bed for ecologically valid investigations of the way non-verbal behavior facilitates joint action. Here we compare the expressive movement of violinists when playing in solo and ensemble conditions. The first violinists of two string quartets (SQs), professional and student, were asked to play the same musical fragments in a solo condition and with the quartet. Synchronized multimodal recordings were created from the performances, using a specially developed software platform. Different patterns of head movement were observed. By quantifying them using an appropriate measure of entropy, we showed that head movements are more predictable in the quartet scenario. Rater evaluations showed that the change does not, as might be assumed, entail markedly reduced expression. They showed some ability to discriminate between solo and ensemble performances, but did not distinguish them in terms of emotional content or expressiveness. The data raise provocative questions about joint action in realistically complex scenarios. PMID:24312065
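The entropy-based quantification mentioned above can take several forms. As a minimal sketch of one possibility, Shannon entropy over discretized head-movement directions (the paper's exact measure is not specified in the abstract; the direction coding here is illustrative):

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits of a discrete symbol sequence;
    lower entropy means more predictable movement."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Discretized head-movement directions per video frame (illustrative).
solo     = ["L", "R", "U", "D", "L", "U", "R", "D"]   # varied movement
ensemble = ["L", "L", "L", "R", "L", "L", "R", "L"]   # repetitive movement

# Lower entropy for the ensemble sequence = more predictable, as reported.
print(shannon_entropy(solo) > shannon_entropy(ensemble))
```

A lower entropy value for quartet playing would correspond to the finding that head movements are more predictable in the ensemble scenario.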
Space Station Mission Planning System (MPS) development study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Klus, W. J.
1987-01-01
The basic objective of the Space Station (SS) Mission Planning System (MPS) Development Study was to define a baseline Space Station mission plan and the associated hardware and software requirements for the system. A detailed definition of the Spacelab (SL) payload mission planning process and SL Mission Integration Planning System (MIPS) software was derived. A baseline concept was developed for performing SS manned base payload mission planning, consistent with current Space Station design/operations concepts and philosophies. The SS MPS software requirements were defined. Requirements for new software include candidate programs for the application of artificial intelligence techniques to capture and make more effective use of mission planning expertise. An SS MPS Software Development Plan was also developed, which phases efforts for the development of software to implement the SS mission planning concept.
Software Process Assurance for Complex Electronics
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized, with slight modifications, in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized software assurance and engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices, and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that is more easily maintained, consistent, and configurable based on the device used.
Framework Support For Knowledge-Based Software Development
NASA Astrophysics Data System (ADS)
Huseth, Steve
1988-03-01
The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.
Cost Estimation of Software Development and the Implications for the Program Manager
1992-06-01
Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome
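Parametric cost models of the kind listed above generally estimate effort from program size. As a minimal sketch, the well-known basic COCOMO formulas with the published "organic"-mode coefficients (SLIM, SPQR/20, and the Jensen model use different, largely proprietary formulations):

```python
# Basic COCOMO (Boehm, 1981): effort in person-months from size in KLOC,
# then schedule in months from effort. Organic-mode coefficients.
def cocomo_effort(kloc, a=2.4, b=1.05):
    """Effort (person-months) = a * KLOC^b."""
    return a * kloc ** b

def cocomo_schedule(effort_pm, c=2.5, d=0.38):
    """Development time (months) = c * effort^d."""
    return c * effort_pm ** d

effort = cocomo_effort(32)           # a 32 KLOC organic project
print(round(effort, 1), round(cocomo_schedule(effort), 1))
```

The superlinear exponent b > 1 captures the diseconomy of scale that makes large software projects disproportionately expensive, the core concern of the cost-estimation models the report compares.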
[Construction of educational software about personality disorders].
Botti, Nadja Cristiane Lappann; Carneiro, Ana Luíza Marques; Almeida, Camila Souza; Pereira, Cíntia Braga Silva
2011-01-01
The study describes the experience of building educational software in the area of mental health. The software was developed to enable nursing students to identify personality disorders. In this process, we applied the pedagogical framework of Vygotsky and the theoretical framework of the diagnostic criteria defined by DSM-IV. From these references, characters exhibiting personality disorders were identified in stories and/or children's movies. The software's database was built with multimedia data: graphics, sound, and explanatory text. The software was developed as an educational game, with questions of increasing difficulty, using Microsoft Office PowerPoint 2007. This strategy is believed to be valid for teaching and learning in the area of mental health nursing.
ERIC Educational Resources Information Center
Yee, Kevin; Hargis, Jace
2010-01-01
This article discusses the benefits of screencasts and their instructional uses. Well-known for some years to advanced technology users, Screen Capture Software (SCS) offers the promise of recording action on the computer desktop together with voiceover narration, all combined into a single movie file that can be shared, emailed, or uploaded.…
A Capital Assets Preservation Program.
ERIC Educational Resources Information Center
Heiman, Ralph
1989-01-01
New York State officials have created an efficient capital planning system that is a prescribed set of procedures and actions within a program planning manual and two software modules. The program is a series of logical steps that school districts must take to successfully implement their preservation plans. (MLF)
Campus Financial Systems for the Future.
ERIC Educational Resources Information Center
Jonas, Stephen; And Others
This handbook guides college and university business officers, from small liberal arts colleges to community colleges to research universities, through the complex set of decisions and actions associated with replacing financial management systems. It lists the steps necessary to evaluate an institution's current hardware, network, and software;…
48 CFR 204.7301 - Definitions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 204.7301 Definitions. As used in this subpart— Adequate security means protective measures that are... restrictions. Cyber incident means actions taken through the use of computer networks that result in an actual.... Technical information means technical data or computer software, as those terms are defined in the clause at...
48 CFR 252.204-7012 - Safeguarding of unclassified controlled technical information.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... Cyber incident means actions taken through the use of computer networks that result in an actual or... printed within an information system. Technical information means technical data or computer software, as..., catalog-item identifications, data sets, studies and analyses and related information, and computer...
NASA PC software evaluation project
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Kuan, Julie C.
1986-01-01
The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability, and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The product categories covered are Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.
The embedded software life cycle - An expanded view
NASA Technical Reports Server (NTRS)
Larman, Brian T.; Loesh, Robert E.
1989-01-01
Six common issues that are encountered in the development of software for embedded computer systems are discussed from the perspective of their interrelationships with the development process and/or the system itself. Particular attention is given to concurrent hardware/software development, prototyping, the inaccessibility of the operational system, fault tolerance, the long life cycle, and inheritance. It is noted that the life cycle for embedded software must include elements beyond simply the specification and implementation of the target software.
Creating an open environment software infrastructure
NASA Technical Reports Server (NTRS)
Jipping, Michael J.
1992-01-01
As the development of complex computer hardware accelerates at increasing rates, the ability of software to keep pace is essential. The development of software design tools, however, is falling behind the development of hardware for several reasons, the most prominent of which is the lack of a software infrastructure to provide an integrated environment for all parts of a software system. The research was undertaken to provide a basis for answering this problem by investigating the requirements of open environments.
The development of expertise using an intelligent computer-aided training system
NASA Technical Reports Server (NTRS)
Johnson, Debra Steele
1991-01-01
An initial examination was conducted of an Intelligent Tutoring System (ITS) developed for use in industry. The ITS, developed by NASA, simulated a satellite deployment task. More specifically, the PD (Payload Assist Module Deployment)/ICAT (Intelligent Computer Aided Training) System simulated a nominal Payload Assist Module (PAM) deployment. The development of expertise on this task was examined using three Flight Dynamics Officer (FDO) candidates who had no previous experience with this task. The results indicated that performance improved rapidly until Trial 5, followed by more gradual improvements through Trial 12. The performance dimensions measured included performance speed, actions completed, errors, help required, and display fields checked. Suggestions for further refining the software and for deciding when to expose trainees to more difficult task scenarios are discussed. Further, the results provide an initial demonstration of the effectiveness of the PD/ICAT system in training the nominal PAM deployment task and indicate the potential benefits of using ITS's for training other FDO tasks.
The development of expertise on an intelligent tutoring system
NASA Technical Reports Server (NTRS)
Johnson, Debra Steele
1989-01-01
An initial examination was conducted of an Intelligent Tutoring System (ITS) developed for use in industry. The ITS, developed by NASA, simulated a satellite deployment task. More specifically, the PD (Payload Assist Module Deployment)/ICAT (Intelligent Computer Aided Training) System simulated a nominal Payload Assist Module (PAM) deployment. The development of expertise on this task was examined using three Flight Dynamics Officer (FDO) candidates who had no previous experience with this task. The results indicated that performance improved rapidly until Trial 5, followed by more gradual improvements through Trial 12. The performance dimensions measured included performance speed, actions completed, errors, help required, and display fields checked. Suggestions for further refining the software and for deciding when to expose trainees to more difficult task scenarios are discussed. Further, the results provide an initial demonstration of the effectiveness of the PD/ICAT system in training the nominal PAM deployment task and indicate the potential benefits of using ITS's for training other FDO tasks.
Publishing Platform for Scientific Software - Lessons Learned
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim
2015-04-01
Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data but also for modelling and simulation of complex processes. Software has a significant influence on the quality of research results. For strengthening the recognition of the academic performance of scientific software development, for increasing its visibility and for promoting the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews, a blueprint for a scientific software publishing platform and a systematic implementation plan has been designed. In addition, the potential of journals, software repositories and persistent identifiers has been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software as well as methods and tools for software engineering are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include policies for the development and publication of scientific software in the institutions but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. 
in earth sciences, climate research and the life sciences. Based on the developed blueprints a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user oriented features.
Modeling strategic use of human computer interfaces with novel hidden Markov models
Mariano, Laura J.; Poore, Joshua C.; Krum, David M.; Schwartz, Jana L.; Coskren, William D.; Jones, Eric M.
2015-01-01
Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit. 
PMID:26191026
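The activity-log modeling idea from the abstract above can be illustrated with a much simpler sketch than the Beta Process HMM (BP-HMM) the authors actually use: a plain discrete-HMM forward pass that scores a sequence of logged UI actions under hidden behavior states. All state names, action symbols, and probabilities below are invented for illustration.

```python
# Illustrative only: a plain discrete-HMM forward pass over logged UI actions.
# The study uses a Beta Process HMM; this sketch shows only the basic idea of
# scoring an activity-log sequence under hidden behavior states.

def sequence_probability(obs, start_p, trans_p, emit_p):
    """Return P(obs) under a discrete HMM via the forward algorithm."""
    states = list(start_p)
    # alpha[s] = P(obs[:t+1], state_t = s)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for symbol in obs[1:]:
        alpha = {
            s: sum(alpha[r] * trans_p[r][s] for r in states) * emit_p[s][symbol]
            for s in states
        }
    return sum(alpha.values())

# Hypothetical behavior states and UI actions (not from the study's data).
start = {"explore": 0.7, "exploit": 0.3}
trans = {"explore": {"explore": 0.6, "exploit": 0.4},
         "exploit": {"explore": 0.2, "exploit": 0.8}}
emit = {"explore": {"click": 0.5, "undo": 0.4, "win": 0.1},
        "exploit": {"click": 0.6, "undo": 0.1, "win": 0.3}}

p = sequence_probability(["click", "undo", "click", "win"], start, trans, emit)
print(round(p, 6))
```

In the BP-HMM setting the library of states is itself inferred and shared across participants; the forward computation above is only the scoring primitive underneath.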
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment
NASA Technical Reports Server (NTRS)
Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun
2006-01-01
Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components.
We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, and high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
Space Shuttle Software Development and Certification
NASA Technical Reports Server (NTRS)
Orr, James K.; Henderson, Johnnie A
2000-01-01
Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
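The first task listed above, metrics that quantify reuse attributes, can be sketched minimally as a reuse-level calculation: the fraction of a system's components taken from existing software. The component names, categories, and the specific formula below are hypothetical illustrations, not the project's actual metric suite.

```python
# Illustrative sketch of a simple reuse-level metric: the fraction of a
# system's components reused from elsewhere, either verbatim or adapted.
# Component names and categories here are hypothetical.

def reuse_level(components):
    """components: mapping of component name -> 'new', 'verbatim', or 'adapted'."""
    reused = sum(1 for kind in components.values() if kind in ("verbatim", "adapted"))
    return reused / len(components)

system = {
    "parser": "verbatim",    # taken unchanged from a library
    "scheduler": "adapted",  # copied and then modified
    "telemetry": "new",      # written from scratch
    "logger": "verbatim",
}
print(reuse_level(system))  # 3 of 4 components are reused -> 0.75
```

A fuller metric suite would distinguish verbatim from adapted reuse and weight components by size, as the abstract's mention of coupling, cohesion, and inheritance analysis suggests.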
Towards Model-Driven End-User Development in CALL
ERIC Educational Resources Information Center
Farmer, Rod; Gruba, Paul
2006-01-01
The purpose of this article is to introduce end-user development (EUD) processes to the CALL software development community. EUD refers to the active participation of end-users, as non-professional developers, in the software development life cycle. Unlike formal software engineering approaches, the focus in EUD on means/ends development is…
Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming
NASA Astrophysics Data System (ADS)
Fisher, Ward
2014-05-01
Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. 
We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.
Software development: A paradigm for the future
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1989-01-01
A new paradigm for software development that treats software development as an experimental activity is presented. It provides built-in mechanisms for learning how to develop software better and reusing previous experience in the forms of knowledge, processes, and products. It uses models and measures to aid in the tasks of characterization, evaluation and motivation. An organization scheme is proposed for separating the project-specific focus from the organization's learning and reuse focuses of software development. The implications of this approach for corporations, research and education are discussed and some research activities currently underway at the University of Maryland that support this approach are presented.
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Malone, Linda
2007-01-01
It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
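One way to convey the level of uncertainty in an initial estimate, as the abstract argues for, is to report a distribution of simulated outcomes rather than a point value. Below is a minimal Monte Carlo sketch; the triangular-distribution parameters are invented for illustration and are not drawn from the Software Engineering Laboratory data.

```python
# Illustrative Monte Carlo sketch: propagate uncertainty in software size and
# team productivity into an effort estimate, reporting a range, not a point.
# All distribution parameters below are invented for illustration.
import random

random.seed(1)  # deterministic for reproducibility

def simulate_effort(n=10000):
    samples = []
    for _ in range(n):
        size_kloc = random.triangular(40, 90, 60)         # low, high, mode
        prod_kloc_per_pm = random.triangular(0.8, 2.0, 1.2)
        samples.append(size_kloc / prod_kloc_per_pm)      # person-months
    return sorted(samples)

s = simulate_effort()
median = s[len(s) // 2]
p10, p90 = s[len(s) // 10], s[9 * len(s) // 10]
print(f"median {median:.0f} PM, 80% interval [{p10:.0f}, {p90:.0f}] PM")
```

Reporting the 80% interval alongside the median makes the early-project uncertainty explicit, which is the core point of the simulation-based approach described above.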
Overview of software development at the parabolic dish test site
NASA Technical Reports Server (NTRS)
Miyazono, C. K.
1985-01-01
The development history of the data acquisition and data analysis software is discussed. The software development occurred between 1978 and 1984 in support of solar energy module testing at the Jet Propulsion Laboratory's Parabolic Dish Test Site, located within Edwards Test Station. The development went through incremental stages, starting with a simple single-user BASIC set of programs, and progressing to the relatively complex multi-user FORTRAN system that was used until the termination of the project. Additional software in support of testing is discussed including software in support of a meteorological subsystem and the Test Bed Concentrator Control Console interface. Conclusions and recommendations for further development are discussed.
NASA Technical Reports Server (NTRS)
Stetson, Howard K.; Frank, Jeremy; Cornelius, Randy; Haddock, Angie; Wang, Lui; Garner, Larry
2015-01-01
NASA is investigating a range of future human spaceflight missions, including both Mars-distance and Near Earth Object (NEO) targets. Of significant importance for these missions is the balance between crew autonomy and vehicle automation. As distance from Earth results in increasing communication delays, future crews need both the capability and authority to independently make decisions. However, small crews cannot take on all functions performed by ground today, and so vehicles must be more automated to reduce the crew workload for such missions. The Autonomous Mission Operations (AMO) project, funded by NASA's Advanced Exploration Systems Program, conducted an autonomous command and control experiment on-board the International Space Station that demonstrated single action intelligent procedures for crew command and control. The target problem was to enable crew initialization of a facility class rack with power and thermal interfaces, and involving core and payload command and telemetry processing, without support from ground controllers. This autonomous operations capability is enabling in scenarios such as initialization of a medical facility to respond to a crew medical emergency, and representative of other spacecraft autonomy challenges. The experiment was conducted using the Expedite the Processing of Experiments for Space Station (EXPRESS) rack 7, which was located in the Port 2 location within the U.S. Laboratory onboard the International Space Station (ISS). Activation and deactivation of this facility is time consuming and operationally intensive, requiring coordination of three flight control positions, 47 nominal steps, 57 commands, 276 telemetry checks, and coordination of multiple ISS systems (both core and payload). 
Utilization of Draper Laboratory's Timeliner software, deployed on-board the ISS within the Command and Control (C&C) computers and the Payload computers, allowed development of the automated procedures specific to ISS without having to certify and employ novel software for procedure development and execution. The procedures contained as much of the ground procedure logic and actions as possible, including fault detection and recovery capabilities.
A Role-Playing Game for a Software Engineering Lab: Developing a Product Line
ERIC Educational Resources Information Center
Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio
2012-01-01
Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…
Software Tools for Development on the Peregrine System
NREL High-Performance Computing
Cross-Platform Make and SCons are tools to build and manage software at the source code level. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.
Proceedings of the 19th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1994-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effects of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include this document.
Software Management Environment (SME): Components and algorithms
NASA Technical Reports Server (NTRS)
Hendrick, Robert; Kistler, David; Valett, Jon
1994-01-01
This document presents the components and algorithms of the Software Management Environment (SME), a management tool developed for the Software Engineering Branch (Code 552) of the Flight Dynamics Division (FDD) of the Goddard Space Flight Center (GSFC). The SME provides an integrated set of visually oriented, experience-based tools that can assist software development managers in managing and planning software development projects. This document describes and illustrates the analysis functions that underlie the SME's project monitoring, estimation, and planning tools. 'SME Components and Algorithms' is a companion reference to 'SME Concepts and Architecture' and 'Software Engineering Laboratory (SEL) Relationships, Models, and Management Rules.'
Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao
2015-09-01
This article describes new Web-based failure database software for orthopaedic implants. The software follows the B/S (browser/server) model, uses ASP dynamic web technology as its main development language to achieve data interactivity, and stores its data in a Microsoft Access database; these mature technologies make the software easy to extend and upgrade. The article presents the design and development approach, the software's workflow and functions, and its main technical features. With this software, many different types of orthopaedic implant failure events can be stored and the failure data statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedic implants and operations, and ultimately to guide doctors in improving clinical treatment.
NASA Astrophysics Data System (ADS)
Zelt, C. A.
2017-12-01
Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name recognition when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say that software is a tool, not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. 
All funded software should be available through a single web site, ideally maintained by someone in a funded position. Perhaps the biggest challenge is the reality that researchers who use software, as opposed to developing software, are more attractive university hires because they are more likely to be "big picture" scientists who publish in the highest profile journals, although sometimes the two go together.
OSIRIX: open source multimodality image navigation software
NASA Astrophysics Data System (ADS)
Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman
2005-04-01
The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need of high-end expensive hardware or software. We also elected to develop our system on new open source software libraries allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/
Proposing an Evidence-Based Strategy for Software Requirements Engineering.
Lindoerfer, Doris; Mansmann, Ulrich
2016-01-01
This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.
Agile Development Methods for Space Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay; Webster, Chris
2012-01-01
Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).
Effective Software Engineering Leadership for Development Programs
ERIC Educational Resources Information Center
Cagle West, Marsha
2010-01-01
Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…
Software Prototyping: Designing Systems for Users.
ERIC Educational Resources Information Center
Spies, Phyllis Bova
1983-01-01
Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…
Payload software technology: Software technology development plan
NASA Technical Reports Server (NTRS)
1977-01-01
Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.
Framework Based Guidance Navigation and Control Flight Software Development
NASA Technical Reports Server (NTRS)
McComas, David
2007-01-01
This viewgraph presentation describes NASA's guidance navigation and control flight software development background. The contents include: 1) NASA/Goddard Guidance Navigation and Control (GN&C) Flight Software (FSW) Development Background; 2) GN&C FSW Development Improvement Concepts; and 3) GN&C FSW Application Framework.
TMT approach to observatory software development process
NASA Astrophysics Data System (ADS)
Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder
2016-07-01
The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. 
The approach TMT is using to manage this multifaceted challenge combines: establishing an effective geographically distributed software team (Integrated Product Team), with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner, to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases and help mitigate subsystem interdependencies; and defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system have been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the observatory software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.
Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic
NASA Technical Reports Server (NTRS)
Leucht, Kurt W.; Semmel, Glenn S.
2008-01-01
The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
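The KSC prototype itself is not described in detail here, but the core idea of translating a high-level boolean rule specification into ladder-logic rungs can be sketched. Everything below is a hypothetical illustration: the rule syntax, the textual contact notation, and the function name are my own, not the actual tool's format.

```python
# Illustrative sketch only: translate a simple IF/THEN rule from a
# high-level spec into a textual ladder-logic rung. The rule grammar
# and rung notation are hypothetical, not the KSC tool's format.
import re

def rule_to_rung(rule):
    """Convert 'IF a AND NOT b THEN c' into a ladder rung string.

    Normally-open contact: -[ ]-   normally-closed: -[/]-   coil: -( )-
    """
    m = re.fullmatch(r"IF (.+) THEN (\w+)", rule.strip())
    if not m:
        raise ValueError(f"unrecognized rule: {rule}")
    conditions, output = m.group(1), m.group(2)
    contacts = []
    for term in conditions.split(" AND "):
        term = term.strip()
        if term.startswith("NOT "):
            contacts.append(f"-[/{term[4:]}]-")   # normally closed
        else:
            contacts.append(f"-[ {term} ]-")      # normally open
    return "|" + "".join(contacts) + f"-({output})-|"

print(rule_to_rung("IF press_ok AND NOT estop THEN open_valve"))
```

A real translator would of course target a PLC vendor's import format rather than ASCII art, but the shape of the problem (parse spec, map boolean terms to contacts, emit rungs) is the same.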
Software Assurance Curriculum Project Volume 2: Undergraduate Course Outlines
2010-08-01
Contents: an undergraduate curriculum focus on software assurance, with course outlines for Computer Science I and II. The courses present assurance practices that build confidence and can be integrated into traditional software development and acquisition process models; thus, in addition to a technology focus, they cover testing throughout the software development life cycle (SDLC) and the security and complexity challenges of system development, including security failures.
Modular Infrastructure for Rapid Flight Software Development
NASA Technical Reports Server (NTRS)
Pires, Craig
2010-01-01
This slide presentation reviews the use of a modular infrastructure to assist in the development of flight software. A feature of this program is the use of a model-based approach for application-unique software. Two programs on which this approach was used are reviewed: the development of software for the Hover Test Vehicle (HTV), and the Lunar Atmosphere and Dust Environment Experiment (LADEE).
Emerging Software Development and Acquisition Approaches: Panacea or Villain
2011-05-16
2010 Carnegie Mellon University, Software Engineering Institute.
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development, based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to the performance of COCOMO II, linear regression, and k-nearest neighbor prediction models on the same data set.
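The flavor of analogy-based estimation with k-nearest neighbors can be sketched in a few lines. This is a hedged illustration in the spirit of the abstract, not the actual NASA model: the single log-size feature, the distance metric, and the historical data are all made up.

```python
# Hedged sketch of analogy-based effort estimation via k-nearest
# neighbors. Feature set, distance metric, and data are illustrative,
# not the NASA Software Cost Model's actual formulation.
import math

# Hypothetical historical analogues: (KSLOC, effort in staff-months).
history = [(10, 30), (12, 40), (50, 200), (55, 230), (100, 520)]

def estimate_effort(ksloc, k=2):
    """Average the effort of the k projects closest in log(size)."""
    ranked = sorted(history,
                    key=lambda p: abs(math.log(p[0]) - math.log(ksloc)))
    nearest = ranked[:k]
    return sum(effort for _, effort in nearest) / k

# A 52-KSLOC project draws on the 50- and 55-KSLOC analogues.
print(estimate_effort(52))
```

Real analogy models use many features (domain, team, heritage), normalized distances, and cluster-local calibration, but the nearest-analogue averaging step looks essentially like this.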
Recommended approach to software development, revision 3
NASA Technical Reports Server (NTRS)
Landis, Linda; Waligora, Sharon; Mcgarry, Frank; Pajerski, Rose; Stark, Mike; Johnson, Kevin Orlin; Cover, Donna
1992-01-01
Guidelines for an organized, disciplined approach to software development, based on studies conducted by the Software Engineering Laboratory (SEL) since 1976, are presented. The document describes methods and practices for each phase of a software development life cycle that starts with requirements definition and ends with acceptance testing. For each defined life-cycle phase, guidelines are presented for the development process and its management, and for the products produced and their reviews.
NASA Astrophysics Data System (ADS)
Riopel, Martin
To make science laboratory sessions more instructive, we have developed a learning environment that allows students enrolled in a mechanics course at college or university level to engage in a scientific modeling process by combining computer-simulated experimentation and microcomputer-based laboratories. The main goal is to assist and facilitate both inductive and deductive reasoning. Within this computer application, each action can also be automatically recorded and identified while the student is using the software. The most original part of the environment lets the student compare the simulated animation with the real video by superposing the images. We used the software with students and observed that they effectively engaged in a modeling process that included both inductive and deductive reasoning. We also observed that the students were able to use the software to produce adequate answers to questions concerning both previously taught and new theoretical concepts in physics. The students completed the experiment about twice as fast as usual and considered that using the software resulted in a better understanding of the phenomenon. We conclude that this use of the computer in science education can broaden the range of possibilities for learning and for teaching, and can provide new avenues for researchers, who can use it to record and study students' paths of reasoning. We also believe that it would be interesting to investigate further some of the benefits associated with this environment, particularly the acceleration effect, the improvement of students' reasoning, and the equilibrium between induction and deduction that we observed within this research.
Onboard Sensor Data Qualification in Human-Rated Launch Vehicles
NASA Technical Reports Server (NTRS)
Wong, Edmond; Melcher, Kevin J.; Maul, William A.; Chicatelli, Amy K.; Sowers, Thomas S.; Fulton, Christopher; Bickford, Randall
2012-01-01
The avionics system software for human-rated launch vehicles requires an implementation approach that is robust to failures, especially the failure of sensors used to monitor vehicle conditions that might result in an abort determination. Sensor measurements provide the basis for operational decisions on human-rated launch vehicles. These data are often used to assess the health of system or subsystem components, to identify failures, and to take corrective action. An incorrect conclusion and/or response may result if the sensor itself provides faulty data, or if the data provided by the sensor have been corrupted. Operational decisions based on faulty sensor data have the potential to be catastrophic, resulting in loss of mission or loss of crew. To prevent these latter situations from occurring, a Modular Architecture and Generalized Methodology for Sensor Data Qualification in Human-rated Launch Vehicles has been developed. Sensor Data Qualification (SDQ) is a set of algorithms that can be implemented in onboard flight software, and can be used to qualify data obtained from flight-critical sensors before the data are used by other flight software algorithms. Qualified data have been analyzed by SDQ and determined to be a true representation of the sensed system state; that is, the sensor data are determined not to be corrupted by sensor faults or signal transmission faults. Sensor data can become corrupted by faults at any point in the signal path between the sensor and the flight computer. Qualifying the sensor data ensures that erroneous data are identified and flagged before otherwise being used for operational decisions, thus increasing confidence in the response of the other flight software processes using the qualified data, and decreasing the probability of false alarms or missed detections.
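Two generic checks of the kind a qualification layer might apply are a range (limit) check and a rate-of-change check. The sketch below is my own illustration of that idea, not the SDQ algorithms themselves; all thresholds and names are hypothetical.

```python
# Illustrative sketch of two generic data-qualification checks:
# a range (limit) check and a rate-of-change check. Not the actual
# SDQ algorithms; thresholds and names are hypothetical.
def qualify(samples, lo, hi, max_step):
    """Return (value, ok) per sample; ok is False when the value is
    out of limits or jumps more than max_step from the last qualified
    value (an implausible jump suggests a transmission fault)."""
    result = []
    prev = None
    for v in samples:
        ok = lo <= v <= hi
        if ok and prev is not None and abs(v - prev) > max_step:
            ok = False  # rate-of-change check failed
        result.append((v, ok))
        if ok:
            prev = v  # track the last *qualified* value only
    return result

# The 99.0 spike fails the limit check; its neighbors stay qualified.
print(qualify([20.0, 20.5, 99.0, 21.0], lo=0.0, hi=50.0, max_step=5.0))
```

Comparing each new sample against the last qualified value, rather than the last raw value, keeps one corrupted sample from disqualifying the healthy readings that follow it.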
NASA Astrophysics Data System (ADS)
1992-06-01
The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station, and in particular how well NASA has implemented key software engineering practices for the station. Specifically, the objectives were to determine: (1) whether independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) whether NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) whether software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet the objectives, NASA proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent V&V and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed between April 1991 and May 1992, in accordance with generally accepted government auditing standards.
NASA Technical Reports Server (NTRS)
1989-01-01
001 is an integrated tool suite for automatically developing ultra-reliable models, simulations, and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking, and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete, and has no data or control flow errors. Systems can be designed, developed, and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.
NASA Technical Reports Server (NTRS)
Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo
1992-01-01
This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. The report also describes how STM can be used to integrate contractor CM and formal CM for software before delivery to operations. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.
Towards Archetypes-Based Software Development
NASA Astrophysics Data System (ADS)
Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak
We present a framework for the archetypes-based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain-specific models that are utilized by ABD. The focus of ABD is on software factories: family-based development artefacts (domain-specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group at the Leeds Institute of Molecular Medicine, University of Leeds.
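The archetype idea can be made concrete with a tiny sketch. The "Party" archetype, a recurring model of any identifiable person or organization, is common in the archetype-pattern literature; the class and field names below follow that general usage and are not necessarily the ABD framework's exact model.

```python
# Minimal sketch of an archetype pattern: "Party" models any person or
# organization that recurs across business domains, and "PartyRole"
# captures the roles a Party plays. Names follow common archetype-
# pattern literature, not necessarily the ABD framework's own model.
from dataclasses import dataclass, field

@dataclass
class Party:
    """Archetype: an identifiable person or organization."""
    name: str
    roles: list = field(default_factory=list)

@dataclass
class PartyRole:
    """Archetype: a role a Party plays in some context."""
    kind: str
    party: Party

# The same Party archetype serves a LIMS as well as, say, a bank.
lab = Party("Clinical and Biomedical Proteomics Group")
lab.roles.append(PartyRole("LIMS user", lab))
print(lab.name, [r.kind for r in lab.roles])
```

Because the same small set of archetypes recurs across domains, a software factory can generate much of a family member's model from such patterns rather than modeling each system from scratch.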
Children's Musical Empowerment in Two Composition Task Designs
ERIC Educational Resources Information Center
Bucura, Elizabeth; Weissberg, JulieAnne
2017-01-01
The purpose of this study was to investigate elementary students' creating processes and perspectives through composition. Two fourth-grade classes took part in this action research, which consisted of creating four compositions--two with acoustic instruments and two with computer software. For each of the two sound sources, the first composition…
Logistics hardware and services control system
NASA Technical Reports Server (NTRS)
Koromilas, A.; Miller, K.; Lamb, T.
1973-01-01
Software system permits onsite direct control of logistics operations, which include spare parts, initial installation, tool control, and repairable parts status and control, through all facets of operations. System integrates logistics actions; controls receipts, issues, loans, repairs, fabrications, and modifications; and assists in predicting and allocating logistics parts and services effectively.
Enabling Creative Learning Design through Semantic Technologies
ERIC Educational Resources Information Center
Charlton, Patricia; Magoulas, George; Laurillard, Diana
2012-01-01
The paper advocates an approach to learning design that considers it as creating digital artefacts that can be extended, modified and used for different purposes. This is realised through an "act becoming artefact" cycle, where users' actions in the authors' software environment, named Learning Designer, are automatically interpreted on…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-14
... Initial Determination Granting Complainants' Unopposed Motion for Leave To Amend the Complaint and Notice of Investigation AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...' unopposed motion for leave to amend the complaint and notice of investigation. FOR FURTHER INFORMATION...
Reducing Plagiarism by Using Online Software: An Experimental Study
ERIC Educational Resources Information Center
Kose, Ozgur; Arikan, Arda
2011-01-01
This action research attempts to explore the perceptions of Turkish university students on plagiarism while evaluating the effectiveness of an online application used to deter plagiarism. The participants were 40 first year university students studying in two different sections of an academic writing class. The findings show that the participants…
Camera! Action! Collaborate with Digital Moviemaking
ERIC Educational Resources Information Center
Swan, Kathleen Owings; Hofer, Mark; Levstik, Linda S.
2007-01-01
Broadly defined, digital moviemaking integrates a variety of media (images, sound, text, video, narration) to communicate with an audience. There is near-ubiquitous access to the necessary software (MovieMaker and iMovie are bundled free with their respective operating systems) and hardware (computers with Internet access, digital cameras, etc.).…
The cost of software fault tolerance
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1982-01-01
The proposed use of software fault tolerance techniques is examined, both as a means of reducing software costs in avionics and as a means of addressing system unreliability due to software faults. A model is developed to provide a view of the relationships among cost, redundancy, and reliability, which suggests unconventional strategies for software development and maintenance.
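The abstract does not give the model itself, but the basic cost/redundancy/reliability tension it studies can be illustrated with a toy calculation of my own: under the (strong, often violated) assumption that the versions in an N-version scheme fail independently with probability p, system failure probability shrinks geometrically while development cost grows roughly linearly.

```python
# Toy illustration (not the paper's model) of the N-version trade-off:
# assuming independent version failures with probability p, the system
# fails only when all n versions fail, so p_system = p**n, while
# development cost grows roughly linearly with n.
def tradeoff(p, unit_cost, n_versions):
    return {"cost": n_versions * unit_cost,
            "p_system_failure": p ** n_versions}

for n in (1, 2, 3):
    print(n, tradeoff(p=0.01, unit_cost=100, n_versions=n))
```

The independence assumption is the model's weak point in practice: empirically, independently written versions tend to share failure modes on hard inputs, which is exactly why more careful models of this relationship are needed.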
Are Earth System model software engineering practices fit for purpose? A case study.
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Johns, T. C.
2009-04-01
We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects: self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.