Software Past, Present, and Future: Views from Government, Industry and Academia
NASA Technical Reports Server (NTRS)
Holcomb, Lee; Page, Jerry; Evangelist, Michael
2000-01-01
Views from the NASA CIO at the NASA Software Engineering Workshop on software development past, present, and future are presented. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry (Standish Group's 1994 Report); 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.
NASA Technical Reports Server (NTRS)
Voigt, S. (Editor); Beskenis, S. (Editor)
1985-01-01
Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.
The software product assurance metrics study: JPL's software systems quality and productivity
NASA Technical Reports Server (NTRS)
Bush, Marilyn W.
1989-01-01
The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.
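Treating the quoted totals as round numbers (about four million lines of code, 28,000 person-months, and roughly a third of a billion dollars), the implied aggregate ratios can be checked with a few lines of arithmetic. The figures below are a back-of-envelope illustration derived from those approximate totals, not results reported by the study.

    # Illustrative back-of-envelope ratios from the approximate totals quoted
    # in the JPL/SPA metrics study abstract (not figures from the study itself).
    total_loc = 4_000_000         # "over four million lines of code"
    total_person_months = 28_000  # "28,000 person months"
    total_cost_dollars = 333e6    # "over a third of a billion dollars" (approx.)

    loc_per_person_month = total_loc / total_person_months
    cost_per_loc = total_cost_dollars / total_loc

    print(f"~{loc_per_person_month:.0f} lines of code per person-month")
    print(f"~${cost_per_loc:.0f} per line of code")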
Educational Software: A Developer's Perspective.
ERIC Educational Resources Information Center
Armstrong, Timothy C.; Loane, Russell F.
1994-01-01
Examines the current status and short-term future of computer software development in higher education. Topics discussed include educational advantages of software; current program development techniques, including object oriented programming; and market trends, including IBM versus Macintosh and multimedia programs. (LRW)
NASA Technical Reports Server (NTRS)
1976-01-01
A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.
Software production methodology testbed project
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.
The VLBI Data Analysis Software νSolve: Development Progress and Plans for the Future
NASA Astrophysics Data System (ADS)
Bolotin, S.; Baver, K.; Gipson, J.; Gordon, D.; MacMillan, D.
2014-12-01
The program νSolve is a part of the CALC/SOLVE VLBI data analysis system. It is a replacement for interactive SOLVE, the part of CALC/SOLVE that is used for preliminary data analysis of new VLBI sessions. νSolve is completely new software. It is written in C++ and has a modern graphical user interface. In this article we present the capabilities of the software, its current status, and our plans for future development.
Development of N-version software samples for an experiment in software fault tolerance
NASA Technical Reports Server (NTRS)
Lauterbach, L.
1987-01-01
The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.
Architectural Implementation of NASA Space Telecommunications Radio System Specification
NASA Technical Reports Server (NTRS)
Peters, Kenneth J.; Lux, James P.; Lang, Minh; Duncan, Courtney B.
2012-01-01
This software demonstrates a working implementation of the NASA STRS (Space Telecommunications Radio System) architecture specification. This is a developing specification of software architecture and required interfaces to provide commonality among future NASA and commercial software-defined radios for space, and allow for easier mixing of software and hardware from different vendors. It provides required functions, and supports interaction with STRS-compliant simple test plug-ins ("waveforms"). All of it is programmed in "plain C," except where necessary to interact with C++ plug-ins. It offers a small footprint, suitable for use in JPL radio hardware. Future NASA work is expected to develop into fully capable software-defined radios for use on the space station, other space vehicles, and interplanetary probes.
Software Development in the Water Sciences: a view from the divide (Invited)
NASA Astrophysics Data System (ADS)
Miles, B.; Band, L. E.
2013-12-01
While statistical methods are an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research, scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.
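A minimal, hypothetical illustration of the test-driven practice mentioned above: the test encodes the expected behavior of a small, well-specified function and is written before (or alongside) the implementation. The function and test below are invented for illustration and are not part of RHESSys or the WSSI effort.

    # Minimal illustration of a test-driven workflow (hypothetical example, not RHESSys code):
    # the test states the expected behavior that the implementation must satisfy.
    def canopy_interception(precip_mm, storage_capacity_mm):
        """Return the depth of rainfall intercepted by the canopy, capped at capacity."""
        if precip_mm < 0 or storage_capacity_mm < 0:
            raise ValueError("inputs must be non-negative")
        return min(precip_mm, storage_capacity_mm)

    def test_canopy_interception():
        assert canopy_interception(0.0, 2.0) == 0.0     # no rain, nothing intercepted
        assert canopy_interception(1.5, 2.0) == 1.5     # below capacity: all intercepted
        assert canopy_interception(10.0, 2.0) == 2.0    # above capacity: capped

    if __name__ == "__main__":
        test_canopy_interception()
        print("all tests passed")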
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGMs); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. These releases exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Our analysis thus provides critical insights and inputs to management for improving the software development process. As NASA has moved toward product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and the analysis results for this mission can therefore be considered a baseline for future flight software missions. PMID:29278255
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac
2016-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGMs); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. These releases exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Our analysis thus provides critical insights and inputs to management for improving the software development process. As NASA has moved toward product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and the analysis results for this mission can therefore be considered a baseline for future flight software missions.
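For readers unfamiliar with SRGM fitting, the approach described above can be sketched as fitting a mean value function to cumulative defect counts. The example below uses one common parameterization of the delayed S-shaped NHPP model, m(t) = a(1 - (1 + bt)e^(-bt)), with synthetic data; it is not the mission defect data or the authors' exact fitting procedure.

    # Sketch of fitting a delayed S-shaped NHPP SRGM, m(t) = a*(1 - (1 + b*t)*exp(-b*t)),
    # to cumulative defect counts. The data below are synthetic placeholders, not the
    # mission defect reports analyzed in the paper.
    import numpy as np
    from scipy.optimize import curve_fit

    def s_shaped_mvf(t, a, b):
        return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

    t = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)          # test weeks
    cum_defects = np.array([2, 7, 15, 24, 32, 38, 42, 45, 47, 48], dtype=float)

    (a_hat, b_hat), _ = curve_fit(s_shaped_mvf, t, cum_defects, p0=[50.0, 0.5])
    print(f"estimated total defects a = {a_hat:.1f}, shape b = {b_hat:.2f}")
    print(f"predicted cumulative defects at week 12: {s_shaped_mvf(12.0, a_hat, b_hat):.1f}")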
Design, Fabrication, and Testing of a Hopper Spacecraft Simulator
NASA Astrophysics Data System (ADS)
Mucasey, Evan Phillip Krell
A robust test bed is needed to facilitate future development of guidance, navigation, and control software for vehicles capable of vertical takeoff and landing. Specifically, this work aims to develop both a hardware and a software simulator that can be used for future flight software development for extra-planetary vehicles. To achieve the program requirements of a high thrust-to-weight ratio with large payload capability, the vehicle is designed around a novel combination of electric motors and a micro jet engine acting as the propulsion elements. The spacecraft simulator underwent several iterations of hardware development using different materials and fabrication methods. The final design used a combination of carbon fiber and fiberglass, cured under vacuum, as the frame of the vehicle, providing a strong, lightweight platform for all flight components and future payloads. The vehicle also uses an open-source software development platform, Arduino, as the initial flight computer and has onboard accelerometers, gyroscopes, and magnetometers to sense the vehicle's attitude. To prevent instability due to noise, a polynomial Kalman filter was designed; it feeds the sensed angles and rates into a robust attitude controller that autonomously controls the vehicle's yaw, pitch, and roll angles. In addition to the hardware development of the vehicle itself, both a software simulation and a real-time data acquisition interface were written in MATLAB/SIMULINK so that real flight data could be taken and then correlated to the simulation to prove the accuracy of the analytical model. As a result, the full-scale vehicle was designed and flown outside of the lab environment, and the data showed that the software model accurately predicted the flight dynamics of the vehicle.
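The sensor-fusion step described above can be illustrated with a minimal scalar Kalman filter that blends an integrated gyro rate with an accelerometer-derived angle. This sketch is illustrative only; it is not the vehicle's Arduino flight code or its MATLAB/SIMULINK model, and the noise parameters and sample data are assumed.

    # Minimal sketch of a scalar Kalman filter fusing a gyro rate with an
    # accelerometer-derived pitch angle -- illustrative only, not the vehicle's
    # actual flight software.
    import numpy as np

    def kalman_pitch(gyro_rates, accel_angles, dt, q=0.01, r=0.5):
        """Estimate pitch by predicting with the gyro rate and correcting with the
        accelerometer angle. q: process noise variance, r: measurement noise variance."""
        angle, p = 0.0, 1.0
        estimates = []
        for rate, meas in zip(gyro_rates, accel_angles):
            # Predict: integrate the gyro rate.
            angle += rate * dt
            p += q
            # Update: correct toward the accelerometer measurement.
            k = p / (p + r)
            angle += k * (meas - angle)
            p *= (1.0 - k)
            estimates.append(angle)
        return np.array(estimates)

    # Example: a noisy 10-degree pitch-up maneuver sampled at 100 Hz.
    dt = 0.01
    true_rate = np.full(200, 5.0)                                       # deg/s
    accel_meas = np.cumsum(true_rate) * dt + np.random.normal(0, 2.0, 200)
    print(kalman_pitch(true_rate, accel_meas, dt)[-5:])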
Engineering software development with HyperCard
NASA Technical Reports Server (NTRS)
Darko, Robert J.
1990-01-01
The successful and unsuccessful techniques used in the development of software using HyperCard are described. The viability of the HyperCard for engineering is evaluated and the future use of HyperCard by this particular group of developers is discussed.
Software development environments: Present and future, appendix D
NASA Technical Reports Server (NTRS)
Riddle, W. E.
1980-01-01
Computerized environments which facilitate the development of appropriately functioning software systems are discussed. Their current status is reviewed and several trends exhibited by their history are identified. A number of principles, some at (slight) variance with the historical trends, are suggested and it is argued that observance of these principles is critical to achieving truly effective and efficient software development support environments.
Future Software Sizing Metrics and Estimation Challenges
2011-07-01
Topics recoverable from the indexed fragments of this paper include ultrahigh software system assurance, legacy maintenance and brownfield development, and agile and lean/Kanban development; the fragments also note that estimates are refined as the design of the maintenance modifications or brownfield re-engineering is determined, and that the difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban.
The Environmental Control and Life Support System (ECLSS) advanced automation project
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.; Carnes, Ray
1990-01-01
The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is to capture ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and life cycle tools. Automated knowledge acquisition, engineering, verification, and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.
2012-03-13
Recoverable fragments of the indexed text list chapter sections on legacy maintenance and brownfield development, agile and Kanban development, and putting it all together at the large-project or enterprise level, and identify trends including non-developmental item (NDI)-intensive systems, ultrahigh software system assurance, legacy maintenance and brownfield development, and agile and Kanban development.
Pricing Software and Information on CD-ROM.
ERIC Educational Resources Information Center
Gibbins, Patrick
1987-01-01
Examines the relationships between purchases of optical data disk products, publishers, and software suppliers. The discussion covers current pricing strategies for optical data disk software and information products, and possible future developments in marketing and pricing. (CLB)
Díaz-Zuccarini, V.; Narracott, A.J.; Burriesci, G.; Zervides, C.; Rafiroiu, D.; Jones, D.; Hose, D.R.; Lawford, P.V.
2009-01-01
This paper describes the use of diverse software tools in cardiovascular applications. These tools were primarily developed in the field of engineering and the applications presented push the boundaries of the software to address events related to venous and arterial valve closure, exploration of dynamic boundary conditions or the inclusion of multi-scale boundary conditions from protein to organ levels. The future of cardiovascular research and the challenges that modellers and clinicians face from validation to clinical uptake are discussed from an end-user perspective. PMID:19487202
Díaz-Zuccarini, V; Narracott, A J; Burriesci, G; Zervides, C; Rafiroiu, D; Jones, D; Hose, D R; Lawford, P V
2009-07-13
This paper describes the use of diverse software tools in cardiovascular applications. These tools were primarily developed in the field of engineering and the applications presented push the boundaries of the software to address events related to venous and arterial valve closure, exploration of dynamic boundary conditions or the inclusion of multi-scale boundary conditions from protein to organ levels. The future of cardiovascular research and the challenges that modellers and clinicians face from validation to clinical uptake are discussed from an end-user perspective.
An Open Avionics and Software Architecture to Support Future NASA Exploration Missions
NASA Technical Reports Server (NTRS)
Schlesinger, Adam
2017-01-01
The presentation describes an avionics and software architecture that has been developed through NASA's Advanced Exploration Systems (AES) division. The architecture is open-source, highly reliable with fault tolerance, and utilizes standard capabilities and interfaces, which are scalable and customizable to support future exploration missions. Specific focus areas of discussion will include command and data handling, software, human interfaces, communication and wireless systems, and systems engineering and integration.
NASA Technical Reports Server (NTRS)
Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.
2012-01-01
The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for a non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer which provides the transparency of the software application layers to underlying hardware regardless of test facility location and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis' development and deployment.
Software development: A paradigm for the future
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1989-01-01
A new paradigm for software development that treats software development as an experimental activity is presented. It provides built-in mechanisms for learning how to develop software better and reusing previous experience in the forms of knowledge, processes, and products. It uses models and measures to aid in the tasks of characterization, evaluation and motivation. An organization scheme is proposed for separating the project-specific focus from the organization's learning and reuse focuses of software development. The implications of this approach for corporations, research and education are discussed and some research activities currently underway at the University of Maryland that support this approach are presented.
eXascale PRogramming Environment and System Software (XPRESS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Barbara; Gabriel, Edgar
Exascale systems, with a thousand times the compute capacity of today’s leading-edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state of the art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future; B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems; and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.
Looking to 2050: The USGS Integrated Software for Imagers and Spectrometers (ISIS)
NASA Astrophysics Data System (ADS)
Becker, T. L.; Edmundson, K. L.; Sides, S.; Hare, T. M.; Laura, J. R.
2017-02-01
The Astrogeology Science Center develops and maintains the Integrated Software for Imagers and Spectrometers (ISIS) in support of planetary data for a diverse set of missions. We plan to provide support into the future while adapting to changes in hardware, software, and science requirements.
Automation of the Environmental Control and Life Support System
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.; Carnes, J. Ray
1990-01-01
The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective is capturing ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: (1) analyze and document the baselined ECLS system; (2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline; and (3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary to achieve minimal manpower support for ECLSS operations. In addition, Phase I included development of an advanced software life cycle plan in preparation for Phases II and III, the development and integration phases, respectively. Automated knowledge acquisition, engineering, verification, and testing tools will be used in the development of the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.
Firing Room Remote Application Software Development
NASA Technical Reports Server (NTRS)
Liu, Kan
2015-01-01
The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of the Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and check out LACC software for Launch Equipment Test Facility (LETF) testing.
Woynaroski, Tiffany; Oller, D. Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-01-01
Theory and research suggest that vocal development predicts “useful speech” in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently “in development” and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. PMID:27459107
INTERIM -- Starlink Software Environment
NASA Astrophysics Data System (ADS)
Pearce, Dave; Pavelin, Cliff; Lawden, M. D.
Early versions of this paper were based on a number of other papers produced at a very early stage of the Starlink project. They contained a description of a specific implementation of a subroutine library, speculations on the desirable attributes of a software environment, and future development plans. They reflected the experimental nature of the Starlink software environment at that time. Since then, the situation has changed. The implemented subroutine library, INTERIM_DIR:INTERIM.OLB, is now a well established and widely used piece of software. A completely new Starlink software environment (ADAM) has been developed and distributed. Thus the library released in 1980 as `STARLINK' and now called `INTERIM' has reached the end of its development cycle and is now frozen in its current state, apart from bug corrections. This paper has, therefore, been completely rewritten and restructured to reflect the new situation. Its aim is to describe the facilities of the INTERIM subroutine library as clearly and concisely as possible. It avoids speculation, discussion of design decisions, and announcements of future plans.
Assessment Environment for Complex Systems Software Guide
NASA Technical Reports Server (NTRS)
2013-01-01
This Software Guide (SG) describes the software developed to test the Assessment Environment for Complex Systems (AECS) by the West Virginia High Technology Consortium (WVHTC) Foundation's Mission Systems Group (MSG) for the National Aeronautics and Space Administration (NASA) Aeronautics Research Mission Directorate (ARMD). This software is referred to as the AECS Test Project throughout the remainder of this document. AECS provides a framework for developing, simulating, testing, and analyzing modern avionics systems within an Integrated Modular Avionics (IMA) architecture. The purpose of the AECS Test Project is twofold. First, it provides a means to test the AECS hardware and system developed by MSG. Second, it provides an example project upon which future AECS research may be based. This Software Guide fully describes building, installing, and executing the AECS Test Project as well as its architecture and design. The design of the AECS hardware is described in the AECS Hardware Guide. Instructions on how to configure, build and use the AECS are described in the User's Guide. Sample AECS software, developed by the WVHTC Foundation, is presented in the AECS Software Guide. The AECS Hardware Guide, AECS User's Guide, and AECS Software Guide are authored by MSG. The requirements set forth for AECS are presented in the Statement of Work for the Assessment Environment for Complex Systems authored by NASA Dryden Flight Research Center (DFRC). The intended audience for this document includes software engineers, hardware engineers, project managers, and quality assurance personnel from WVHTC Foundation (the suppliers of the software), NASA (the customer), and future researchers (users of the software). Readers are assumed to have general knowledge in the field of real-time, embedded computer software development.
Advanced software integration: The case for ITV facilities
NASA Technical Reports Server (NTRS)
Garman, John R.
1990-01-01
The array of technologies and methodologies involved in the development and integration of avionics software has moved almost as rapidly as computer technology itself. Future avionics systems involve major advances and risks in the following areas: (1) Complexity; (2) Connectivity; (3) Security; (4) Duration; and (5) Software engineering. From an architectural standpoint, the systems will be much more distributed, involve session-based user interfaces, and have the layered architectures typified in the layers-of-abstraction concepts popular in networking. Typified in the NASA Space Station Freedom will be the highly distributed nature of software development itself. Systems composed of independent components developed in parallel must be bound by rigid standards and interfaces, and by clean requirements and specifications. Avionics software provides a challenge in that it cannot be flight-tested until the first time it literally flies. It is the binding of requirements for such an integration environment into the advances and risks of future avionics systems that forms the basis of the presented concept and the basic Integration, Test, and Verification concept within the development and integration life cycle of Space Station Mission and Avionics systems.
NASA Technical Reports Server (NTRS)
Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, PeggL.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.
2012-01-01
The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for a non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer which provides the transparency of the software application layers to underlying hardware regardless of test facility location and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis' development and deployment.
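The translation-layer design described above can be pictured with a minimal, hypothetical sketch: application code is written against one abstract channel interface, and facility-specific backends hide the underlying hardware. The class and method names below are illustrative assumptions, not part of the actual Stennis DAS software.

    # Conceptual sketch of a "translation layer": application code talks to one
    # abstract interface, and facility-specific backends hide the underlying hardware.
    # All names here are hypothetical.
    from abc import ABC, abstractmethod

    class DaqChannel(ABC):
        @abstractmethod
        def read_sample(self) -> float:
            """Return one engineering-unit sample from the channel."""

    class FacilityADaqChannel(DaqChannel):
        def read_sample(self) -> float:
            return 101.3   # would call facility A's vendor driver here

    class FacilityBDaqChannel(DaqChannel):
        def read_sample(self) -> float:
            return 101.1   # would call facility B's vendor driver here

    def record_pressure(channel: DaqChannel) -> float:
        # Application layer: identical regardless of which test facility supplied the channel.
        return channel.read_sample()

    print(record_pressure(FacilityADaqChannel()), record_pressure(FacilityBDaqChannel()))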
Report on the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3)
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Choi, Sou-Cheng T.; Niemeyer, Kyle E.; Hetherington, James; Löffler, Frank; Gunter, Dan; Idaszak, Ray; Brandt, Steven R.; Miller, Mark A.; Gesing, Sandra; Jones, Nick D.; Weber, Nic; Marru, Suresh; Allen, Gabrielle; Penzenstadler, Birgit; Venters, Colin C.; Davis, Ethan; Hwang, Lorraine; Todorov, Ilian; Patra, Abani; de Val-Borro, Miguel
2016-02-01
This report records and discusses the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3). The report includes a description of the keynote presentation of the workshop, which served as an overview of sustainable scientific software. It also summarizes a set of lightning talks in which speakers highlighted to-the-point lessons and challenges pertaining to sustaining scientific software. The final and main contribution of the report is a summary of the discussions, future steps, and future organization for a set of self-organized working groups on topics including developing pathways to funding scientific software; constructing useful common metrics for crediting software stakeholders; identifying principles for sustainable software engineering design; reaching out to research software organizations around the world; and building communities for software sustainability. For each group, we include a point of contact and a landing page that can be used by those who want to join that group's future activities. The main challenge left by the workshop is to see if the groups will execute these activities that they have scheduled, and how the WSSSPE community can encourage this to happen.
GERICOS: A Generic Framework for the Development of On-Board Software
NASA Astrophysics Data System (ADS)
Plasson, P.; Cuomo, C.; Gabriel, G.; Gauthier, N.; Gueguen, L.; Malac-Allain, L.
2016-08-01
This paper presents an overview of the GERICOS framework (GEneRIC Onboard Software), its architecture, its various layers and its future evolutions. The GERICOS framework, developed and qualified by LESIA, offers a set of generic, reusable and customizable software components for the rapid development of payload flight software. The GERICOS framework has a layered structure. The first layer (GERICOS::CORE) implements the concept of active objects and forms an abstraction layer over the top of real-time kernels. The second layer (GERICOS::BLOCKS) offers a set of reusable software components for building flight software based on generic solutions to recurrent functionalities. The third layer (GERICOS::DRIVERS) implements software drivers for several COTS IP cores of the LEON processor ecosystem.
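The active-object abstraction that GERICOS::CORE provides can be pictured with a minimal sketch: each active object owns a thread and a mailbox, and clients post messages instead of calling methods directly. The sketch below is conceptual Python only; the actual framework is C++ running over a real-time kernel, and none of these names come from GERICOS.

    # Conceptual sketch of the active-object pattern (illustration only, not GERICOS code).
    import queue, threading

    class ActiveObject:
        """An object with its own thread and message queue; callers post requests
        asynchronously instead of invoking methods directly."""
        def __init__(self):
            self._mailbox = queue.Queue()
            self._thread = threading.Thread(target=self._run, daemon=True)
            self._thread.start()

        def post(self, message):
            self._mailbox.put(message)

        def _run(self):
            while True:
                message = self._mailbox.get()
                if message is None:          # sentinel used to stop the thread
                    break
                self.handle(message)

        def handle(self, message):
            print(f"handling {message!r}")

        def stop(self):
            self._mailbox.put(None)
            self._thread.join()

    housekeeping = ActiveObject()
    housekeeping.post("acquire telemetry frame")
    housekeeping.stop()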
ERIC Educational Resources Information Center
Drachova-Strang, Svetlana V.
2013-01-01
As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for…
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and the ways in which the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA are described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Space Shuttle Software Development and Certification
NASA Technical Reports Server (NTRS)
Orr, James K.; Henderson, Johnnie A
2000-01-01
Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.
Collaboration Strategies to Reduce Technical Debt
ERIC Educational Resources Information Center
Miko, Jeffrey Allen
2017-01-01
Inadequate software development collaboration processes can allow technical debt to accumulate, increasing future maintenance costs and the chance of system failures. The purpose of this qualitative case study was to explore collaboration strategies that software development leaders use to reduce the amount of technical debt created by software…
Social network analyzer on the example of Twitter
NASA Astrophysics Data System (ADS)
Gorodetskaia, Mariia; Khruslova, Diana
2017-09-01
Social networks are powerful sources of data due to their popularity, and Twitter is one of the networks providing a large amount of data. There is a need to collect these data for future use in fields ranging from linguistics to SMM and marketing. The report examines existing software solutions and provides new ones, includes information about the software developed, and lists some planned future features.
Automated Software Development Workstation (ASDW)
NASA Technical Reports Server (NTRS)
Fridge, Ernie
1990-01-01
Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.
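The parts-composition idea behind systems like the ASDW can be sketched minimally: a catalog of reusable parts, a composition specification produced by the editor, and a generator that emits a new application. Everything in the sketch below (part names, catalog format, generator) is hypothetical and is not the ASDW itself.

    # Minimal sketch of a parts-composition system: a catalog of reusable parts,
    # a specification that wires selected parts together, and a generator that
    # emits code for the new application. Names and formats are hypothetical.
    catalog = {
        "read_sensor":  "def read_sensor():\n    return 42.0\n",
        "log_value":    "def log_value(x):\n    print('value =', x)\n",
    }

    composition_spec = ["read_sensor", "log_value"]   # parts selected in the editor

    def generate_application(spec, catalog):
        parts = "\n".join(catalog[name] for name in spec)
        main = "def main():\n    log_value(read_sensor())\n\nmain()\n"
        return parts + "\n" + main

    generated_source = generate_application(composition_spec, catalog)
    exec(generated_source)    # prints: value = 42.0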
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
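As a rough illustration of the kind of comparison the SME supports, the sketch below checks an ongoing project's cumulative measure against a model of the "typical" project and flags weekly deviations beyond a tolerance band. The baseline model, tolerance, and all numbers are hypothetical, not SEL data.

    # Illustrative sketch: compare a project's cumulative effort against the
    # environment's "typical project" baseline plus a tolerance band.
    # All numbers here are hypothetical.
    import numpy as np

    weeks = np.arange(1, 11)
    baseline_effort = 40 * weeks            # typical cumulative staff-hours per week
    tolerance = 0.15                        # +/-15% band around the baseline
    project_effort = np.array([42, 85, 130, 175, 230, 290, 360, 430, 520, 610])

    for week, actual, expected in zip(weeks, project_effort, baseline_effort):
        deviation = (actual - expected) / expected
        flag = "OUT OF RANGE" if abs(deviation) > tolerance else "ok"
        print(f"week {week:2d}: actual {actual:4d}, expected {expected:4d}, "
              f"deviation {deviation:+.0%} {flag}")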
Achieving Agility and Stability in Large-Scale Software Development
2013-01-16
A temporary team is assigned to prepare layers and frameworks for future feature teams: a presentation layer, a domain layer, and a data access layer.
Warfighting Concepts to Future Weapon System Designs (WARCON)
2003-09-12
Recoverable fragments of this document reference software design documents, a material list, cost information, final engineering process maps, and a system requirements document (SRD) as artifacts of the engineering design and software development effort, and note the importance of establishing a standard, formal design document early in the development phase.
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Toward full life cycle control: Adding maintenance measurement to the SEL
NASA Technical Reports Server (NTRS)
Rombach, H. Dieter; Ulery, Bradford T.; Valett, Jon D.
1992-01-01
Organization-wide measurement of software products and processes is needed to establish full life cycle control over software products. The Software Engineering Laboratory (SEL)--a joint venture between NASA GSFC, the University of Maryland, and Computer Sciences Corporation--started measurement of software development more than 15 years ago. Recently, the measurement of maintenance was added to the scope of the SEL. In this article, the maintenance measurement program is presented as an addition to the already existing and well-established SEL development measurement program and evaluated in terms of its immediate benefits and long-term improvement potential. Immediate benefits of this program for the SEL include an increased understanding of the maintenance domain, the differences and commonalities between development and maintenance, and the cause-effect relationships between development and maintenance. Initial results from a sample maintenance study are presented to substantiate these benefits. The long-term potential of this program includes the use of maintenance baselines to better plan and manage future projects and to improve development and maintenance practices for future projects wherever warranted.
Software development environment, appendix F
NASA Technical Reports Server (NTRS)
Riddle, W. E.
1980-01-01
The current status in the area of software development environments is assessed. The purposes of environments, the types of environments, the constituents of an environment, the issue of environment integration, and the problems which must be solved in preparing an environment are discussed. Some general maxims to guide near-term future work are proposed.
Software development for airborne radar
NASA Astrophysics Data System (ADS)
Sundstrom, Ingvar G.
Some aspects of software development for a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases - system design, functional design, detailed design, function verification, and system verification - are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages and the use of a digital signal generator for performance verification are emphasized. Some future trends are discussed.
Proceedings of the Twenty-Fourth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
2000-01-01
On December 1 and 2, the Software Engineering Laboratory (SEL), a consortium composed of NASA/Goddard, the University of Maryland, and CSC, held the 24th Software Engineering Workshop (SEW), the last of the millennium. Approximately 240 people attended the 2-day workshop. Day 1 was composed of four sessions: International Influence of the Software Engineering Laboratory; Object Oriented Testing and Reading; Software Process Improvement; and Space Software. For the first session, three internationally known software process experts discussed the influence of the SEL with respect to software engineering research. In the Space Software session, prominent representatives from three different NASA sites - GSFC's Marti Szczur, the Jet Propulsion Laboratory's Rick Doyle, and the Ames Research Center IV&V Facility's Lou Blazy - discussed the future of space software in their respective centers. At the end of the first day, the SEW sponsored a reception at the GSFC Visitors' Center. Day 2 also provided four sessions: Using the Experience Factory; A panel discussion entitled "Software Past, Present, and Future: Views from Government, Industry, and Academia"; Inspections; and COTS. The day started with an excellent talk by CSC's Frank McGarry on "Attaining Level 5 in CMM Process Maturity." Session 2, the panel discussion on software, featured NASA Chief Information Officer Lee Holcomb (Government), our own Jerry Page (Industry), and Mike Evangelist of the National Science Foundation (Academia). Each presented his perspective on the most important developments in software in the past 10 years, in the present, and in the future.
Top 10 metrics for life science software good practices.
Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel
2016-01-01
Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices, we assess their feasibility for implementation and publish them here.
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Kacpura, Thomas J.; Johnson, Sandra K.; Lux, James P.
2010-01-01
NASA is developing an experimental flight payload (referred to as the Space Communication and Navigation (SCAN) Test Bed) to investigate software defined radio (SDR), networking, and navigation technologies, operationally in the space environment. The payload consists of three software defined radios, each compliant with NASA's Space Telecommunications Radio System Architecture, a common software interface description standard for software defined radios. The software defined radios are new technology developments underway by NASA and industry partners. Planned for launch in early 2012, the payload will be externally mounted to the International Space Station truss and conduct experiments representative of future mission capability.
Top 10 metrics for life science software good practices
2016-01-01
Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices, we assess their feasibility for implementation and publish them here. PMID:27635232
A new approach for instrument software at Gemini
NASA Astrophysics Data System (ADS)
Gillies, Kim; Nunez, Arturo; Dunn, Jennifer
2008-07-01
Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments, together with current hardware and software technology, to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach, which couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.
Resource Sharing of Micro-Software, or, What Ever Happened to All That CP/M Compatibility?
ERIC Educational Resources Information Center
DeYoung, Barbara
1984-01-01
Explores incompatible operating systems as the basic reason why software packages will not work on different microcomputers; defines operating system; explores compatibility issues surrounding the IBM MS-DOS; and presents two future trends in hardware and software developments which indicate a return to true compatibility. (Author/MBR)
Woynaroski, Tiffany; Oller, D Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-03-01
Theory and research suggest that vocal development predicts "useful speech" in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently "in development" and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. Autism Res 2017, 10: 508-519. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.
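The claim that the automated indices "continued to predict future spoken vocabulary even after controlling for concurrent vocabulary" corresponds to a partial correlation, which can be sketched by residualizing both variables on the covariate. The data and variable names below are synthetic and illustrative, not from the study.

    # Sketch of a partial correlation: residualize both the automated vocal index
    # and the future-vocabulary outcome on concurrent vocabulary, then correlate
    # the residuals. Synthetic data only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 40
    concurrent_vocab = rng.normal(50, 10, n)
    vocal_index = 0.5 * concurrent_vocab + rng.normal(0, 5, n)
    future_vocab = 0.6 * concurrent_vocab + 0.8 * vocal_index + rng.normal(0, 5, n)

    def residualize(y, x):
        slope, intercept = np.polyfit(x, y, 1)
        return y - (slope * x + intercept)

    r_partial = np.corrcoef(residualize(vocal_index, concurrent_vocab),
                            residualize(future_vocab, concurrent_vocab))[0, 1]
    print(f"partial correlation controlling for concurrent vocabulary: {r_partial:.2f}")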
ACRF Ingest Software Status: New, Current, and Future - April 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
AS Koontz; S Choudhury; BD Ermold
2008-04-01
The purpose of this report is to provide the status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement Program Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning with “xxx” indicate cases where ingests run at multiple ACRF sites, which results in a separate datastream for each location.
ACRF Ingest Software Status: New, Current, and Future (September 2007)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koontz, AS; Choudhury, S; Ermold, BD
2007-04-01
The purpose of this report is to provide the status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement Program Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning with “xxx” indicate cases where ingests run at multiple ACRF sites, which results in a separate datastream for each location.
ACRF Ingest Software Status: New, Current, and Future - May 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
AS Koontz; S Choudhury; BD Ermold
2008-05-01
The purpose of this report is to provide the status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement Program Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning with “xxx” indicate cases where ingests run at multiple ACRF sites, which results in a separate datastream for each location.
National meeting to review IPAD status and goals. [Integrated Programs for Aerospace-vehicle Design]
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1980-01-01
A joint NASA/industry project called Integrated Programs for Aerospace-vehicle Design (IPAD) is described, which has the goal of raising aerospace-industry productivity through the application of computers to integrate company-wide management of engineering data. Basically a general-purpose interactive computing system developed to support engineering design processes, the IPAD design is composed of three major software components: the executive, data management, and geometry and graphics software. Results of IPAD activities include a comprehensive description of a future representative aerospace vehicle design process and its interface to manufacturing, and requirements and preliminary design of a future IPAD software system to integrate engineering activities of an aerospace company having several products under simultaneous development.
NASA Technical Reports Server (NTRS)
Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren
1997-01-01
The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain-specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions.
Software architecture and engineering for patient records: current and future.
Weng, Chunhua; Levine, Betty A; Mun, Seong K
2009-05-01
During the "The National Forum on the Future of the Defense Health Information System," a track focusing on "Systems Architecture and Software Engineering" included eight presenters. These presenters identified three key areas of interest in this field, which include the need for open enterprise architecture and a federated database design, net centrality based on service-oriented architecture, and the need for focus on software usability and reusability. The eight panelists provided recommendations related to the suitability of service-oriented architecture and the enabling technologies of grid computing and Web 2.0 for building health services research centers and federated data warehouses to facilitate large-scale collaborative health care and research. Finally, they discussed the need to leverage industry best practices for software engineering to facilitate rapid software development, testing, and deployment.
Software Construction and Analysis Tools for Future Space Missions
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Clancy, Daniel (Technical Monitor)
2002-01-01
NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting these challenges: (1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification, and (2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.
xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina
Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.
xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit
Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina; ...
2017-03-01
Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.
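The community policies mentioned in the two xSDK abstracts above are, in essence, checkable conventions on package metadata: versioned releases, configurable symbol prefixes that avoid namespace collisions, pinned third-party dependencies, and so on. The Python sketch below illustrates only the general idea of auditing packages against such policies; the policy names and package records are invented for illustration and are not the actual xSDK policy set.

    # Illustrative only: a toy audit of package metadata against xSDK-style
    # community policies. Policy names and package records are hypothetical.
    PACKAGES = [
        {"name": "hypre",  "version": "2.11.1", "symbol_prefix": "HYPRE_", "pins_third_party": True},
        {"name": "toylib", "version": None,     "symbol_prefix": "",       "pins_third_party": False},
    ]

    POLICIES = {
        "has_versioned_release": lambda p: bool(p["version"]),
        "uses_symbol_prefix":    lambda p: bool(p["symbol_prefix"]),
        "pins_third_party_deps": lambda p: p["pins_third_party"],
    }

    def audit(packages, policies):
        for pkg in packages:
            failures = [name for name, check in policies.items() if not check(pkg)]
            status = "compatible" if not failures else "violates: " + ", ".join(failures)
            print(f"{pkg['name']:8s} {status}")

    audit(PACKAGES, POLICIES)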
Are Future Teachers Methodically Trained to Distinguish Good from Bad Educational Software?
ERIC Educational Resources Information Center
Pjanic, Karmelita; Hamzabegovic, Jasna
2016-01-01
In the era of information technology and general digitization of society, software of every kind is pervasive. However laudable the existence and development of educational software may be, it is important to consider its role, its quality, and whether it achieves the desired goal. In addition to programming experts it is…
Caesy: A software tool for computer-aided engineering
NASA Technical Reports Server (NTRS)
Wette, Matt
1993-01-01
A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.
NASA Technical Reports Server (NTRS)
Garcia, Janette
2016-01-01
The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing a Spaceport Command and Control System (SCCS) which will allow the launch of the Space Launch System (SLS). This paper focuses on the work performed by the author, as a remote application software developer, during the first and second parts of her internship. During the first part of the internship, the author worked on the SCCS software application layer by assisting multiple ground subsystems teams, including Launch Accessories (LACC) and Environmental Control System (ECS), with the design, development, integration, and testing of remote control software applications. During the second part of the internship, the author worked on the development of robot software at the Swamp Works Laboratory, a research and technology development group that focuses on inventing new technology to help future In-Situ Resource Utilization (ISRU) missions.
Current And Future Directions Of Lens Design Software
NASA Astrophysics Data System (ADS)
Gustafson, Darryl E.
1983-10-01
The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: Low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family. A rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.
Open source hardware and software platform for robotics and artificial intelligence applications
NASA Astrophysics Data System (ADS)
Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli
2016-02-01
Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV etc.) have enabled rapid development of previously expensive and sophisticated systems within a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system, with a graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish picking remote control car. The combination of the open source hardware and software systems created a flexible and expandable platform for future developments, both in software and hardware, in particular in combination with a graph database for artificial intelligence, as well as more sophisticated hardware, such as legged or humanoid robots.
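The remote-control link between host software and the Android-Arduino car described above is, at its core, a stream of simple commands sent over a serial or network connection. As a hedged sketch and not the authors' code, the Python fragment below shows how single-character drive commands might be sent to an Arduino over a serial port using the pyserial library; the port name, baud rate, and one-byte command protocol are assumptions.

    # Hedged sketch: sending simple drive commands to an Arduino-based car over
    # a serial link with pyserial. The port, baud rate, and one-byte command
    # protocol (F/B/L/R/S) are assumptions for illustration only.
    import time
    import serial  # pip install pyserial

    def drive(port="/dev/ttyUSB0", baud=9600):
        with serial.Serial(port, baud, timeout=1) as link:
            time.sleep(2)            # allow the Arduino to reset after the port opens
            for command in [b"F", b"L", b"F", b"S"]:   # forward, left, forward, stop
                link.write(command)
                time.sleep(0.5)      # crude pacing between commands

    if __name__ == "__main__":
        drive()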
Software development to support sensor control of robot arc welding
NASA Technical Reports Server (NTRS)
Silas, F. R., Jr.
1986-01-01
The development of software for a Digital Equipment Corporation MINC-23 Laboratory Computer to provide the functions of a workcell host computer for Space Shuttle Main Engine (SSME) robotic welding is documented. Routines were written to transfer robot programs between the MINC and an Advanced Robotic Cyro 750 welding robot. Other routines provide advanced program editing features, while additional software allows communication with a remote computer-aided design system. Access to special robot functions was provided to allow advanced control of weld seam tracking and process control for future development programs.
Software Development Standard Processes (SDSP)
NASA Technical Reports Server (NTRS)
Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.;
2011-01-01
A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.
NASA Technical Reports Server (NTRS)
Khambatta, Cyrus F.
2007-01-01
A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.
A knowledge based software engineering environment testbed
NASA Technical Reports Server (NTRS)
Gill, C.; Reedy, A.; Baker, L.
1985-01-01
The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.
ERIC Educational Resources Information Center
Lipson, Joseph I.; Fisher, Kathleen M.
1985-01-01
In the future, the requirements of industry will generate a wide range of hardware devices and software programs that will significantly alter and improve the quality of education. The driving forces behind the development of new technological devices include economics; emotional factors, e.g., the desire to develop aids for the handicapped;…
NASA Astrophysics Data System (ADS)
Darch, Peter T.; Sands, Ashley E.
2016-06-01
Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on reuse potential of software, and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
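Because WISE treats change activity as a key status indicator, the abstract above implies a small data model: issues move through states, and project metrics fall out of the recorded transitions. The Python sketch below is a minimal, hypothetical illustration of that idea (issue records with timestamped state changes, from which activity and status counts are derived); it is not the WISE implementation.

    # Minimal, hypothetical illustration of deriving change-activity metrics
    # from issue records, in the spirit of WISE; not the actual WISE system.
    from collections import Counter
    from dataclasses import dataclass, field

    @dataclass
    class Issue:
        ident: str
        state: str = "open"                      # e.g. open -> in_review -> closed
        history: list = field(default_factory=list)

        def transition(self, new_state, week):
            self.history.append((week, self.state, new_state))
            self.state = new_state

    issues = [Issue("CR-1"), Issue("CR-2"), Issue("CR-3")]
    issues[0].transition("closed", week=2)
    issues[1].transition("in_review", week=3)

    # Change activity per week and current status counts -- simple project-status metrics.
    activity_per_week = Counter(week for issue in issues for week, _, _ in issue.history)
    status_counts = Counter(issue.state for issue in issues)
    print(dict(activity_per_week), dict(status_counts))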
ERIC Educational Resources Information Center
Tofan, Daniel C.
2009-01-01
This paper describes an upper-level undergraduate and graduate-level course on computers in chemical education that was developed and offered for the first time in Fall 2007. The course provides future chemistry teachers with exposure to current software tools that can improve productivity in teaching, curriculum development, and education…
NASA Astrophysics Data System (ADS)
Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel
2013-09-01
Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used with the ESA PSS-05-0 Standard. Our outcomes may, in general, be used by teams who need to build small satellites, but, in particular, they will be used when we build the on-board software applications for the SATEX-II.
Detailed Design Documentation, without the Pain
NASA Astrophysics Data System (ADS)
Ramsay, C. D.; Parkes, S.
2004-06-01
Producing detailed forms of design documentation, such as pseudocode and structured flowcharts, to describe the procedures of a software system: (1) allows software developers to model and discuss their understanding of a problem and the design of a solution free from the syntax of a programming language, (2) facilitates deeper involvement of non-technical stakeholders, such as the customer or project managers, whose influence ensures the quality, correctness and timeliness of the resulting system, (3) forms comprehensive documentation of the system for its future maintenance, reuse and/or redeployment. However, such forms of documentation require effort to create and maintain. This paper describes a software tool which is currently being developed within the Space Systems Research Group at the University of Dundee which aims to improve the utility of, and the incentive for, creating detailed design documentation for the procedures of a software system. The rationale for creating such a tool is briefly discussed, followed by a description of the tool itself, a summary of its perceived benefits, and plans for future work.
Firing Room Remote Application Software Development
NASA Technical Reports Server (NTRS)
Liu, Kan
2014-01-01
The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of the Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories subsystem. In addition, a Conversion Fusion project was created to show specific approved checkout and launch engineering data for public-friendly display purposes.
Optimizing IV and V for Mature Organizations
NASA Technical Reports Server (NTRS)
Fuhman, Christopher
2003-01-01
NASA intends for its future software development agencies to have at least a Level 3 rating in the Carnegie Mellon University Capability Maturity Model (CMM). The CMM has built-in Verification and Validation (V&V) processes that support higher software quality. Independent Verification and Validation (IV&V) of software developed by mature agencies can therefore be more effective than for software developed by less mature organizations. How is independent V&V different with respect to the maturity of an organization? Knowing a priori the maturity of an organization's processes, how can IV&V planners better identify areas of need, choose IV&V activities, and so on? The objective of this research is to provide a complementary set of guidelines and criteria to assist the planning of IV&V activities on a project using a priori knowledge of the measurable levels of maturity of the organization developing the software.
A computationally efficient software application for calculating vibration from underground railways
NASA Astrophysics Data System (ADS)
Hussein, M. F. M.; Hunt, H. E. M.
2009-08-01
The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and on plans for future developments. The software calculates the power spectral density of vibration due to a moving train on floating-slab track, with track irregularity described by typical values of spectra for tracks in good, average and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven which calculates Green's functions for a multi-layered half-space.
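The central quantity reported by the PiP software is a power spectral density (PSD) of vibration. Purely as a generic illustration of that quantity, and not the PiP model itself (which computes the response of a tunnel in a layered half-space analytically), the Python sketch below estimates the PSD of a synthetic vibration-like signal with Welch's method; the sampling rate and tone frequency are invented.

    # Generic PSD illustration (not the PiP model): estimate the power spectral
    # density of a synthetic vibration-like signal using Welch's method.
    import numpy as np
    from scipy.signal import welch

    fs = 1000.0                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    # Synthetic signal: a 40 Hz tone (a notional dominant excitation) plus noise.
    signal = np.sin(2 * np.pi * 40 * t) + 0.5 * np.random.default_rng(0).standard_normal(t.size)

    freqs, psd = welch(signal, fs=fs, nperseg=2048)
    peak = freqs[np.argmax(psd)]
    print(f"PSD peak near {peak:.1f} Hz")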
Software reliability models for critical applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pham, H.; Pham, M.
This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
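The goal/question/metric (GQM) paradigm at the heart of TAME is easy to make concrete: each goal is refined into questions, and each question is answered by one or more metrics. The following Python sketch is a hypothetical, minimal rendering of such a GQM template, included only to make the paradigm's structure explicit; the goal, questions, and metrics are invented and not taken from the TAME system.

    # Hypothetical, minimal rendering of a goal/question/metric (GQM) template;
    # the goal, questions, and metrics below are invented examples.
    gqm = {
        "goal": "Improve reliability of flight dynamics software from the developer's viewpoint",
        "questions": [
            {
                "question": "What is the current fault density?",
                "metrics": ["faults per KLOC", "faults found per test hour"],
            },
            {
                "question": "Which components contribute most of the faults?",
                "metrics": ["faults per component", "component size and change history"],
            },
        ],
    }

    def flatten(plan):
        """List (goal, question, metric) triples for data-collection planning."""
        return [(plan["goal"], q["question"], m)
                for q in plan["questions"] for m in q["metrics"]]

    for goal, question, metric in flatten(gqm):
        print(f"- collect '{metric}' to answer '{question}'")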
Validation of a Quality Management Metric
2000-09-01
A quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM, applying the QMM scores to provide feedback
Software Engineering Laboratory Ada performance study: Results and implications
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1992-01-01
The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environments; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.
NASA Technical Reports Server (NTRS)
Logan, Cory; Maida, James; Goldsby, Michael; Clark, Jim; Wu, Liew; Prenger, Henk
1993-01-01
The Space Station Freedom (SSF) Data Management System (DMS) consists of distributed hardware and software which monitor and control the many onboard systems. Virtual environment and off-the-shelf computer technologies can be used at critical points in project development to aid in objectives and requirements development. Geometric models (images) coupled with off-the-shelf hardware and software technologies were used in the Space Station Mockup and Trainer Facility (SSMTF) Crew Operational Assessment Project. Rapid prototyping is shown to be a valuable tool for developing operational procedures and system hardware and software requirements. The project objectives, hardware and software technologies used, data gained, current activities, and future development and training objectives are discussed. The importance of defining prototyping objectives and staying focused while maintaining schedules is discussed, along with project pitfalls.
ERIC Educational Resources Information Center
Bennett, Hugh
1993-01-01
Describes Photo CD, a procedure developed by Eastman Kodak for storing high-resolution 35mm film images on compact discs, and explains Macintosh microcomputer-based hardware and software that can be used with it. Software for viewing as well as editing and altering images is described, and future products are discussed. (four references) (LRW)
ERIC Educational Resources Information Center
1972
Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…
Continuation of research into language concepts for the mission support environment
NASA Technical Reports Server (NTRS)
1991-01-01
A concept for a more intuitive and graphically based Computation (Comp) Builder was developed. The Graphical Comp Builder Prototype was developed, which is an X Window based graphical tool that allows the user to build Comps using graphical symbols. Investigation was conducted to determine the availability and suitability of the Ada programming language for the development of future control center type software. The Space Station Freedom Project identified Ada as the desirable programming language for the development of Space Station Control Center software systems.
2012-09-28
Spectral-geotechnical libraries and models were developed during remote sensing and calibration/validation campaigns conducted by NRL and collaborating institutions in four... (2010; Bachmann, Fry, et al., 2012a). The NRL HITT tool is a model for how we develop and validate software, and the future development of tools by...
NASA Technical Reports Server (NTRS)
Khan, Gufran Sayeed; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
The presentation includes grazing incidence X-ray optics, motivation and challenges, mid-spatial-frequency generation in cylindrical polishing, design considerations for the polishing lap, simulation studies and experimental results, future scope, and a summary. Topics include the current status of replication optics technology, the cylindrical polishing process using a large polishing lap, non-conformance of the polishing lap to the optics, development of software and the polishing machine, deterministic prediction of polishing, a polishing experiment under optimum conditions, and a polishing experiment based on a known error profile. Future plans include determination of non-uniformity in the polishing lap compliance, development of a polishing sequence based on a known error profile of the specimen, software for generating a mandrel polishing sequence, design and development of a flexible polishing lap, and a computer-controlled localized polishing process.
Marshall Space Flight Center Ground Systems Development and Integration
NASA Technical Reports Server (NTRS)
Wade, Gina
2016-01-01
Ground Systems Development and Integration performs a variety of tasks in support of the Mission Operations Laboratory (MOL) and other Center and Agency projects. These tasks include systems engineering processes such as system requirements development, system architecture design, integration, verification and validation, software development, and sustaining engineering of mission operations systems; this work has evolved the Huntsville Operations Support Center (HOSC) into a leader in remote operations for current and future NASA space projects. The group is also responsible for developing and managing telemetry and command configuration and calibration databases. Personnel are responsible for maintaining and enhancing their disciplinary skills in the areas of project management, software engineering, software development, software process improvement, telecommunications, networking, and systems management. Domain expertise in the ground systems area is also maintained and includes detailed proficiency in the areas of real-time telemetry systems, command systems, voice, video, data networks, and mission planning systems.
PNNL Future Power Grid Initiative-developed GridOPTICS Software System (GOSS)
None
2018-01-16
The power grid is changing and evolving. One aspect of this change is the growing use of smart meters and other devices, which are producing large volumes of useful data. However, in many cases, the data can't be translated quickly into actionable guidance to improve grid performance. There's a need for innovative tools. The GridOPTICS(TM) Software System, or GOSS, developed through PNNL's Future Power Grid Initiative, is open source and became publicly available in spring 2014. The value of this middleware is that it easily integrates grid applications with sources of data and facilitates communication between them. Such a capability provides a foundation for developing a range of applications to improve grid management.
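GOSS's value, as described above, is as middleware that lets grid applications exchange data with data sources without knowing about each other directly. The Python sketch below shows the underlying publish/subscribe idea in a few lines; it illustrates the pattern only, with invented topic names and payloads, and is not the GOSS API.

    # Illustration of the publish/subscribe pattern behind data-integration
    # middleware such as GOSS; topic names and payloads are invented, and this
    # is not the GOSS API.
    from collections import defaultdict

    class MessageBus:
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, handler):
            self._subscribers[topic].append(handler)

        def publish(self, topic, message):
            for handler in self._subscribers[topic]:
                handler(message)

    bus = MessageBus()
    # A grid application subscribes to meter readings without knowing their source.
    bus.subscribe("meter/readings", lambda m: print("app received:", m))
    # A data source publishes a reading without knowing which applications consume it.
    bus.publish("meter/readings", {"meter_id": "m-42", "kwh": 1.7})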
Operations analysis (study 2.1): Shuttle upper stage software requirements
NASA Technical Reports Server (NTRS)
Wolfe, R. R.
1974-01-01
An investigation of software costs related to Space Shuttle upper stage operations, with emphasis on the additional costs attributable to space servicing, was conducted. The questions and problem areas include the following: (1) the key parameters involved with software costs; (2) historical data for extrapolation of future costs; (3) elements of the basic software development effort that are applicable to servicing functions; (4) the effect of multiple servicing on the complexity of the operation; and (5) whether recurring software costs are significant. The results address these questions and provide a foundation for estimating software costs based on the costs of similar programs and a series of empirical factors.
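The estimation approach outlined above, costing a new software effort from the costs of similar programs adjusted by empirical factors, can be written down in a few lines. The Python sketch below is a generic, hypothetical illustration of that analogy-plus-factors style of estimate; the baseline cost, factor names, and values are invented and are not the study's figures.

    # Hypothetical analogy-based cost estimate: start from the cost of a similar
    # past program and scale it by empirical adjustment factors. All numbers and
    # factor names are invented for illustration.
    baseline_cost = 4.0e6            # cost of a comparable prior upper-stage software effort ($)
    size_ratio = 1.3                 # new effort is ~30% larger in scope

    adjustment_factors = {
        "space_servicing_functions": 1.25,   # extra complexity attributable to servicing
        "reuse_of_existing_modules": 0.85,   # credit for applicable existing software
        "multiple_servicing_missions": 1.10, # added operational complexity
    }

    estimate = baseline_cost * size_ratio
    for name, factor in adjustment_factors.items():
        estimate *= factor

    print(f"Estimated development cost: ${estimate:,.0f}")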
Impact of Growing Business on Software Processes
NASA Astrophysics Data System (ADS)
Nikitina, Natalja; Kajko-Mattsson, Mira
When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for starting to think about and improve software processes, the organization lacked guidelines to aid it in aligning process improvement with business growth. Finally, the paper generates research questions providing a platform for future research.
Evolution of the phase 2 preparation and observation tools at ESO
NASA Astrophysics Data System (ADS)
Dorigo, D.; Amarand, B.; Bierwirth, T.; Jung, Y.; Santos, P.; Sogni, F.; Vera, I.
2012-09-01
Throughout the course of many years of observations at the VLT, the phase 2 software applications supporting the specification, execution and reporting of observations have been continuously improved and refined. Specifically the introduction of astronomical surveys propelled the creation of new tools to express more sophisticated, longer-term observing strategies often consisting of several hundreds of observations. During the execution phase, such survey programs compete with other service and visitor mode observations and a number of constraints have to be considered. In order to maximize telescope utilization and execute all programs in a fair way, new algorithms have been developed to prioritize observable OBs taking into account both current and future constraints (e.g. OB time constraints, technical telescope time) and suggest the next OB to be executed. As a side effect, a higher degree of observation automation enables operators to run telescopes mostly autonomously with little supervision by a support astronomer. We describe the new tools that have been deployed and the iterative and incremental software development process applied to develop them. We present our key software technologies used so far and discuss potential future evolution both in terms of features as well as software technologies.
SCA Waveform Development for Space Telemetry
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Kifle, Multi; Hall, C. Steve; Quinn, Todd M.
2004-01-01
The NASA Glenn Research Center is investigating and developing suitable reconfigurable radio architectures for future NASA missions. This effort is examining software-based open-architectures for space based transceivers, as well as common hardware platform architectures. The Joint Tactical Radio System's (JTRS) Software Communications Architecture (SCA) is a candidate for the software approach, but may need modifications or adaptations for use in space. An in-house SCA compliant waveform development focuses on increasing understanding of software defined radio architectures and more specifically the JTRS SCA. Space requirements put a premium on size, mass, and power. This waveform development effort is key to evaluating tradeoffs with the SCA for space applications. Existing NASA telemetry links, as well as Space Exploration Initiative scenarios, are the basis for defining the waveform requirements. Modeling and simulations are being developed to determine signal processing requirements associated with a waveform and a mission-specific computational burden. Implementation of the waveform on a laboratory software defined radio platform is proceeding in an iterative fashion. Parallel top-down and bottom-up design approaches are employed.
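One of the modeling outputs mentioned above is the computational burden a waveform imposes on the radio platform. A back-of-the-envelope version of that calculation is sketched below in Python; the waveform parameters and overhead factor are invented for illustration and do not describe an actual NASA telemetry waveform.

    # Back-of-the-envelope computational-burden estimate for a digital waveform.
    # All parameters below are invented for illustration.
    sample_rate_hz = 5.0e6        # complex samples per second into the modem
    fir_taps = 64                 # pulse-shaping filter length
    ops_per_tap = 2               # one multiply and one add per tap (real-valued view)

    filter_mops = sample_rate_hz * fir_taps * ops_per_tap / 1e6
    overhead_factor = 1.5         # framing, coding, control overhead (assumed)
    total_mops = filter_mops * overhead_factor

    print(f"Pulse-shaping filter: {filter_mops:.0f} MOPS; "
          f"rough waveform total: {total_mops:.0f} MOPS")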
Lessons Learned in the First Year Operating Software Defined Radios in Space
NASA Technical Reports Server (NTRS)
Chelmins, David; Mortensen, Dale; Shalkhauser, Mary Jo; Johnson, Sandra K.; Reinhart, Richard
2014-01-01
Operating three unique software defined radios (SDRs) in a space environment aboard the Space Communications and Navigation (SCaN) Testbed for over one year has provided an opportunity to gather knowledge useful for future missions considering using software defined radios. This paper provides recommendations for the development and use of SDRs, and it considers the details of each SDR's approach to software upgrades and operation. After one year, the SCaN Testbed SDRs have operated for over 1000 hours. During this time, the waveforms launched with the SDR were tested on-orbit to assure that they operated in space at the same performance level as on the ground prior to launch to obtain an initial on-orbit performance baseline. A new waveform for each SDR has been developed, implemented, uploaded to the flight system, and tested in the flight environment. Recommendations for SDR-based missions have been gathered from early development through operations. These recommendations will aid future missions to reduce the cost, schedule, and risk of operating SDRs in a space environment. This paper considers the lessons learned as they apply to SDR pre-launch checkout, purchasing space-rated hardware, flexibility in command and telemetry methods, on-orbit diagnostics, use of engineering models to aid future development, and third-party software. Each SDR implements the SCaN Testbed flight computer command and telemetry interface uniquely, allowing comparisons to be drawn. The paper discusses the lessons learned from these three unique implementations, with suggestions on the preferred approach. Also, results are presented showing that it is important to have full system performance knowledge prior to launch to establish better performance baselines in space, requiring additional test applications to be developed pre-launch. Finally, the paper presents the issues encountered with the operation and implementation of new waveforms on each SDR and proposes recommendations to avoid these issues.
S-Cube: Enabling the Next Generation of Software Services
NASA Astrophysics Data System (ADS)
Metzger, Andreas; Pohl, Klaus
The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories as well as methods and tools for future service-based systems.
Advances in knowledge-based software engineering
NASA Technical Reports Server (NTRS)
Truszkowski, Walt
1991-01-01
The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.
SCaN Testbed Software Development and Lessons Learned
NASA Technical Reports Server (NTRS)
Kacpura, Thomas J.; Varga, Denise M.
2012-01-01
National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from legacy, fixed function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not a previous option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. This requires extensive software to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both the command of the payload and the display of data created by the payload. The verification of the software was an extensive effort. The challenges of specifying a suitable test matrix with reconfigurable systems that offer numerous configurations are highlighted. Since the flight system testing requires methodical, controlled testing that limits risk, a nearly identical ground system to the on-orbit flight system was required to develop the software and write verification procedures before it was installed and tested on the flight system. The development of the SCaN Testbed was an accelerated effort to meet launch constraints, and this paper discusses tradeoffs made to balance needed software functionality and still maintain the schedule. Future upgrades are discussed that optimize the avionics and allow experimenters to utilize the SCaN Testbed potential.
NASA Technical Reports Server (NTRS)
Fordyce, Jess
1996-01-01
Work carried out to re-engineer the mission analysis segment of JPL's mission planning ground system architecture is reported on. The aim is to transform the existing software tools, originally developed for specific missions on different support environments, into an integrated, general purpose, multi-mission tool set. The issues considered are: the development of a partnership between software developers and users; the definition of key mission analysis functions; the development of a consensus based architecture; the move towards evolutionary change instead of revolutionary replacement; software reusability, and the minimization of future maintenance costs. The current status and aims of new developments are discussed and specific examples of cost savings and improved productivity are presented.
Mark J. Twery; Peter D. Knopp; Scott A. Thomasma; Donald E. Nute
2011-01-01
This is the user's guide for NED-2, which is the latest version of NED, a forest ecosystem management decision support system. This software is part of a family of software products intended to help resource managers develop goals, assess current and future conditions, and produce sustainable management plans for forest properties. Designed for stand-alone Windows...
Mark J. Twery; Peter D. Knopp; Scott A. Thomasma; Donald E. Nute
2012-01-01
This is the reference guide for NED-2, which is the latest version of NED, a forest ecosystem management decision support system. This software is part of a family of software products intended to help resource managers develop goals, assess current and future conditions, and produce sustainable management plans for forest properties. Designed for stand-alone Windows-...
A Two-Century-Old Vision for the Future.
ERIC Educational Resources Information Center
Fuchs, Ira H.
1988-01-01
Discusses the necessity of acquiring and developing technological advances for use in the classroom to provide a vision for the future. Topics discussed include microcomputers; workstations; software; networks; cooperative endeavors in industry and academia; artificial intelligence; and the necessity for financial support. (LRW)
NASA Astrophysics Data System (ADS)
Drachova-Strang, Svetlana V.
As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for reasoning about software correctness. This dissertation presents a systematic approach to both introducing these reasoning skills into the curriculum, and assessing how well the students have learned them. Specifically, it introduces a comprehensive Reasoning Concept Inventory (RCI) that captures the fine details of basic reasoning skills that are ideally learned across the undergraduate curriculum to reason about software correctness, to develop high quality software, and to understand why software works as specified. The RCI forms the basis for developing learning outcomes that help educators to assess the adequacy of current techniques and pinpoint necessary improvements. This dissertation contains results from experimentation and assessment over the past few years in multiple CS courses. The results show that the finer principles of mathematical reasoning of software correctness can be taught effectively and continuously improved with the help of the RCI using suitable teaching practices, and supporting methods and tools.
Near-infrared face recognition utilizing OpenCV software
NASA Astrophysics Data System (ADS)
Sellami, Louiza; Ngo, Hau; Fowler, Chris J.; Kearney, Liam M.
2014-06-01
Commercially available hardware, freely available algorithms, and software developed by the authors are successfully combined to detect and recognize subjects in an environment without visible light. This project integrates three major components: an illumination device operating in the near-infrared (NIR) spectrum, a NIR-capable camera, and a software algorithm capable of performing image manipulation, facial detection, and recognition. Focusing our efforts in the near-infrared spectrum allows the low-budget system to operate covertly while still allowing for accurate face recognition. In doing so, a valuable capability has been developed that offers potential benefits in future civilian and military security and surveillance operations.
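The abstract does not name the specific OpenCV algorithms used; as an illustration only, the sketch below shows one common OpenCV pipeline in Python (Haar-cascade detection plus an LBPH recognizer from the opencv-contrib package) applied to grayscale frames such as those a NIR camera would produce. All names and parameters here are assumptions, not the authors' code.

```python
# Illustrative only: generic OpenCV face detection + LBPH recognition on
# grayscale (e.g., near-infrared) frames; the cited project may differ.
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
recognizer = cv2.face.LBPHFaceRecognizer_create()  # needs opencv-contrib-python

def train(face_crops, labels):
    # face_crops: list of grayscale face images; labels: matching integer IDs
    recognizer.train(face_crops, np.array(labels))

def identify(gray_frame):
    # Detect faces, then ask the recognizer for the best-matching label.
    faces = detector.detectMultiScale(gray_frame, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        label, distance = recognizer.predict(gray_frame[y:y + h, x:x + w])
        results.append((label, distance, (x, y, w, h)))
    return results
```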
A Unique Software System For Simulation-to-Flight Research
NASA Technical Reports Server (NTRS)
Chung, Victoria I.; Hutchinson, Brian K.
2001-01-01
"Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.
NASA Technical Reports Server (NTRS)
Pena, Joaquin; Hinchey, Michael G.; Ruiz-Cortes, Antonio
2006-01-01
The field of Software Product Lines (SPL) emphasizes building a core architecture for a family of software products from which concrete products can be derived rapidly. This helps to reduce time-to-market, costs, etc., and can result in improved software quality and safety. Current AOSE methodologies are concerned with developing a single Multiagent System. We propose an initial approach to developing the core architecture of a Multiagent Systems Product Line (MAS-PL), exemplifying our approach with reference to a concept NASA mission based on multiagent technology.
Space Communication and Navigation Testbed Communications Technology for Exploration
NASA Technical Reports Server (NTRS)
Reinhart, Richard
2013-01-01
NASA developed and launched an experimental flight payload (referred to as the Space Communication and Navigation Test Bed) to investigate software defined radio, networking, and navigation technologies operationally in the space environment. The payload consists of three software defined radios, each compliant with NASA's Space Telecommunications Radio System Architecture, a common software interface description standard for software defined radios. The software defined radios are new technology developed by NASA and industry partners. The payload is externally mounted to the International Space Station truss and available to NASA, industry, and university partners to conduct experiments representative of future mission capability. Experiment operations include in-flight reconfiguration of the SDR waveform functions and payload networking software. The flight system communicates with NASA's orbiting satellite relay network, the Tracking and Data Relay Satellite System, at both S-band and Ka-band, and with any Earth-based compatible S-band ground station.
NASA Astrophysics Data System (ADS)
van Gend, Carel; Lombaard, Briehan; Sickafoose, Amanda; Whittal, Hamish
2016-07-01
Until recently, software for instruments on the smaller telescopes at the South African Astronomical Observatory (SAAO) has not been designed for remote accessibility and frequently has not been developed using modern software best practice. We describe a software architecture we have implemented for use with new and upgraded instruments at the SAAO. The architecture was designed to allow for multiple components; to be fast, reliable, and remotely operable; to support different user interfaces; to employ as much non-proprietary software as possible; and to take future-proofing into consideration. Individual component drivers exist as standalone processes, communicating over a network. A controller layer coordinates the various components and allows a variety of user interfaces to be used. The Sutherland High-speed Optical Cameras (SHOC) instruments incorporate an Andor electron-multiplying CCD camera, a GPS unit for accurate timing, and a pair of filter wheels. We have applied the new architecture to the SHOC instruments, with the camera driver developed using Andor's software development kit. We have used this to develop an innovative web-based user interface to the instrument.
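The abstract describes the pattern (standalone component drivers reachable over the network, coordinated by a controller layer) without giving its protocol; the hypothetical Python sketch below illustrates that pattern with a toy filter-wheel driver exposed as a line-based TCP service and a controller helper that sends it commands. Ports, commands, and names are invented for illustration.

```python
# Hypothetical sketch of the pattern described above: a standalone component
# driver exposed over TCP, with a controller layer talking to it. The real SAAO
# protocol, commands, and ports are not specified in the abstract.
import socket
import socketserver

class FilterWheelDriver(socketserver.StreamRequestHandler):
    position = 0  # toy shared state for the sketch

    def handle(self):
        for raw in self.rfile:                       # one text command per line
            cmd = raw.decode().strip().split()
            if cmd[:1] == ["MOVE"]:
                FilterWheelDriver.position = int(cmd[1])
                self.wfile.write(b"OK\n")
            elif cmd[:1] == ["STATUS"]:
                self.wfile.write(f"POS {FilterWheelDriver.position}\n".encode())
            else:
                self.wfile.write(b"ERR unknown command\n")

def controller_send(host, port, command):
    # Controller layer: coordinate components by sending them line commands.
    with socket.create_connection((host, port)) as sock:
        sock.sendall((command + "\n").encode())
        return sock.makefile().readline().strip()

if __name__ == "__main__":
    with socketserver.TCPServer(("127.0.0.1", 5005), FilterWheelDriver) as srv:
        srv.serve_forever()
```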
UWB Tracking Software Development
NASA Technical Reports Server (NTRS)
Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda
2006-01-01
An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphics user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.
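The AOA algorithm itself is not detailed in the abstract; as a hedged stand-in that mirrors only the stated input-to-output role of the software (measurements from two clusters in, a target position out), the sketch below intersects two bearing lines from clusters at known 2-D positions. It is written in Python for illustration, not in the project's Matlab/C++.

```python
# Illustrative stand-in for the algorithm core: given one angle of arrival per
# cluster (measured from the +x axis) and known 2-D cluster positions, intersect
# the two bearing lines to estimate the target position. The real UWB tracking
# algorithm is more involved; this only mirrors the input/output contract.
import numpy as np

def aoa_fix(p1, theta1, p2, theta2):
    """p1, p2: (x, y) cluster positions; theta1, theta2: AOA in radians."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the scalar range parameters t1, t2.
    a = np.column_stack((d1, -d2))
    b = np.asarray(p2, float) - np.asarray(p1, float)
    t1, _ = np.linalg.solve(a, b)
    return np.asarray(p1, float) + t1 * d1

# Example: clusters 4 m apart, target seen at 60 deg and 120 deg respectively.
print(aoa_fix((0, 0), np.radians(60), (4, 0), np.radians(120)))  # ~ (2.0, 3.46)
```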
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
Using Combined SFTA and SFMECA Techniques for Space Critical Software
NASA Astrophysics Data System (ADS)
Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.
2012-01-01
This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space-critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability and for future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such an approach was assessed on the system software specification through a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to the inclusion of new functional and non-functional system software requirements.
The Future of Library Automation in Schools.
ERIC Educational Resources Information Center
Anderson, Elaine
2000-01-01
Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiative to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…
New technologies for supporting real-time on-board software development
NASA Astrophysics Data System (ADS)
Kerridge, D.
1995-03-01
The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly support the process of hard real-time analysis for software targeting the MA31750 processor. With further development, HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.
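The paper does not spell out which hard real-time analysis the HESSE tools implement; one standard method of that kind is fixed-priority response-time analysis, sketched below under the usual assumptions (periodic tasks in priority order, deadlines equal to periods, known worst-case execution times).

```python
# Minimal fixed-priority response-time analysis (a standard hard real-time
# schedulability test; not necessarily the exact method used by HESSE).
# Each task is (worst-case execution time C, period T), listed from highest to
# lowest priority. A task set is schedulable if every response time R_i <= T_i.
from math import ceil

def response_times(tasks):
    results = []
    for i, (c_i, t_i) in enumerate(tasks):
        r = c_i
        while True:
            r_next = c_i + sum(ceil(r / t_j) * c_j for c_j, t_j in tasks[:i])
            if r_next > t_i:
                r_next = None        # deadline (taken equal to period) missed
                break
            if r_next == r:          # fixed point reached
                break
            r = r_next
        results.append(r_next)
    return results

# Example: three tasks (C, T) in priority order.
print(response_times([(1, 4), (2, 6), (3, 12)]))  # -> [1, 3, 10]
```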
Advanced Software Development Workstation Project
NASA Technical Reports Server (NTRS)
Lee, Daniel
1989-01-01
The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.
2002-09-01
seconds per minute that the runtime environment was up and running. Defect Categories. The labels of the 5 defect categories. 78 Cosmetic Defects...The name that corresponds to QSM’s cosmetic defects. Cosmetic defects can be described as deferred, such as errors in format of displays or...2002. [Fent00] Fenton, N. E. and Neil, M. Software Metrics: Roadmap. Proceedings of the Conference on the Future of Software Engineering, 2000, pp
CrossTalk, The Journal of Defense Software Engineering. Volume 27, Number 3. May/June 2014
2014-06-01
field of software engineering. by Delores M. Etter, Jennifer Webb, and John Howard The Problem of Prolific Process What is the optimal amount and...Programming Will Never Be Obsolete The creativity of software developers will always be needed to solve problems of the future and to then translate those...utilized to address some of the complex problems associated with biometric database construction. 1. A Next Generation Multispectral Iris Biometric
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
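PD5's own classes are not reproduced here; purely to indicate the kind of building block such a library bundles, the sketch below computes two textbook primer properties (GC content and the Wallace-rule melting temperature estimate) in Python rather than the library's C++.

```python
# Illustration only: two textbook primer checks of the sort a primer-design
# library bundles. This does not use or reproduce the PD5 C++ API.
def gc_content(primer: str) -> float:
    p = primer.upper()
    return 100.0 * sum(p.count(b) for b in "GC") / len(p)

def wallace_tm(primer: str) -> float:
    # Wallace rule (for short oligos): Tm = 2*(A+T) + 4*(G+C) degrees C.
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

print(gc_content("ATGCGCGTATTACG"), wallace_tm("ATGCGCGTATTACG"))  # 50.0 42
```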
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
From Exotic to Mainstream: A 10-year Odyssey from Internet Speed to Boundary Spanning with Scrum
NASA Astrophysics Data System (ADS)
Baskerville, Richard; Pries-Heje, Jan; Madsen, Sabine
Based on four empirical studies conducted over a 10-year time period from 1999 to 2008, we investigate how local software processes interact with global changes in the software development context. In 1999 companies were developing software at high speed in a desperate rush to be first-to-market. In 2001 a new high speed/quick results development process had become established practice. In 2003 changes in the market created the need for a more balanced view on speed and quality, and in 2008 companies were successfully combining agile and plan-driven approaches to achieve the benefits of both. The studies reveal a two-stage pattern in which dramatic changes in the market cause disruption of established practices, experimentation, and process adaptations, followed by consolidation of lessons learnt into a new (and once again mature) software development process. Limitations, implications, and areas for future research are discussed.
A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH
Sadasivam, Rajani S.; Tanik, Murat M.
2013-01-01
Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436
A meta-composite software development approach for translational research.
Sadasivam, Rajani S; Tanik, Murat M
2013-06-01
Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.
Application of Real Options Theory to DoD Software Acquisitions
2009-02-20
Future Combat Systems Program. Washington, DC. U.S. Government Printing Office. Damodaran, A. (2007). Investment Valuation: The Options To Expand... valuation methodology, when enhanced and properly formulated around a proposed or existing software investment employing the spiral development approach... The traditional real options valuation methodology, when enhanced and properly formulated
Microcomputers and the future of epidemiology.
Dean, A G
1994-01-01
The Workshop on Microcomputers and the Future of Epidemiology was held March 8-9, 1993, at the Turner Conference Center, Atlanta, GA, with 130 public health professionals participating. The purpose of the workshop was to define microcomputer needs in epidemiology and to propose future initiatives. Thirteen groups representing public health disciplines defined their needs for better and more useful data, development of computer technology appropriate to epidemiology, user support and human infrastructure development, and global communication and planning. Initiatives proposed were demonstration of health surveillance systems, new software and hardware, computer-based training, projects to establish or improve data bases and community access to data bases, improved international communication, conferences on microcomputer use in particular disciplines, a suggestion to encourage competition in the production of public-domain software, and long-range global planning for epidemiologic computing and data management. Other interested groups are urged to study, modify, and implement those ideas. PMID:7910692
Characterizing the scientific potential of satellite sensors. [San Francisco, California
NASA Technical Reports Server (NTRS)
1984-01-01
Eleven thematic mapper (TM) radiometric calibration programs were tested and evaluated in support of the task to characterize the potential of LANDSAT TM digital imagery for scientific investigations in the Earth sciences and terrestrial physics. Three software errors related to integer overflow, division by zero, and a nonexistent file group were found and corrected. Raw, calibrated, and corrected image groups that were created and stored on the Barker2 disk are enumerated. Black and white pixel print files were created for various subscenes of a San Francisco scene (ID 40392-18152). The development of linear regression software is discussed. The output of the software and its function are described. Future work in TM radiometric calibration, image processing, and software development is outlined.
Report on Automated Semantic Analysis of Scientific and Engineering Codes
NASA Technical Reports Server (NTRS)
Stewart, Mark E. M.; Follen, Greg (Technical Monitor)
2001-01-01
The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development, the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors, like the MCO error, are to be quickly and inexpensively avoided in the future. This report evaluates the progress made with this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.
Deuterostome Evolution: Large Data Set Analysis
NASA Technical Reports Server (NTRS)
Janies, Daniel; Wheeler, Ward
2004-01-01
This award allowed us to develop novel hardware for phylogenetics, collect genomic data and produce several phylogenies of deuterostome organisms, communicate the results publicly, release software into the public domain, publish textbooks and papers, and prepare for the next research projects. There are no resulting subject inventions to report. We review these activities in three sections: 1) Hardware and software development; 2) Evolutionary biology research; 3) Our proposed future direction, predictive analysis of pathogens in support of the NASA mission.
Chemical calculations on Cray computers
NASA Technical Reports Server (NTRS)
Taylor, Peter R.; Bauschlicher, Charles W., Jr.; Schwenke, David W.
1989-01-01
The influence of recent developments in supercomputing on computational chemistry is discussed with particular reference to Cray computers and their pipelined vector/limited parallel architectures. After reviewing Cray hardware and software, the performance of different elementary program structures is examined, and effective methods for improving program performance are outlined. The computational strategies appropriate for obtaining optimum performance in applications to quantum chemistry and dynamics are discussed. Finally, some discussion is given of new developments and future hardware and software improvements.
Open Source software and social networks: disruptive alternatives for medical imaging.
Ratib, Osman; Rosset, Antoine; Heuberger, Joris
2011-05-01
In recent decades, several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture, providing new perspectives and an innovative approach to a traditionally conservative medical community. Disruptive technologies such as the World Wide Web, wireless networking, Open Source software, and the recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. This paper reviews the impact and potential benefits of two major trends in consumer-market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost-effective alternative to traditional commercial software developments, and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Written by developers who are themselves part of the user community, and developed and tested by a large number of contributing users, these tools are usually better adapted to users' needs and more robust than traditional software programs. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed social behavior and habits, adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks that allow groups of people to easily communicate and exchange information are a new model that is particularly suitable for specific groups of healthcare professionals and for physicians. This has also changed the expectations of how patients wish to communicate with their physicians. Emerging disruptive technologies and innovative paradigms such as Open Source software are leading the way to a new generation of information systems that will slowly change the way physicians, healthcare providers, and patients interact and communicate in the future. The impact of these new technologies is particularly effective in image communication, PACS, and teleradiology.
Rendezvous Integration Complexities of NASA Human Flight Vehicles
NASA Technical Reports Server (NTRS)
Brazzel, Jack P.; Goodman, John L.
2009-01-01
Propellant-optimal trajectories, relative sensors and navigation, and docking/capture mechanisms are rendezvous disciplines that receive much attention in the technical literature. However, other areas must be considered. These include absolute navigation, maneuver targeting, attitude control, power generation, software development and verification, redundancy management, thermal control, avionics integration, robotics, communications, lighting, human factors, crew timeline, procedure development, orbital debris risk mitigation, structures, plume impingement, logistics, and in some cases extravehicular activity. While current and future spaceflight programs will introduce new technologies and operations concepts, the complexity of integrating multiple systems on multiple spacecraft will remain. The systems integration task may become more difficult as increasingly complex software is used to meet current and future automation, autonomy, and robotic operation requirements.
Reference software implementation for GIFTS ground data processing
NASA Astrophysics Data System (ADS)
Garcia, R. K.; Howell, H. B.; Knuteson, R. O.; Martin, G. D.; Olson, E. R.; Smuga-Otto, M. J.
2006-08-01
Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation, and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nicholas R.; Pointer, William David; Sieger, Matt
2016-04-01
The goal of this review is to enable application of codes or software packages for safety assessment of advanced sodium-cooled fast reactor (SFR) designs. To address near-term programmatic needs, the authors have focused on two objectives. First, the authors have identified requirements for software QA that must be satisfied to enable the application of software to future safety analyses. Second, the authors have collected best practices applied by other code development teams to minimize cost and time of initial code qualification activities and to recommend a path to the stated goal.
NASA Astrophysics Data System (ADS)
Ye, Jinzuo; Chi, Chongwei; Zhang, Shuang; Ma, Xibo; Tian, Jie
2014-02-01
Sentinel lymph node (SLN) in vivo detection is vital in breast cancer surgery. New near-infrared fluorescence-based surgical navigation system (SNS) imaging software, which has been developed by our research group, is presented for SLN detection surgery in this paper. The software is based on the fluorescence-based surgical navigation hardware system (SNHS) which has been developed in our lab, and is designed specifically for intraoperative imaging and postoperative data analysis. The surgical navigation imaging software consists of the following main modules: the control module, the image grabbing module, the real-time display module, the data saving module, and the image processing module. Several algorithms have been designed to achieve the required performance, for example an image registration algorithm based on correlation matching. Some of the key features of the software include: setting the control parameters of the SNS; acquiring, displaying and storing the intraoperative imaging data automatically in real time; and analysis and processing of the saved image data. The developed software has been used to successfully detect the SLNs in 21 breast cancer patients. In the near future, we plan to improve the software performance, and it will be used extensively for clinical purposes.
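The paper names a correlation-matching registration algorithm without giving its implementation; as a hedged sketch of one simple way to realize correlation matching, the example below uses OpenCV's normalized cross-correlation template matching to recover a translation offset between a fluorescence frame and a reference frame. The function names and the margin parameter are assumptions.

```python
# Sketch of correlation-based registration via normalized cross-correlation:
# find where a central patch of the moving image best matches the reference
# image and derive a translation offset. The actual SNS algorithm may differ.
import cv2
import numpy as np

def translation_by_correlation(reference_gray, moving_gray, margin=40):
    # Use the central region of the moving image as the template.
    h, w = moving_gray.shape
    template = moving_gray[margin:h - margin, margin:w - margin]
    scores = cv2.matchTemplate(reference_gray, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    dx = best_loc[0] - margin            # shift that aligns moving -> reference
    dy = best_loc[1] - margin
    return (dx, dy), best_score

def align(reference_gray, moving_gray):
    (dx, dy), _ = translation_by_correlation(reference_gray, moving_gray)
    m = np.float32([[1, 0, dx], [0, 1, dy]])
    h, w = reference_gray.shape
    return cv2.warpAffine(moving_gray, m, (w, h))
```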
An overview of 3D software visualization.
Teyseyre, Alfredo R; Campo, Marcelo R
2009-01-01
Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. For many years, visualization in 2D space has been actively studied, but in the last decade, researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects such as visual representations, interaction issues, evaluation methods, and development tools. We also perform a survey of some representative tools that support different tasks, e.g., software maintenance and comprehension, requirements validation, and algorithm animation for educational purposes, among others. Finally, we conclude by identifying future research directions.
NASA Technical Reports Server (NTRS)
Boulanger, Richard; Overland, David
2004-01-01
Technologies that facilitate the design and control of complex, hybrid, and resource-constrained systems are examined. This paper focuses on design methodologies and system architectures, not on specific control methods that may be applied to life support subsystems. Honeywell and Boeing have estimated that 60-80% of the effort in developing complex control systems is software development, and only 20-40% is control system development. It has also been shown that large software projects have failure rates of as high as 50-65%. Concepts discussed include the Unified Modeling Language (UML) and design patterns, with the goal of creating a self-improving, self-documenting system design process. Successful architectures for control must not only facilitate hardware-to-software integration, but must also reconcile continuously changing software with much less frequently changing hardware. These architectures rely on software modules or components to facilitate change. Architecting such systems for change leverages the interfaces between these modules or components.
Cockpit Ocular Recording System (CORS)
NASA Technical Reports Server (NTRS)
Rothenheber, Edward; Stokes, James; Lagrossa, Charles; Arnold, William; Dick, A. O.
1990-01-01
The overall goal was the development of a Cockpit Ocular Recording System (CORS). Four tasks were used: (1) the development of the system; (2) the experimentation and improvement of the system; (3) demonstrations of the working system; and (4) system documentation. Overall, the prototype represents a workable and flexibly designed CORS system. For the most part, the hardware used for the prototype system is off-the-shelf. All of the following software was developed specifically: (1) setup software with which the user specifies the cockpit configuration and identifies possible areas in which the pilot will look; (2) sensing software which integrates the 60 Hz data from the oculometer and head orientation sensing unit; (3) processing software which applies a spatiotemporal filter to the lookpoint data to determine fixation/dwell positions; (4) data recording output routines; and (5) playback software which allows the user to retrieve and analyze the data. Several experiments were performed to verify the system accuracy and quantify system deficiencies. These tests resulted in recommendations for any future system that might be constructed.
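The report does not give the spatiotemporal filter's exact form; a common way to turn 60 Hz lookpoint samples into fixation/dwell positions is a dispersion-threshold filter, sketched below with hypothetical thresholds.

```python
# Hedged sketch of a dispersion-threshold fixation filter over 60 Hz gaze
# samples; the CORS thresholds and exact filter are not specified in the report.
def _dispersion(window):
    xs, ys = zip(*window)
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def fixations(samples, max_dispersion=1.5, min_duration_s=0.1, rate_hz=60.0):
    # samples: list of (x, y) lookpoints; thresholds here are hypothetical.
    min_len = int(min_duration_s * rate_hz)
    out, start = [], 0
    while start + min_len <= len(samples):
        end = start + min_len
        if _dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered.
            while end < len(samples) and _dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            xs, ys = zip(*samples[start:end])
            out.append((start, end, sum(xs) / len(xs), sum(ys) / len(ys)))
            start = end
        else:
            start += 1
    return out   # list of (start_idx, end_idx, centroid_x, centroid_y)
```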
Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft
2013-03-01
imaging spacecraft. The analysis is facilitated through the use of AGI's Systems Tool Kit (STK) software. An Analytic Hierarchy Process (AHP)-based...the Singapore-developed X-SAT imaging spacecraft.
Web Application Software for Ground Operations Planning Database (GOPDb) Management
NASA Technical Reports Server (NTRS)
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
2013-01-01
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
SU-E-P-05: Electronic Brachytherapy: A Physics Perspective On Field Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pai, S; Ayyalasomayajula, S; Lee, S
2015-06-15
Purpose: We want to summarize our experience implementing a successful program of electronic brachytherapy at several dermatology clinics with the help of cloud-based software to help us define the key program parameters and capture physics QA aspects. Optimally developed software helps the physicist in peer review and in qualifying the physical parameters. Methods: Using the XOFT™ Axxent™ electronic brachytherapy system in conjunction with cloud-based software, a process was set up to capture and record treatments. It was implemented initially at about 10 sites in California. For dosimetric purposes, the software facilitated storage of the physics parameters of surface applicators used in treatment and other source calibration parameters. In addition, the patient prescription, pathology and other setup considerations were input by the radiation oncologist and the therapist. This facilitated physics planning of the treatment parameters and also an independent check of the dwell time. From 2013-2014, nearly 1500 such calculations were completed by a group of physicists. A total of 800 patients with multiple lesions have been treated successfully during this period. The treatment log files have been uploaded and documented in the software, which facilitated physics peer review of treatments per the standards in place by AAPM and ACR. Results: The program model was implemented successfully at multiple sites. The cloud-based software allowed for proper peer review and compliance of the program at 10 clinical sites. Dosimetry was done on 800 patients and executed in a timely fashion to suit the clinical needs. Accumulated physics data in the software from the clinics allows for robust analysis and future development. Conclusion: Electronic brachytherapy implementation experience from a quality assurance perspective was greatly enhanced by using cloud-based software. The comprehensive database will pave the way for future developments to yield superior physics outcomes.
The Software Architecture of the Upgraded ESA DRAMA Software Suite
NASA Astrophysics Data System (ADS)
Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger
2013-08-01
In the beginnings of man's space flight activities, there was the belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to a rapid growth in the space debris environment, which has a big influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper will conclude with an outlook on the future development of the GUI framework, where the potential for advancements will be shown.
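The DRAMA GUI framework is Java-based; solely to illustrate the model-view-presenter separation it relies on (a presenter mediating between a passive view and the processing back-end), here is a minimal language-agnostic sketch in Python. The class names and the example calculation are invented.

```python
# Minimal model-view-presenter sketch illustrating the separation described
# above; it is not DRAMA code (the real framework is Java-based).
class OrbitModel:                      # processing back-end
    def perigee_km(self, apogee_radius_km, eccentricity):
        semi_major = apogee_radius_km / (1 + eccentricity)
        return semi_major * (1 - eccentricity)

class ConsoleView:                     # passive view: only collects and displays
    def get_inputs(self):
        return float(input("apogee radius [km]: ")), float(input("eccentricity: "))
    def show(self, text):
        print(text)

class Presenter:                       # mediates; holds no GUI or physics code
    def __init__(self, model, view):
        self.model, self.view = model, view
    def run(self):
        apogee, ecc = self.view.get_inputs()
        self.view.show(f"perigee radius: {self.model.perigee_km(apogee, ecc):.1f} km")

if __name__ == "__main__":
    Presenter(OrbitModel(), ConsoleView()).run()
```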
NASA Technical Reports Server (NTRS)
Lawrence, Stella
1991-01-01
The object of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study were those of SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software). There are ten models in SMERFS. For a first run, modeling the cumulative number of failures versus execution time gave fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made, and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in future predictions for these data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases. It is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.
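SMERFS bundles ten models whose internals are not described in the abstract; as a hedged illustration of the general approach (fit cumulative failures against execution time, then extrapolate), the sketch below fits one widely used reliability growth model, the Goel-Okumoto mean-value function mu(t) = a(1 - exp(-b t)), to hypothetical data with SciPy.

```python
# Illustration of reliability-growth fitting in the spirit described above,
# using the Goel-Okumoto NHPP model mu(t) = a * (1 - exp(-b t)). SMERFS itself
# implements several models; this is not its code, and the data are invented.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

# Hypothetical data: cumulative failures observed at weekly execution-time marks.
t_obs = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
n_obs = np.array([4, 9, 12, 15, 17, 18, 19, 20], dtype=float)

(a, b), _ = curve_fit(goel_okumoto, t_obs, n_obs, p0=(n_obs[-1], 0.1))
print(f"estimated total defects a = {a:.1f}, remaining = {a - n_obs[-1]:.1f}")
print("predicted cumulative failures at t = 12:", goel_okumoto(12.0, a, b))
```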
Programming support environment issues in the Byron programming environment
NASA Technical Reports Server (NTRS)
Larsen, Matthew J.
1986-01-01
Issues are discussed which programming support environments need to address in order to successfully support software engineering. These concerns are divided into two categories. The first category, issues of how software development is supported by an environment, includes support of the full life cycle, methodology flexibility, and support of software reusability. The second category contains issues of how environments should operate, such as tool reusability and integration, user friendliness, networking, and use of a central data base. This discussion is followed by an examination of Byron, an Ada based programming support environment developed at Intermetrics, focusing on the solutions Byron offers to these problems, including the support provided for software reusability and the test and maintenance phases of the life cycle. The use of Byron in project development is described briefly, and some suggestions for future Byron tools and user written tools are presented.
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the object-oriented paradigm (C++, CORBA). The software development process used was also based on the object-oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
Cassini Attitude Control Flight Software: from Development to In-Flight Operation
NASA Technical Reports Server (NTRS)
Brown, Jay
2008-01-01
The Cassini Attitude and Articulation Control Subsystem (AACS) Flight Software (FSW) has achieved its intended design goals by successfully guiding and controlling the Cassini-Huygens planetary mission to Saturn and its moons. This paper presents an overview of AACS FSW details from early design, development, implementation, and test through its fruition in operating and maintaining spacecraft control over an eleven-year prime mission. Starting from the phases of FSW development, topics expand to the FSW development methodology and achievements utilizing in-flight autonomy, and the paper summarizes lessons learned during flight operations that can be useful to FSW in current and future spacecraft missions.
Implementation of a production Ada project: The GRODY study
NASA Technical Reports Server (NTRS)
Godfrey, Sara; Brophy, Carolyn Elizabeth
1989-01-01
The use of the Ada language and design methodologies that encourage full use of its capabilities have a strong impact on all phases of the software development project life cycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The differences observed during the implementation, unit testing, and integration phases of the two projects are described and the lessons learned during the implementation phase of the Ada development are outlined. Included are recommendations for future Ada development projects.
An expert system based software sizing tool, phase 2
NASA Technical Reports Server (NTRS)
Friedlander, David
1990-01-01
A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not an expert in software engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge-based system, with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules, and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
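The report does not publish the sizing function's actual form or calibrated coefficients; the sketch below only conveys the shape of the idea, with hypothetical generic-component weights feeding a simple nonlinear (power-law) estimator of source lines.

```python
# Shape-of-the-idea sketch only: hypothetical weights and exponent, not the
# calibrated function from the study. Generic-component counts are combined
# into a weighted complexity score, then mapped nonlinearly to source lines.
HYPOTHETICAL_WEIGHTS = {"input_processing": 3.0, "computation": 5.0,
                        "data_store": 4.0, "report_output": 2.0}

def estimate_sloc(component_counts, scale=120.0, exponent=1.15):
    score = sum(HYPOTHETICAL_WEIGHTS[name] * count
                for name, count in component_counts.items())
    return scale * score ** exponent

print(round(estimate_sloc({"input_processing": 4, "computation": 6,
                           "report_output": 3})))
```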
Space Communication and Navigation SDR Testbed, Overview and Opportunity for Experiments
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.
2013-01-01
NASA has developed an experimental flight payload (referred to as the Space Communication and Navigation (SCAN) Test Bed) to investigate software defined radio (SDR) communications, networking, and navigation technologies operationally in the space environment. The payload consists of three software defined radios, each compliant with NASA's Space Telecommunications Radio System Architecture, a common software interface description standard for software defined radios. The software defined radios are new technology developments by NASA and industry partners, launched in 2012. The payload is externally mounted to the International Space Station truss to conduct experiments representative of future mission capability. Experiment operations include in-flight reconfiguration of the SDR waveform functions and payload networking software. The flight system will communicate with NASA's orbiting satellite relay network, the Tracking and Data Relay Satellite System, at both S-band and Ka-band, and with any Earth-based compatible S-band ground station. The system is available for experiments by industry, academia, and other government agencies to participate in the SDR technology assessments and standards advancements.
Software-Based Safety Systems in Space - Learning from other Domains
NASA Astrophysics Data System (ADS)
Klicker, M.; Putzer, H.
2012-01-01
Increasing complexity and new emerging capabilities for manned and unmanned missions have been the hallmark of the past decades of space exploration. One of the drivers in this process was the ever increasing use of software and software-intensive systems to implement system functions necessary to the capabilities needed. The course of technological evolution suggests that this development will continue well into the future, with a number of challenges for the safety community, some of which are discussed in this paper. The current state of the art reveals a number of problems with developing and assessing safety-critical software, which explains the reluctance of the space community to rely on software-based safety measures to mitigate hazards. Among the reasons usually cited are the lack of trustworthy evidence of software integrity in all foreseeable situations and the difficulty of integrating software into the traditional safety analysis framework. Experience from other domains and recent developments in modern software development methodologies and verification techniques are analysed for their suitability for space systems, and an avionics architectural framework (see STANAG 4626) for the implementation of safety-critical software is proposed. This is shown to create, among other features, the possibility of numerous degradation modes, enhancing overall system safety and interoperability of computerized space systems. It also potentially simplifies international cooperation on a technical level by introducing a higher degree of compatibility. As software safety cannot be tested or argued into a system in hindsight, the development process and especially the architecture chosen are essential to establish safety properties for the software used to implement safety functions. The core of the safety argument revolves around the separation of different functions and software modules from each other, achieved by minimal coupling of functions and credible separation mechanisms in the architecture, combined with rigorous development methodologies for the software itself.
Towards understanding software: 15 years in the SEL
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose
1990-01-01
For 15 years, the Software Engineering Laboratory (SEL) at GSFC has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software, and software processes within a production software environment. The SEL comprises three major organizations: (1) the GSFC Flight Dynamics Division; (2) the University of Maryland Computer Science Department; and (3) the Computer Sciences Corporation Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents: all describing some aspect of the software engineering technology that has undergone analysis in the flight dynamics environment. The studies range from small controlled experiments (such as analyzing the effectiveness of code reading versus functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The key findings that NASA feels have laid the foundation for ongoing and future software development and research activities are summarized.
Micros for the 1990's: An Update.
ERIC Educational Resources Information Center
Grosch, Audrey N.
1991-01-01
Discusses new hardware and software developments for microcomputers and considers strategies for future library microcomputing. Topics discussed include developments with Macintosh computers; the importance of local area networks (LANs); upgrading options for hardware; operating system upgrades; dynamic data exchange (DDE); microcomputer…
ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273
ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.
Flight Controller Software Protects Lightweight Flexible Aircraft
NASA Technical Reports Server (NTRS)
2015-01-01
Lightweight flexible aircraft may be the future of aviation, but a major problem is their susceptibility to flutter: uncontrollable vibrations that can destroy wings. Armstrong Flight Research Center awarded SBIR funding to Minneapolis, Minnesota-based MUSYN Inc. to develop software that helps program flight controllers to suppress flutter. The technology is now available for aircraft manufacturers and other industries that use equipment with automated controls.
Approaches and possible improvements in the area of multibody dynamics modeling
NASA Technical Reports Server (NTRS)
Lips, K. W.; Singh, R.
1987-01-01
A wide ranging look is taken at issues involved in the dynamic modeling of complex, multibodied orbiting space systems. Capabilities and limitations of two major codes (DISCOS, TREETOPS) are assessed and possible extensions to the CONTOPS software are outlined. In addition, recommendations are made concerning the direction future development should take in order to achieve higher fidelity, more computationally efficient multibody software solutions.
PySE: Software for extracting sources from radio images
NASA Astrophysics Data System (ADS)
Carbone, D.; Garsden, H.; Spreeuw, H.; Swinbank, J. D.; van der Horst, A. J.; Rowlinson, A.; Broderick, J. W.; Rol, E.; Law, C.; Molenaar, G.; Wijers, R. A. M. J.
2018-04-01
PySE is a Python software package for finding and measuring sources in radio telescope images. The software was designed to detect sources in the LOFAR telescope images, but can be used with images from other radio telescopes as well. We introduce the LOFAR Telescope, the context within which PySE was developed, the design of PySE, and describe how it is used. Detailed experiments on the validation and testing of PySE are then presented, along with results of performance testing. We discuss some of the current issues with the algorithms implemented in PySE and their interaction with LOFAR images, concluding with the current status of PySE and its future development.
Computing in high-energy physics
Mount, Richard P.
2016-05-31
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
Computing in high-energy physics
NASA Astrophysics Data System (ADS)
Mount, Richard P.
2016-04-01
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
Computing in high-energy physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mount, Richard P.
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
Automated Operations Development for Advanced Exploration Systems
NASA Technical Reports Server (NTRS)
Haddock, Angie; Stetson, Howard K.
2012-01-01
Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities and also provide single button intelligent functions for the crew. Development, operations and safety approval experience with the Timeliner system on-board the International Space Station (ISS), which provided autonomous monitoring with response and single command functionality of payload systems, can be built upon for future automated operations, as the ISS Payload effort was the first and only autonomous command and control system to be in continuous execution (6 years), 24 hours a day, 7 days a week within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System [1], along with the execution component design from within the HAL 9000 Space Operating System [2], this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed; this is the first step in verifying the HAL 9000 Integrated Test-Bed Component [2] design's effectiveness. This design paper will conclude with a summary of the current development status and future development goals as they pertain to automated command and control for the HDU.
Automated Operations Development for Advanced Exploration Systems
NASA Technical Reports Server (NTRS)
Haddock, Angie T.; Stetson, Howard
2012-01-01
Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities and also provide "single button" intelligent functions for the crew. Development, operations and safety approval experience with the Timeliner system onboard the International Space Station (ISS), which provided autonomous monitoring with response and single command functionality of payload systems, can be built upon for future automated operations, as the ISS Payload effort was the first and only autonomous command and control system to be in continuous execution (6 years), 24 hours a day, 7 days a week within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System, along with the execution component design from within the HAL 9000 Space Operating System, this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed; this is the first step in verifying the HAL 9000 Integrated Test-Bed Component [2] design's effectiveness. This design paper will conclude with a summary of the current development status and future development goals as they pertain to automated command and control for the HDU.
Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument
NASA Astrophysics Data System (ADS)
DuPlain, Ron; Ransom, Scott; Demorest, Paul; Brandt, Patrick; Ford, John; Shelton, Amy L.
2008-08-01
The National Radio Astronomy Observatory (NRAO) is launching the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), a prototype flexible digital signal processor designed for pulsar observations with the Robert C. Byrd Green Bank Telescope (GBT). GUPPI uses field programmable gate array (FPGA) hardware and design tools developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. The NRAO has been concurrently developing GUPPI software and hardware using minimal software resources. The software handles instrument monitor and control, data acquisition, and hardware interfacing. GUPPI is currently an expert-only spectrometer, but supports future integration with the full GBT production system. The NRAO was able to take advantage of the unique flexibility of the CASPER FPGA hardware platform, develop hardware and software in parallel, and build a suite of software tools for monitoring, controlling, and acquiring data with a new instrument over a short timeline of just a few months. The NRAO interacts regularly with CASPER and its users, and GUPPI stands as an example of what reconfigurable computing and open-source development can do for radio astronomy. GUPPI is modular for portability, and the NRAO provides the results of development as an open-source resource.
Jha, Ashish Kumar
2015-01-01
Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires the use of online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the option to estimate GFR by the plasma sampling method as well as by SrCrM. We used Microsoft Windows® as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access® as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine are done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. This software also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This is user-friendly software to calculate the GFR by various plasma sampling methods and blood parameters. This software is also a good system for storing the raw and processed data for future analysis.
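Of the serum creatinine methods named above, the Cockcroft-Gault and bedside Schwartz formulas are standard published estimates and can be sketched directly; the snippet below illustrates those two formulas only, not the authors' Visual Basic implementation or Russell's plasma sampling calculation.

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def schwartz_pediatric(height_cm, serum_creatinine_mg_dl, k=0.413):
    """Bedside Schwartz GFR estimate (mL/min/1.73 m^2) for children;
    k = 0.413 is the commonly cited updated constant."""
    return k * height_cm / serum_creatinine_mg_dl

print(round(cockcroft_gault(60, 72, 1.1, female=True), 1))
print(round(schwartz_pediatric(140, 0.6), 1))
```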
Future Standardization of Space Telecommunications Radio System with Core Flight System
NASA Technical Reports Server (NTRS)
Hickey, Joseph P.; Briones, Janette C.; Roche, Rigoberto; Handler, Louis M.; Hall, Steven
2016-01-01
NASA Glenn Research Center (GRC) is integrating the NASA Space Telecommunications Radio System (STRS) Standard with the Core Flight System (cFS). The STRS standard provides a common, consistent framework to develop, qualify, operate and maintain complex, reconfigurable and reprogrammable radio systems. The cFS is a flexible, open architecture that features a plug-and-play software executive called the Core Flight Executive (cFE), a reusable library of software components for flight and space missions and an integrated tool suite. Together, STRS and cFS create a development environment that allows STRS-compliant applications to reference the STRS APIs through the cFS infrastructure. These APIs are used to standardize the communication protocols on NASA's space SDRs. The cFE-STRS Operating Environment (OE) is a portable cFS library, which adds the ability to run STRS applications on existing cFS platforms. The purpose of this paper is to discuss the cFE-STRS OE prototype, preliminary experimental results performed using the Advanced Space Radio Platform (ASRP), the GRC S-band Ground Station and the SCaN (Space Communication and Navigation) Testbed currently flying onboard the International Space Station. Additionally, this paper presents a demonstration of the Consultative Committee for Space Data Systems (CCSDS) Spacecraft Onboard Interface Services (SOIS) using electronic data sheets inside cFE. This configuration allows the data sheets to specify binary formats for data exchange between STRS applications. The integration of STRS with cFS leverages mission-proven platform functions and mitigates barriers to integration with future missions. This reduces flight software development time and the costs of software-defined radio (SDR) platforms. Furthermore, the combined benefits of STRS standardization with the flexibility of cFS provide an effective, reliable and modular framework to minimize software development efforts for spaceflight missions.
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
Generating Safety-Critical PLC Code From a High-Level Application Software Specification
NASA Technical Reports Server (NTRS)
2008-01-01
The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
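As a rough illustration of the tabular-spec idea described above, the sketch below translates an invented condition/action table into IEC 61131-3 Structured Text. The table columns, keywords, tag names, and generated output are all hypothetical; the actual LCS domain-specific language and its ladder-logic output are not reproduced here.

```python
# Illustrative sketch of a tabular-spec-to-PLC translator. The table format and
# the generated Structured Text are invented for illustration only.

TABULAR_SPEC = [
    # (step, condition, action)
    ("OpenVentValve",  "TankPressure > 120.0", "VentValveCmd := TRUE;"),
    ("CloseVentValve", "TankPressure < 100.0", "VentValveCmd := FALSE;"),
]

def generate_structured_text(rows):
    lines = ["(* auto-generated from tabular spec *)"]
    for step, condition, action in rows:
        lines.append(f"(* {step} *)")
        lines.append(f"IF {condition} THEN")
        lines.append(f"    {action}")
        lines.append("END_IF;")
    return "\n".join(lines)

print(generate_structured_text(TABULAR_SPEC))
```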
Demonstration the Class, Object and Inheritance Concepts by Software
ERIC Educational Resources Information Center
Udvaros, József; Gubán, Miklós
2016-01-01
The world all around us is rapidly developing. We are witnessing the rapid evolution of technology and communication. This means new challenges and responsibilities to future strategies and attitudes. Today's operating systems and development environments apply the principle of OOP; therefore today's developments are inconceivable without the…
An overview of very high level software design methods
NASA Technical Reports Server (NTRS)
Asdjodi, Maryam; Hooper, James W.
1988-01-01
Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.
Topalov, Angel A; Katsounaros, Ioannis; Meier, Josef C; Klemm, Sebastian O; Mayrhofer, Karl J J
2011-11-01
This paper describes a system for performing electrochemical catalyst testing where all hardware components are controlled simultaneously using a single LabVIEW-based software application. The software that we developed can be operated in both manual mode for exploratory investigations and automatic mode for routine measurements, by using predefined execution procedures. The latter enables the execution of high-throughput or combinatorial investigations, which decrease substantially the time and cost for catalyst testing. The software was constructed using a modular architecture which simplifies the modification or extension of the system, depending on future needs. The system was tested by performing stability tests of commercial fuel cell electrocatalysts, and the advantages of the developed system are discussed. © 2011 American Institute of Physics
NASA Technical Reports Server (NTRS)
Albus, James S.; Mccain, Harry G.; Lumia, Ronald
1989-01-01
The document describes the NASA Standard Reference Model (NASREM) Architecture for the Space Station Telerobot Control System. It defines the functional requirements and high-level specifications of the control system for the NASA Space Station Flight Telerobot Servicer, serving as the functional specification document and as a guideline for the development of the control system architecture of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitate software design, development, validation, and test, and make possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.
Development of Shanghai satellite laser ranging station
NASA Technical Reports Server (NTRS)
Yang, Fu-Min; Tan, De-Tong; Xiao, Chi-Kun; Chen, Wan-Zhen; Zhang, J.-H.; Zhang, Z.-P.; Lu, Wen-Hu; Hu, Z.-Q.; Tang, W.-F.; Chen, J.-P.
1993-01-01
The topics covered include the following: improvement of the system hardware; upgrading of the software; the observation status; preliminary daylight tracking capability; testing the new type of laser; and future plans.
Space shuttle onboard navigation console expert/trainer system
NASA Technical Reports Server (NTRS)
Wang, Lui; Bochsler, Dan
1987-01-01
A software system for use in enhancing operational performance as well as training ground controllers in monitoring onboard Space Shuttle navigation sensors is described. The Onboard Navigation (ONAV) development reflects a trend toward following a structured and methodical approach to development. The ONAV system must deal with integrated conventional and expert system software, complex interfaces, and implementation limitations due to the target operational environment. An overview of the onboard navigation sensor monitoring function is presented, along with a description of guidelines driving the development effort, requirements that the system must meet, current progress, and future efforts.
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed. As a result, the recommendations might serve as a reference for future software maintenance and development. This paper discusses software Capability Level (CL) with a case study on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation is done in three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of software capability level for the UMM-USU software, the capability level for the observed process areas is in the range of CL1 to CL2. Project Planning (PP) is the only process area which reaches capability level 2, while PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software. Therefore, this study proposes several recommendations for UMM-USU to improve the capability level for the observed process areas.
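A highly simplified sketch of rolling generic-practice ratings up to a capability level is given below. The rating scheme, practice set, and rollup rule are assumptions for illustration and do not reproduce the SCAMPI C appraisal method.

```python
# Simplified illustration of rolling practice ratings up to a capability level.
# The thresholds and rule here are assumptions, not the official SCAMPI C method.

def capability_level(generic_practice_ratings):
    """ratings: dict like {'GP 2.1': True, ...} meaning 'largely implemented'."""
    level1 = generic_practice_ratings.get("GP 1.1", False)
    level2 = level1 and all(generic_practice_ratings.get(gp, False)
                            for gp in ("GP 2.1", "GP 2.2", "GP 2.3"))
    return 2 if level2 else (1 if level1 else 0)

ratings = {
    "PP":   {"GP 1.1": True, "GP 2.1": True,  "GP 2.2": True,  "GP 2.3": True},
    "PMC":  {"GP 1.1": True, "GP 2.1": False, "GP 2.2": True,  "GP 2.3": False},
    "REQM": {"GP 1.1": True, "GP 2.1": True,  "GP 2.2": False, "GP 2.3": False},
}
for area, r in ratings.items():
    print(area, "-> CL", capability_level(r))
```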
Integrating interface slicing into software engineering processes
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.
Engineering and Software Engineering
NASA Astrophysics Data System (ADS)
Jackson, Michael
The phrase 'software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on "the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering." Yet after forty years of currency the phrase 'software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.
Use of Field Programmable Gate Array Technology in Future Space Avionics
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.; Tate, Robert
2005-01-01
Fulfilling NASA's new vision for space exploration requires the development of sustainable, flexible and fault tolerant spacecraft control systems. The traditional development paradigm consists of the purchase or fabrication of hardware boards with fixed processor and/or Digital Signal Processing (DSP) components interconnected via a standardized bus system. This is followed by the purchase and/or development of software. This paradigm has several disadvantages for the development of systems to support NASA's new vision. Building a system to be fault tolerant increases the complexity and decreases the performance of included software. Standard bus design and conventional implementation produces natural bottlenecks. Configuring hardware components in systems containing common processors and DSPs is difficult initially and expensive or impossible to change later. The existence of Hardware Description Languages (HDLs), the recent increase in performance, density and radiation tolerance of Field Programmable Gate Arrays (FPGAs), and Intellectual Property (IP) Cores provides the technology for reprogrammable Systems on a Chip (SOC). This technology supports a paradigm better suited for NASA's vision. Hardware and software production are melded for more effective development; they can both evolve together over time. Designers incorporating this technology into future avionics can benefit from its flexibility. Systems can be designed with improved fault isolation and tolerance using hardware instead of software. Also, these designs can be protected from obsolescence problems where maintenance is compromised via component and vendor availability. To investigate the flexibility of this technology, the core of the Central Processing Unit and Input/Output Processor of the Space Shuttle AP101S Computer were prototyped in Verilog HDL and synthesized into an Altera Stratix FPGA.
Future Standardization of Space Telecommunications Radio System with Core Flight System
NASA Technical Reports Server (NTRS)
Briones, Janette C.; Hickey, Joseph P.; Roche, Rigoberto; Handler, Louis M.; Hall, Charles S.
2016-01-01
NASA Glenn Research Center (GRC) is integrating the NASA Space Telecommunications Radio System (STRS) Standard with the Core Flight System (cFS), an avionics software operating environment. The STRS standard provides a common, consistent framework to develop, qualify, operate and maintain complex, reconfigurable and reprogrammable radio systems. The cFS is a flexible, open architecture that features a plug-and-play software executive called the Core Flight Executive (cFE), a reusable library of software components for flight and space missions and an integrated tool suite. Together, STRS and cFS create a development environment that allows STRS-compliant applications to reference the STRS application programmer interfaces (APIs) that use the cFS infrastructure. These APIs are used to standardize the communication protocols on NASA's space SDRs. The cFS-STRS Operating Environment (OE) is a portable cFS library, which adds the ability to run STRS applications on existing cFS platforms. The purpose of this paper is to discuss the cFS-STRS OE prototype, preliminary experimental results performed using the Advanced Space Radio Platform (ASRP), the GRC S-band Ground Station and the SCaN (Space Communication and Navigation) Testbed currently flying onboard the International Space Station (ISS). Additionally, this paper presents a demonstration of the Consultative Committee for Space Data Systems (CCSDS) Spacecraft Onboard Interface Services (SOIS) using electronic data sheets (EDS) inside cFE. This configuration allows the data sheets to specify binary formats for data exchange between STRS applications. The integration of STRS with cFS leverages mission-proven platform functions and mitigates barriers to integration with future missions. This reduces flight software development time and the costs of software-defined radio (SDR) platforms. Furthermore, the combined benefits of STRS standardization with the flexibility of cFS provide an effective, reliable and modular framework to minimize software development efforts for spaceflight missions.
Trauma Pod/Operating Room of the Future
2006-02-01
Only fragments of the abstract are available. The recovered text mentions C++ objects, the OpenBinder software provided by ORNL, the OSCAR package (used by developers at the University of Texas, ORNL, NASA/Ames, and NASA/JSC), the RRGKinematix component, a 70 mm distance to the last DH frame (at the wrist), and joint position travel limits enforced in software as specified by ORNL.
A software system for the simulation of chest lesions
NASA Astrophysics Data System (ADS)
Ryan, John T.; McEntee, Mark; Barrett, Saoirse; Evanoff, Michael; Manning, David; Brennan, Patrick
2007-03-01
We report on the development of a novel software tool for the simulation of chest lesions. This software tool was developed for use in our study to attain optimal ambient lighting conditions for chest radiology. This study involved 61 consultant radiologists from the American Board of Radiology. Because of its success, we intend to use the same tool for future studies. The software has two main functions: the simulation of lesions and retrieval of information for ROC (Receiver Operating Characteristic) and JAFROC (Jack-Knife Free Response ROC) analysis. The simulation layer operates by randomly selecting an image from a bank of reportedly normal chest x-rays. A random location is then generated for each lesion, which is checked against a reference lung-map. If the location is within the lung fields, as derived from the lung-map, a lesion is superimposed. Lesions are also randomly selected from a bank of manually created chest lesion images. A blending algorithm determines which are the best intensity levels for the lesion to sit naturally within the chest x-ray. The same software was used to run a study for all 61 radiologists. A sequence of images is displayed in random order. Half of these images had simulated lesions, ranging from subtle to obvious, and half of the images were normal. The operator then selects locations where he/she thinks lesions exist and grades the lesion accordingly. We have found that this software was very effective in this study and intend to use the same principles for future studies.
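The simulation loop described above (select a normal image, draw a random location, verify it against the lung map, then blend the lesion) can be sketched as follows. The toy data and the alpha-blend rule are assumptions; the study's actual blending algorithm is not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lesion(chest_image, lung_map, lesion, alpha=0.6, max_tries=100):
    """Superimpose a small lesion patch at a random location inside the lung
    fields. lung_map is a boolean mask; the alpha blend is an illustrative
    choice, not the study's blending algorithm."""
    h, w = lesion.shape
    for _ in range(max_tries):
        y = rng.integers(0, chest_image.shape[0] - h)
        x = rng.integers(0, chest_image.shape[1] - w)
        if lung_map[y:y + h, x:x + w].all():      # location fully inside lungs
            out = chest_image.copy()
            patch = out[y:y + h, x:x + w]
            out[y:y + h, x:x + w] = (1 - alpha) * patch + alpha * lesion
            return out, (y, x)
    raise RuntimeError("no valid lung location found")

# Toy data: a flat background, a circular 'lung' region, a bright blob lesion.
image = np.full((256, 256), 100.0)
yy, xx = np.mgrid[:256, :256]
lungs = (yy - 128) ** 2 + (xx - 128) ** 2 < 80 ** 2
gy, gx = np.mgrid[:15, :15]
blob = 180.0 * np.exp(-((gy - 7) ** 2 + (gx - 7) ** 2) / 20.0)

result, loc = simulate_lesion(image, lungs, blob)
print("lesion placed at", loc)
```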
Global Software Development with Cloud Platforms
NASA Astrophysics Data System (ADS)
Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya
Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to supporting an ecosystem of clients, developers and other key stakeholders.
Bigger data, collaborative tools and the future of predictive drug discovery
NASA Astrophysics Data System (ADS)
Ekins, Sean; Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.
2014-10-01
Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software as a service commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as it accumulates from high throughput screening and enables the user to draw insights, enable predictions and move projects forward. We now discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use some examples from our own research on neglected diseases, collaborations, mobile apps and algorithm development to illustrate these ideas.
NASA Technical Reports Server (NTRS)
Mallasch, Paul G.; Babic, Slavoljub
1994-01-01
The United States Air Force (USAF) provides NASA Lewis Research Center with monthly reports containing the Synchronous Satellite Catalog and the associated Two Line Mean Element Sets. The USAF Synchronous Satellite Catalog supplies satellite orbital parameters collected by an automated monitoring system and provided to Lewis Research Center as text files on magnetic tape. Software was developed to facilitate automated formatting, data normalization, cross-referencing, and error correction of Synchronous Satellite Catalog files before loading into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). This document contains the User's Guide and Software Maintenance Manual with information necessary for installation, initialization, start-up, operation, error recovery, and termination of the software application. It also contains implementation details, modification aids, and software source code adaptations for use in future revisions.
Architecture for interoperable software in biology.
Bare, James Christopher; Baliga, Nitin S
2014-07-01
Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.
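As an illustration of how the five structures named above might be exchanged between tools, the sketch below encodes each of them as JSON. The field names and layout are assumptions for this sketch, not a format defined by the paper.

```python
import json

# Illustrative encodings of the five interchange structures named above.
# The JSON layout is an assumption for this sketch, not a published format.
payloads = {
    "list":   ["BRCA1", "TP53", "EGFR"],
    "tuple":  {"gene": "TP53", "fold_change": 2.4, "p_value": 0.003},
    "matrix": {"rows": ["s1", "s2"], "cols": ["g1", "g2"],
               "values": [[1.2, 0.4], [0.9, 2.1]]},
    "table":  {"columns": ["gene", "cluster"],
               "rows": [["BRCA1", 1], ["EGFR", 2]]},
    "network": {"nodes": ["A", "B", "C"],
                "edges": [["A", "B"], ["B", "C"]]},
}

message = json.dumps(payloads, indent=2)        # what one tool would emit
decoded = json.loads(message)                   # what another tool would read
print(sorted(decoded.keys()))
```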
The instrumental genesis process in future primary teachers using Dynamic Geometry Software
NASA Astrophysics Data System (ADS)
Ruiz-López, Natalia
2018-05-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.
A Survey of Middleware for Sensor and Network Virtualization
Khalid, Zubair; Fisal, Norsheila; Rozaini, Mohd.
2014-01-01
Wireless Sensor Network (WSN) is leading to a new paradigm of Internet of Everything (IoE). WSNs have a wide range of applications but are usually deployed for a particular application. However, the future of WSNs lies in the aggregation and allocation of resources, serving diverse applications. WSN virtualization by middleware is an emerging concept that enables aggregation of multiple independent heterogeneous devices, networks, radios and software platforms, and enhances application development. Middleware for WSN virtualization can further be categorized into sensor virtualization and network virtualization. Middleware for WSN virtualization poses several challenges like efficient decoupling of networks, devices and software. In this paper efforts have been put forward to bring an overview of the previous and current middleware designs for WSN virtualization, the design goals, software architectures, abstracted services, testbeds and programming techniques. Furthermore, the paper also presents the proposed model, challenges and future opportunities for further research in the middleware designs for WSN virtualization. PMID:25615737
A survey of middleware for sensor and network virtualization.
Khalid, Zubair; Fisal, Norsheila; Rozaini, Mohd
2014-12-12
Wireless Sensor Network (WSN) is leading to a new paradigm of Internet of Everything (IoE). WSNs have a wide range of applications but are usually deployed for a particular application. However, the future of WSNs lies in the aggregation and allocation of resources, serving diverse applications. WSN virtualization by middleware is an emerging concept that enables aggregation of multiple independent heterogeneous devices, networks, radios and software platforms, and enhances application development. Middleware for WSN virtualization can further be categorized into sensor virtualization and network virtualization. Middleware for WSN virtualization poses several challenges like efficient decoupling of networks, devices and software. In this paper efforts have been put forward to bring an overview of the previous and current middleware designs for WSN virtualization, the design goals, software architectures, abstracted services, testbeds and programming techniques. Furthermore, the paper also presents the proposed model, challenges and future opportunities for further research in the middleware designs for WSN virtualization.
Cranswick, Lachlan Michael David
2008-01-01
The history of crystallographic computing and use of crystallographic software is one which traces the escape from the drudgery of manual human calculations to a world where the user delegates most of the travail to electronic computers. In practice, this involves practising crystallographers communicating their thoughts to the crystallographic program authors, in the hope that new procedures will be implemented within their software. Against this background, the development of small-molecule single-crystal and powder diffraction software is traced. Starting with the analogue machines and the use of Hollerith tabulators of the late 1930's, it is shown that computing developments have been science led, with new technologies being harnessed to solve pressing crystallographic problems. The development of software is also traced, with a final caution that few of the computations now performed daily are really understood by the program users. Unless a sufficient body of people continues to dismantle and re-build programs, the knowledge encoded in the old programs will become as inaccessible as the knowledge of how to build the Great Pyramid at Giza.
Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets
Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi
2013-01-01
Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368
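The scheduling point above (measuring thousands of transitions by restricting each to a retention-time window) can be illustrated with a toy calculation of how many transitions are active at a given moment. The peptides, retention times, and window width below are invented numbers.

```python
# Toy check of MRM scheduling load: how many transitions are monitored at once,
# given each peptide's retention time and an acquisition window. All values are
# invented for illustration.

transitions = [
    # (peptide, retention_time_min, n_transitions)
    ("PEPTIDEK", 12.4, 3),
    ("SAMPLER",  12.9, 3),
    ("TESTPEPK", 25.0, 4),
]
WINDOW_MIN = 2.0  # total scheduling window centred on each retention time

def concurrent_transitions(t, rows, window=WINDOW_MIN):
    return sum(n for _, rt, n in rows if abs(rt - t) <= window / 2.0)

for t in (12.5, 13.5, 25.0):
    print(f"t = {t:5.1f} min: {concurrent_transitions(t, transitions)} transitions active")
```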
Software reliability perspectives
NASA Technical Reports Server (NTRS)
Wilson, Larry; Shen, Wenhui
1987-01-01
Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering nor fault tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.
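The point about the random nature of debugging data can be illustrated with a small simulation: the same underlying fault process, sampled repeatedly, yields noticeably different estimates of total fault content when a simple exponential (Goel-Okumoto-style) mean-value curve is fitted. The model choice, parameters, and fitting procedure below are illustrative assumptions, not the models examined in the paper.

```python
# Illustration of noisy debugging data: identical ground truth, ten simulated
# debugging histories, ten different estimates of total fault content.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
TRUE_FAULTS, TRUE_RATE = 100, 0.05   # assumed ground truth for the simulation

def mean_value(t, a, b):
    """Exponential mean-value curve: expected cumulative faults by time t."""
    return a * (1.0 - np.exp(-b * t))

estimates = []
for _ in range(10):
    # Each fault is detected at an exponential time; observe only 40 weeks.
    detect_times = rng.exponential(1.0 / TRUE_RATE, size=TRUE_FAULTS)
    t_grid = np.arange(1, 41)
    cum_detected = np.array([(detect_times <= t).sum() for t in t_grid])
    (a_hat, _), _ = curve_fit(mean_value, t_grid, cum_detected,
                              p0=(50.0, 0.1), maxfev=10000)
    estimates.append(a_hat)

print("estimated total faults across 10 replicates:",
      [round(a) for a in estimates])
```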
Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research
Degenhart, Alan D.; Kelly, John W.; Ashmore, Robin C.; Collinger, Jennifer L.; Tyler-Kabara, Elizabeth C.; Weber, Douglas J.; Wang, Wei
2011-01-01
This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development. PMID:21687575
Craniux: a LabVIEW-based modular software framework for brain-machine interface research.
Degenhart, Alan D; Kelly, John W; Ashmore, Robin C; Collinger, Jennifer L; Tyler-Kabara, Elizabeth C; Weber, Douglas J; Wang, Wei
2011-01-01
This paper presents "Craniux," an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.
STRS Radio Service Software for NASA's SCaN Testbed
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.
2012-01-01
NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.
STRS Radio Service Software for NASA's SCaN Testbed
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.
2013-01-01
NASA's Space Communication and Navigation(SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.
Advanced Structural Optimization Under Consideration of Cost Tracking
NASA Astrophysics Data System (ADS)
Zell, D.; Link, T.; Bickelmaier, S.; Albinger, J.; Weikert, S.; Cremaschi, F.; Wiegand, A.
2014-06-01
In order to improve the design process of launcher configurations in the early development phase, the software Multidisciplinary Optimization (MDO) was developed. The tool combines efficient software tools such as Optimal Design Investigations (ODIN) for structural optimization and Aerospace Trajectory Optimization Software (ASTOS) for trajectory and vehicle design optimization for a defined payload and mission. The present paper focuses on the integration and validation of ODIN. ODIN enables the user to optimize typical axisymmetric structures by sizing the stiffening designs for strength and stability while minimizing the structural mass. In addition, a fully automatic finite element model (FEM) generator module creates ready-to-run FEM models of a complete stage or launcher assembly. Cost tracking and possible future improvements concerning cost optimization are indicated.
A LabVIEW® based generic CT scanner control software platform.
Dierick, M; Van Loo, D; Masschaele, B; Boone, M; Van Hoorebeke, L
2010-01-01
UGCT, the Centre for X-ray tomography at Ghent University (Belgium) does research on X-ray tomography and its applications. This includes the development and construction of state-of-the-art CT scanners for scientific research. Because these scanners are built for very different purposes they differ considerably in their physical implementations. However, they all share common principle functionality. In this context a generic software platform was developed using LabVIEW® in order to provide the same interface and functionality on all scanners. This article describes the concept and features of this software, and its potential for tomography in a research setting. The core concept is to rigorously separate the abstract operation of a CT scanner from its actual physical configuration. This separation is achieved by implementing a sender-listener architecture. The advantages are that the resulting software platform is generic, scalable, highly efficient, easy to develop and to extend, and that it can be deployed on future scanners with minimal effort.
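The core design idea of this platform, separating abstract scanner operation from the physical hardware through a sender-listener split, can be sketched outside LabVIEW as well. The Python sketch below is a hypothetical illustration of that decoupling; the class and message names are invented and are not taken from the UGCT software.

```python
# Minimal sender-listener (publish/subscribe) sketch separating abstract
# scanner commands from the hardware drivers that realize them.
# All class and topic names are illustrative, not part of the UGCT platform.
from collections import defaultdict

class MessageBus:
    """Routes named messages from senders to registered listeners."""
    def __init__(self):
        self._listeners = defaultdict(list)

    def subscribe(self, topic, callback):
        self._listeners[topic].append(callback)

    def publish(self, topic, **payload):
        for callback in self._listeners[topic]:
            callback(**payload)

class ScannerController:
    """Issues abstract commands; knows nothing about motors or detectors."""
    def __init__(self, bus):
        self.bus = bus

    def acquire_projection(self, angle_deg):
        self.bus.publish("rotate_sample", angle=angle_deg)
        self.bus.publish("expose_detector", exposure_s=0.5)

class RotationStageDriver:
    """Hardware-specific listener; another scanner would register its own driver."""
    def __init__(self, bus):
        bus.subscribe("rotate_sample", self.on_rotate)

    def on_rotate(self, angle):
        print(f"moving rotation stage to {angle} deg")

class DetectorDriver:
    def __init__(self, bus):
        bus.subscribe("expose_detector", self.on_expose)

    def on_expose(self, exposure_s):
        print(f"reading out detector after {exposure_s} s exposure")

bus = MessageBus()
RotationStageDriver(bus)
DetectorDriver(bus)
ScannerController(bus).acquire_projection(angle_deg=90.0)
```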
Scout: high-performance heterogeneous computing made simple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice
2011-01-26
Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.
Cooperative Work and Sustainable Scientific Software Practices in R
NASA Astrophysics Data System (ADS)
Weber, N.
2013-12-01
Most scientific software projects are dependent on the work of many diverse people, institutions and organizations. Incentivizing these actors to cooperatively develop software that is both reliable and sustainable is complicated by the fact that the reward structures of these various actors greatly differ: research scientists want results from a software or model run in order to publish papers, produce new data, or test a hypothesis; software engineers and research centers want compilable, well documented code that is refactorable, reusable and reproducible in future research scenarios. While much research has been done on incentives and motivations for participating in open source software projects or cyberinfrastructure development, little work has been done on what motivates or incentivizes developers to maintain scientific software projects beyond their original application. This poster will present early results of research into the incentives and motivation for cooperative scientific software development. In particular, this work focuses on motivations for the maintenance and repair of libraries on the software platform R. Our work here uses a sample of R packages that were created by research centers, or are specific to earth, environmental and climate science applications. We first mined 'check' logs from the Comprehensive R Archive Network (CRAN) to determine the amount of time a package has existed, the number of versions it has gone through over this time, the number of releases, and finally the contact information for each official package 'maintainer'. We then sent a survey to each official maintainer, asking them questions about what role they played in developing the original package, and what their motivations were for sustaining the project over time. We will present early results from this mining and our survey of R maintainers.
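The per-package statistics the study mines, how long a package has existed, how many versions and releases it has had, and who maintains it, can be pictured with a small summary over a hypothetical release history. Everything below is invented for illustration, and nothing here queries CRAN itself.

```python
# Toy summary of one package's release history of the kind described above:
# age, number of versions, and average time between releases.
# Package versions, dates, and maintainer are invented for illustration.
from datetime import date

releases = {                      # version -> release date (hypothetical)
    "0.1-0": date(2009, 3, 14),
    "0.2-0": date(2010, 1, 5),
    "1.0-0": date(2011, 7, 22),
    "1.0-1": date(2013, 11, 2),
}

def summarize(releases, maintainer="A. Maintainer <a.maintainer@example.org>"):
    dates = sorted(releases.values())
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return {
        "maintainer": maintainer,
        "n_versions": len(releases),
        "age_days": (dates[-1] - dates[0]).days,
        "mean_days_between_releases": round(sum(gaps) / len(gaps), 1),
    }

print(summarize(releases))
```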
Ada training evaluation and recommendations from the Gamma Ray Observatory Ada Development Team
NASA Technical Reports Server (NTRS)
1985-01-01
The Ada training experiences of the Gamma Ray Observatory Ada development team are related, and recommendations are made concerning future Ada training for software developers. Training methods are evaluated, deficiencies in the training program are noted, and a recommended approach, including course outline, time allocation, and reference materials, is offered.
NASA Astrophysics Data System (ADS)
Downs, R. R.; Lenhardt, W. C.; Robinson, E.
2014-12-01
Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and, therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributors, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and the opportunities that science software presents for the sustainability of science. To gain insight on practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members were asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper will present an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.
Natural Computing: Its Impact on Software Development
2000-02-01
...user can develop new procedures by copying objects from documents and connecting them. These procedures can be saved for future use. Figure 27 shows...
The ORT Open Tech Robotics and Automation Literacy Course.
ERIC Educational Resources Information Center
Sharon, Dan; And Others
1987-01-01
Presents an overview of a course on robotics and automation developed by the Organization for Rehabilitation through Training (ORT) to be offered through an open learning environment in the United Kingdom. Highlights include hardware and software requirements, an educational model, design principles, and future developments. (LRW)
The Next Generation of Personal Computers.
ERIC Educational Resources Information Center
Crecine, John P.
1986-01-01
Discusses factors converging to create high-capacity, low-cost nature of next generation of microcomputers: a coherent vision of what graphics workstation and future computing environment should be like; hardware developments leading to greater storage capacity at lower costs; and development of software and expertise to exploit computing power…
Wang, Tongli; Hamann, Andreas; Spittlehouse, Dave; Carroll, Carlos
2016-01-01
Large volumes of gridded climate data have become available in recent years including interpolated historical data from weather stations and future predictions from general circulation models. These datasets, however, are at various spatial resolutions that need to be converted to scales meaningful for applications such as climate change risk and impact assessments or sample-based ecological research. Extracting climate data for specific locations from large datasets is not a trivial task and typically requires advanced GIS and data management skills. In this study, we developed a software package, ClimateNA, that facilitates this task and provides a user-friendly interface suitable for resource managers and decision makers as well as scientists. The software locally downscales historical and future monthly climate data layers into scale-free point estimates of climate values for the entire North American continent. The software also calculates a large number of biologically relevant climate variables that are usually derived from daily weather data. ClimateNA covers 1) 104 years of historical data (1901–2014) in monthly, annual, decadal and 30-year time steps; 2) three paleoclimatic periods (Last Glacial Maximum, Mid Holocene and Last Millennium); 3) three future periods (2020s, 2050s and 2080s); and 4) annual time-series of model projections for 2011–2100. Multiple general circulation models (GCMs) were included for both paleo and future periods, and two representative concentration pathways (RCP4.5 and 8.5) were chosen for future climate data. PMID:27275583
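The downscaling described here ultimately has to evaluate gridded layers at arbitrary point locations, and a standard building block for that is bilinear interpolation of the four surrounding grid cells. The sketch below shows only that generic step, with an invented grid and coordinates; it is not ClimateNA's actual algorithm.

```python
# Generic bilinear interpolation of a gridded climate layer at one point.
# Grid values, origin, and cell size are invented; this is only the basic
# interpolation step, not the ClimateNA downscaling procedure.
import numpy as np

def bilinear_point(grid, x0, y0, dx, dy, lon, lat):
    """Interpolate grid at (lon, lat); (x0, y0) is the coordinate of grid[0, 0]."""
    fx = (lon - x0) / dx
    fy = (lat - y0) / dy
    i, j = int(fy), int(fx)
    ty, tx = fy - i, fx - j
    return ((1 - ty) * (1 - tx) * grid[i, j]
            + (1 - ty) * tx * grid[i, j + 1]
            + ty * (1 - tx) * grid[i + 1, j]
            + ty * tx * grid[i + 1, j + 1])

# Toy 3x3 grid of mean January temperature (deg C) on a 0.5-degree grid.
tmin_jan = np.array([[-12.0, -11.5, -11.0],
                     [-10.0,  -9.5,  -9.0],
                     [ -8.0,  -7.5,  -7.0]])
print(bilinear_point(tmin_jan, x0=-120.0, y0=50.0, dx=0.5, dy=0.5,
                     lon=-119.7, lat=50.3))
```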
NASA Technical Reports Server (NTRS)
1988-01-01
Integrated Environments for Large, Complex Systems is the theme for the RICIS symposium of 1988. Distinguished professionals from industry, government, and academia have been invited to participate and present their views and experiences regarding research, education, and future directions related to this topic. Within RICIS, more than half of the research being conducted is in the area of Computer Systems and Software Engineering. The focus of this research is on the software development life-cycle for large, complex, distributed systems. Within the education and training component of RICIS, the primary emphasis has been to provide education and training for software professionals.
A New Look at NASA: Strategic Research In Information Technology
NASA Technical Reports Server (NTRS)
Alfano, David; Tu, Eugene (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.
Application of software technology to a future spacecraft computer design
NASA Technical Reports Server (NTRS)
Labaugh, R. J.
1980-01-01
A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.
A Platform-Independent Plugin for Navigating Online Radiology Cases.
Balkman, Jason D; Awan, Omer A
2016-06-01
Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.
NASA Technical Reports Server (NTRS)
Liaw, Morris; Evesson, Donna
1988-01-01
This is a manual for users of the Software Engineering and Ada Database (SEAD). SEAD was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities that are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce the duplication of effort while improving quality in the development of future software systems. The manual describes the organization of the data in SEAD, the user interface from logging in to logging out, and concludes with a ten-chapter tutorial on how to use the information in SEAD. Two appendices provide quick reference for logging into SEAD and using the keyboard of an IBM 3270 or VT100 computer terminal.
Status and plans for the future of the Vienna VLBI Software
NASA Astrophysics Data System (ADS)
Madzak, Matthias; Böhm, Johannes; Böhm, Sigrid; Girdiuk, Anastasiia; Hellerschmied, Andreas; Hofmeister, Armin; Krasna, Hana; Kwak, Younghee; Landskron, Daniel; Mayer, David; McCallum, Jamie; Plank, Lucia; Schönberger, Caroline; Shabala, Stanislav; Sun, Jing; Teke, Kamil
2016-04-01
The Vienna VLBI Software (VieVS) is a VLBI analysis software package developed and maintained at Technische Universität Wien (TU Wien) since 2008 with contributions from groups all over the world. It is used both for academic purposes in university courses and for providing VLBI analysis results to the geodetic community. Written in a modular structure in Matlab, VieVS offers easy access to the source code and the possibility to adapt the programs for particular purposes. The new version 2.3, released in December 2015, includes several new parameters to be estimated in the global solution, such as tidal ERP variation coefficients. The graphical user interface was slightly modified for improved user functionality and, e.g., the possibility of deriving baseline length repeatabilities. The scheduling of satellite observations was refined, and the simulator now includes the effect of source structure, which can also be corrected for in the analysis. This poster gives an overview of all VLBI-related activities in Vienna and provides an outlook on future plans concerning the Vienna VLBI Software.
System Re-engineering Project Executive Summary
1991-11-01
...Management Information System (STAMIS) application. This project involved reverse engineering, evaluation of structured design and object-oriented design, and re-implementation of the system in Ada. This executive summary presents the approach to re-engineering the system, the lessons learned while going through the process, and issues to be considered in future tasks of this nature. Keywords: Computer-Aided Software Engineering (CASE), Distributed Software, Ada, COBOL, Systems Analysis, Systems Design, Life Cycle Development, Functional Decomposition, Object-Oriented
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring for quality on software processes requires strong project management skills, well-built onshore-offshore coordination, and often needs regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.
Software Defined Radio Architecture Contributions to Next Generation Space Communications
NASA Technical Reports Server (NTRS)
Kacpura, Thomas J.; Eddy, Wesley M.; Smith, Carl R.; Liebetreu, John
2015-01-01
Space communications architecture concepts, comprising the elements of the system, the interactions among them, and the principles that govern their development, are essential factors in developing National Aeronautics and Space Administration (NASA) future exploration and science missions. Accordingly, vital architectural attributes encompass flexibility, the extensibility to insert future capabilities, and the ability to evolve to provide interoperability with other current and future systems. Space communications architectures and technologies for this century must satisfy a growing set of requirements, including those for Earth sensing, collaborative observation missions, robotic scientific missions, human missions for exploration of the Moon and Mars where surface activities require supporting communications, and in-space observatories for observing the Earth, as well as other star systems and the universe. An advanced, integrated communications infrastructure will enable the reliable, multipoint, high-data-rate capabilities needed on demand to provide continuous, maximum coverage for areas of concentrated activity. Importantly, the cost/value proposition of the future architecture must be an integral part of its design; an affordable and sustainable architecture is indispensable within anticipated future budget environments. Effective architecture design informs decision makers with insight into the capabilities needed to efficiently satisfy the demanding space-communication requirements of future missions and to formulate appropriate requirements. A driving requirement for the architecture is the extensibility to address new requirements and provide low-cost on-ramps for new capabilities insertion, ensuring graceful growth as new functionality and new technologies are infused into the network infrastructure. In addition to extensibility, another key architectural attribute is the space communication equipment's interoperability with other NASA communications systems, as well as those communications and navigation systems operated by international space agencies and civilian and government agencies. In this paper, we review the philosophies, technologies, architectural attributes, mission services, and communications capabilities that form the structure of candidate next-generation integrated communication architectures for space communications and navigation. A key area that this paper explores is the development and operation of the software defined radio for the NASA Space Communications and Navigation (SCaN) Testbed currently on the International Space Station (ISS). The lessons learned from development and operation feed back into the communications architecture. Leveraging the reconfigurability provides a change in the way that operations are done and must be considered. Quantifying the impact on the NASA Space Telecommunications Radio System (STRS) software defined radio architecture provides feedback to keep the standard useful and up to date. NASA is not the only customer of these radios. Software defined radios are developed for other applications, and taking advantage of these developments promotes an architecture that is cost effective and sustainable. Developments in areas such as an updated operating environment, higher data rates, networking, and security can be leveraged. The ability to sustain an architecture that uses radios for multiple markets can lower costs and keep new technology infused.
Software Engineering Laboratory (SEL) Ada performance study report
NASA Technical Reports Server (NTRS)
Booth, Eric W.; Stark, Michael E.
1991-01-01
The goals of the Ada Performance Study are described. The methods used are explained. Guidelines for future Ada development efforts are given. The goals and scope of the study are detailed, and the background of Ada development in the Flight Dynamics Division (FDD) is presented. The organization and overall purpose of each test are discussed. The purpose, methods, and results of each test and analyses of these results are given. Guidelines for future development efforts based on the analysis of results from this study are provided. The approach used on the performance tests is discussed.
1991-09-01
...involved in choosing hardware and software for CAI "are the lesson objectives and the future needs of the instructor and student" (18:6-2). And...did not cover the grammatical errors highlighted by the survey of subject-matter experts. Future research should include an expansion of, or...display any hypertext document. This tutorial covered basic English grammar concepts. Future research should address the possibilities of developing...
Moody, George B; Mark, Roger G; Goldberger, Ary L
2011-01-01
PhysioNet provides free web access to over 50 collections of recorded physiologic signals and time series, and related open-source software, in support of basic, clinical, and applied research in medicine, physiology, public health, biomedical engineering and computing, and medical instrument design and evaluation. Its three components (PhysioBank, the archive of signals; PhysioToolkit, the software library; and PhysioNetWorks, the virtual laboratory for collaborative development of future PhysioBank data collections and PhysioToolkit software components) connect researchers and students who need physiologic signals and relevant software with researchers who have data and software to share. PhysioNet's annual open engineering challenges stimulate rapid progress on unsolved or poorly solved questions of basic or clinical interest, by focusing attention on achievable solutions that can be evaluated and compared objectively using freely available reference data.
The Next Step: 25 Discoveries That Could Change Our Lives.
ERIC Educational Resources Information Center
Science85, 1985
1985-01-01
Describes (in separate articles) 25 developments in science, technology, and medicine that have potential impact on the near future. They include discoveries related to space butterflies, drugs, twenty-first century software, experimental mathematics, brain drugs, egg development, ultrasmall microchips, the biology of birth, cancer-causing genes,…
Minnesota Land Management Information Center
NASA Technical Reports Server (NTRS)
Nordstrand, E. A.
1981-01-01
A brief history of the Minnesota Land Management Information Center is given and the present operational status and plans for future development are described. The incorporation of LANDSAT data into the system, hardware and software capabilities, and funding are addressed.
Human-Automation Integration: Principle and Method for Design and Evaluation
NASA Technical Reports Server (NTRS)
Billman, Dorrit; Feary, Michael
2012-01-01
Future space missions will increasingly depend on integration of complex engineered systems with their human operators. It is important to ensure that the systems that are designed and developed do a good job of supporting the needs of the work domain. Our research investigates methods for needs analysis. We included analysis of work products (plans for regulation of the space station) as well as work processes (tasks using current software), in a case study of Attitude Determination and Control Officers (ADCO) planning work. This allows comparing how well different designs match the structure of the work to be supported. Redesigned planning software that better matches the structure of work was developed and experimentally assessed. The new prototype enabled substantially faster and more accurate performance in plan revision tasks. This success suggests the approach to needs assessment and use in design and evaluation is promising, and merits investigation in future research.
Distributed computing environments for future space control systems
NASA Technical Reports Server (NTRS)
Viallefont, Pierre
1993-01-01
The aim of this paper is to present the results of a CNES research project on distributed computing systems. The purpose of this research was to study the impact of the use of new computer technologies in the design and development of future space applications. The first part of this study was a state-of-the-art review of distributed computing systems. One of the interesting ideas arising from this review is the concept of a 'virtual computer' allowing the distributed hardware architecture to be hidden from a software application. The 'virtual computer' can improve system performance by adapting the best architecture (addition of computers) to the software application without having to modify its source code. This concept can also decrease the cost and obsolescence of the hardware architecture. In order to verify the feasibility of the 'virtual computer' concept, a prototype representative of a distributed space application is being developed independently of the hardware architecture.
NASA Astrophysics Data System (ADS)
Habibi, Ali
1993-01-01
The objective of this article is to present a discussion on the future of image data compression in the next two decades. It is virtually impossible to predict with any degree of certainty the breakthroughs in theory and developments, the milestones in advancement of technology and the success of the upcoming commercial products in the market place which will be the main factors in establishing the future stage to image coding. What we propose to do, instead, is look back at the progress in image coding during the last two decades and assess the state of the art in image coding today. Then, by observing the trends in developments of theory, software, and hardware coupled with the future needs for use and dissemination of imagery data and the constraints on the bandwidth and capacity of various networks, predict the future state of image coding. What seems to be certain today is the growing need for bandwidth compression. The television is using a technology which is half a century old and is ready to be replaced by high definition television with an extremely high digital bandwidth. Smart telephones coupled with personal computers and TV monitors accommodating both printed and video data will be common in homes and businesses within the next decade. Efficient and compact digital processing modules using developing technologies will make bandwidth compressed imagery the cheap and preferred alternative in satellite and on-board applications. In view of the above needs, we expect increased activities in development of theory, software, special purpose chips and hardware for image bandwidth compression in the next two decades. The following sections summarize the future trends in these areas.
Atlas : A library for numerical weather prediction and climate modelling
NASA Astrophysics Data System (ADS)
Deconinck, Willem; Bauer, Peter; Diamantakis, Michail; Hamrud, Mats; Kühnlein, Christian; Maciel, Pedro; Mengaldo, Gianmarco; Quintino, Tiago; Raoult, Baudouin; Smolarkiewicz, Piotr K.; Wedi, Nils P.
2017-11-01
The algorithms underlying numerical weather prediction (NWP) and climate models that have been developed in the past few decades face an increasing challenge caused by the paradigm shift imposed by hardware vendors towards more energy-efficient devices. In order to provide a sustainable path to exascale High Performance Computing (HPC), applications become increasingly restricted by energy consumption. As a result, the emerging diverse and complex hardware solutions have a large impact on the programming models traditionally used in NWP software, triggering a rethink of design choices for future massively parallel software frameworks. In this paper, we present Atlas, a new software library that is currently being developed at the European Centre for Medium-Range Weather Forecasts (ECMWF), with the scope of handling data structures required for NWP applications in a flexible and massively parallel way. Atlas provides a versatile framework for the future development of efficient NWP and climate applications on emerging HPC architectures. The applications range from full Earth system models, to specific tools required for post-processing weather forecast products. The Atlas library thus constitutes a step towards affordable exascale high-performance simulations by providing the necessary abstractions that facilitate the application in heterogeneous HPC environments by promoting the co-design of NWP algorithms with the underlying hardware.
Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Kula, K.R.; Paik, I.K.; Chung, D.Y.
1996-12-31
Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.
OntoSoft: A Software Registry for Geosciences
NASA Astrophysics Data System (ADS)
Garijo, D.; Gil, Y.
2017-12-01
The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.
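The kind of crowdsourced software-metadata record such a registry collects can be pictured as a small structured document. The field names below are generic descriptive-metadata fields chosen for illustration; they are not the actual OntoSoft ontology terms, and the package is fictitious.

```python
# Illustrative software-metadata record of the kind a geoscience software
# registry might hold; field names and values are invented examples, not the
# OntoSoft ontology.
import json

record = {
    "name": "ExampleHydroModel",                     # hypothetical package
    "description": "Rainfall-runoff model for small catchments",
    "version": "2.1.0",
    "license": "Apache-2.0",
    "programming_language": "Fortran",
    "authors": ["A. Researcher", "B. Developer"],
    "documentation": "https://example.org/hydromodel/docs",
    "preferred_citation": "Researcher & Developer (2016), ExampleHydroModel v2.1",
    "keywords": ["hydrology", "runoff", "surface water"],
}

print(json.dumps(record, indent=2))
```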
A survey of program slicing for software engineering
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
This research concerns program slicing, which is used as a tool for program maintenance of software systems. Program slicing decreases the level of effort required to understand and maintain complex software systems. It was first designed as a debugging aid, but it has since been generalized into various tools and extended to include program comprehension, module cohesion estimation, requirements verification, dead code elimination, and maintenance of several software systems, including reverse engineering, parallelization, portability, and reuse component generation. This paper seeks to address and define terminology, theoretical concepts, program representation, different program graphs, developments in static slicing, dynamic slicing, and semantics and mathematical models. Applications for conventional slicing are presented, along with a prognosis of future work in this field.
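As an informal illustration of the backward-slicing idea surveyed here, the sketch below determines which statements of a tiny straight-line program can affect a chosen variable by walking definition/use information backwards. It is a toy over hand-written def/use sets, not one of the slicing tools the survey covers, and it ignores control flow entirely.

```python
# Toy backward slice over a straight-line program using hand-written def/use
# sets. Real slicers work on program dependence graphs and handle control
# flow; this only shows the data-flow core of the idea.
program = [
    # (line, text,            defines, uses)
    (1, "n = int(input())",   {"n"},   set()),
    (2, "s = 0",              {"s"},   set()),
    (3, "p = 1",              {"p"},   set()),
    (4, "s = s + n",          {"s"},   {"s", "n"}),
    (5, "p = p * n",          {"p"},   {"p", "n"}),
    (6, "print(s)",           set(),   {"s"}),
]

def backward_slice(program, criterion_line, criterion_vars):
    relevant = set(criterion_vars)
    in_slice = set()
    for line, _text, defs, uses in reversed(program):
        if line > criterion_line:
            continue
        if defs & relevant or line == criterion_line:
            in_slice.add(line)
            relevant = (relevant - defs) | uses
    return sorted(in_slice)

# Slicing on the value printed at line 6 drops lines 3 and 5 (the product p).
print(backward_slice(program, criterion_line=6, criterion_vars={"s"}))  # [1, 2, 4, 6]
```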
Software engineering for ESO's VLT project
NASA Astrophysics Data System (ADS)
Filippi, G.
1994-12-01
This paper reports on the experience at the European Southern Observatory on the application of software engineering techniques to a 200 man-year control software project for the Very Large Telescope (VLT). This shall provide astronomers, before the end of the century, with one of the most powerful telescopes in the world. From the definition of the general model, described in the software management plan, specific activities have been and will be defined: standards for documents and for code development, design approach using a CASE tool, the process of reviewing both documentation and code, quality assurance, test strategy, etc. The initial choices, the current implementation and the future planned activities are presented and, where feedback is already available, pros and cons are discussed.
NASA Technical Reports Server (NTRS)
Villarreal, James A.
1991-01-01
A whole new arena of computer technologies is now beginning to form. Still in its infancy, neural network technology is a biologically inspired methodology which draws on nature's own cognitive processes. The Software Technology Branch has provided a software tool, Neural Execution and Training System (NETS), to industry, government, and academia to facilitate and expedite the use of this technology. NETS is written in the C programming language and can be executed on a variety of machines. Once a network has been debugged, NETS can produce a C source code which implements the network. This code can then be incorporated into other software systems. Described here are various software projects currently under development with NETS and the anticipated future enhancements to NETS and the technology.
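The export capability mentioned here, turning a trained network into standalone source code, can be sketched very compactly. The weights, layer sizes, and emitted function below are invented, and the sketch is written in Python for consistency with the other examples in this collection; it is a toy, not the NETS generator.

```python
# Toy version of the "emit source code that implements a trained network"
# idea: given fixed weights for a tiny 2-2-1 network, print a standalone
# C function evaluating it. Weights are invented; this is not NETS itself.
W1 = [[0.8, -0.4], [0.3, 0.9]]      # input -> hidden weights
B1 = [0.1, -0.2]                    # hidden biases
W2 = [[1.2, -0.7]]                  # hidden -> output weights
B2 = [0.05]                         # output bias

def emit_c(name="net_forward"):
    lines = [f"double {name}(const double x[2]) {{"]
    for j, (row, b) in enumerate(zip(W1, B1)):
        terms = " + ".join(f"{w:+.3f}*x[{i}]" for i, w in enumerate(row))
        lines.append(f"    double h{j} = tanh({terms} {b:+.3f});")
    out = " + ".join(f"{w:+.3f}*h{j}" for j, w in enumerate(W2[0]))
    lines.append(f"    return {out} {B2[0]:+.3f};")
    lines.append("}")
    return "\n".join(lines)

print(emit_c())
```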
SOA: A Quality Attribute Perspective
2011-06-23
Agenda: Service-Oriented Architecture and Software Architecture: Review; Service-Orientation and Quality Attributes; Summary and Future Challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pepper, Susan E.; Pickett, Chris A.; Queirolo, Al
The U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA) Next Generation Safeguards Initiative (NGSI) and the International Atomic Energy Agency (IAEA) convened a workshop on Software Sustainability for Safeguards Instrumentation in Vienna, Austria, May 6-8, 2014. Safeguards instrumentation software must be sustained in a changing environment to ensure existing instruments can continue to perform as designed, with improved security. The approaches to the development and maintenance of instrument software used in the past may not be the best model for the future and, therefore, the organizers’ goal was to investigate these past approaches and to determine an optimal path forward. The purpose of this report is to provide input for the DOE NNSA Office of International Nuclear Safeguards (NA-241) and other stakeholders that can be utilized when making decisions related to the development and maintenance of software used in the implementation of international nuclear safeguards. For example, this guidance can be used when determining whether to fund the development, upgrade, or replacement of a particular software product. The report identifies the challenges related to sustaining software, and makes recommendations for addressing these challenges, supported by summaries and detailed notes from the workshop discussions. In addition, the authors provide a set of recommendations for institutionalizing software sustainability practices in the safeguards community. The term “software sustainability” was defined for this workshop as ensuring that safeguards instrument software and algorithm functionality can be maintained efficiently throughout the instrument lifecycle, without interruption and providing the ability to continue to improve that software as needs arise.
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
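The plug-in principle described in this abstract, adding new waveform-processing routines without touching the core display platform, reduces to a registration scheme plus a processing chain. The sketch below illustrates that pattern with invented routine names; it is not taken from the Geotool code.

```python
# Minimal plug-in registry sketch: the host platform discovers processing
# routines at run time instead of hard-coding them. Plug-in names are invented.
PLUGINS = {}

def register_plugin(name):
    """Decorator registering a waveform-processing routine under a name."""
    def wrap(func):
        PLUGINS[name] = func
        return func
    return wrap

@register_plugin("demean")
def demean(samples):
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]

@register_plugin("rectify")
def rectify(samples):
    return [abs(s) for s in samples]

def apply_chain(samples, chain):
    """The host applies whatever chain of registered plug-ins the analyst picks."""
    for name in chain:
        samples = PLUGINS[name](samples)
    return samples

waveform = [1.0, -2.0, 3.0, -4.0]
print(apply_chain(waveform, ["demean", "rectify"]))
```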
RiskScape: a new tool for comparing risk from natural hazards (Invited)
NASA Astrophysics Data System (ADS)
Stirling, M. W.; King, A.
2010-12-01
The Regional RiskScape is New Zealand’s joint venture between GNS Science & NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard-based risk and impact analysis. It has basic GIS functionality, in that it has Import/Export functions to use with GIS software. Five natural hazards have been implemented in RiskScape to date: Flood (river), earthquake, volcano (ash), tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example, damage and replacement costs, casualties, economic losses, disruption, and number of people affected. It therefore can be used to assist with risk management, land use planning, building codes and design, risk identification, prioritization of risk-reduction/mitigation, determination of “best use” risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions have been used to develop and trial RiskScape in New Zealand, and each region is exposed to a different mix of natural hazards. Future (phase II) development of RiskScape will include the following hazards: Landslides (both rainfall and earthquake triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape developments have thus far focussed on scenario-based risk, future developments will advance the software into providing probabilistic-based solutions.
From Excavations to Web: a GIS for Archaeology
NASA Astrophysics Data System (ADS)
D'Urso, M. G.; Corsi, E.; Nemeti, S.; Germani, M.
2017-05-01
In recent years, the study and the protection of Cultural Heritage have undergone a revolution in their search tools and reference disciplines. The technological approach to the collection, organization and publication of archaeological data using GIS software has completely changed the essence of the traditional methods of investigation, paving the way to the development of several application areas, up to Cultural Resource Management. A relatively recent sector of development for archaeological GIS is dedicated to intra-site analyses aimed at recording, processing and displaying information obtained during excavations. The case study of the archaeological site located to the south-east of the San Pietro Vetere plateau in Aquino, in southern Lazio, illustrates a procedure describing the complete digital workflow of an intra-site analysis of an archaeological dig. The GIS project implementation and its publication on the web, thanks to several software packages, particularly the FOSS (Free Open Source Software) Quantum GIS, are an opportunity to reflect on the strengths and the critical aspects of this particular application of GIS technology. For future developments in research, the identification of a digital protocol for processing excavations (from acquisition and cataloguing up to data insertion) is of fundamental importance, also in view of a possible future Open Project on medieval Aquino.
Fostering soft skills in project-oriented learning within an agile atmosphere
NASA Astrophysics Data System (ADS)
Chassidim, Hadas; Almog, Dani; Mark, Shlomo
2018-07-01
The project-oriented and Agile approaches have motivated a new generation of software engineers. Within the academic curriculum, the issue of whether students are being sufficiently prepared for the future has been raised. The objective of this work is to present the project-oriented environment as an influential factor that the software engineering profession requires, using the second-year course 'Software Development and Management in Agile Approach' as a case study. This course combines academic topics, self-learning and the implementation of soft skills, the call for creativity, and the recognition of updated technologies and dynamic circumstances. The results of a survey that evaluated the perceived value of the course showed that the highest contribution of our environment was in the effectiveness of the teamwork and the overall development process of the project.
Power, Avionics and Software - Phase 1.0:. [Subsystem Integration Test Report
NASA Technical Reports Server (NTRS)
Ivancic, William D.; Sands, Obed S.; Bakula, Casey J.; Oldham, Daniel R.; Wright, Ted; Bradish, Martin A.; Klebau, Joseph M.
2014-01-01
This report describes Power, Avionics and Software (PAS) 1.0 subsystem integration testing and test results that occurred in August and September of 2013. This report covers the capabilities of each PAS assembly to meet integration test objectives for non-safety critical, non-flight, non-human-rated hardware and software development. This test report is the outcome of the first integration of the PAS subsystem and is meant to provide data for subsequent designs, development and testing of the future PAS subsystems. The two main objectives were to assess the ability of the PAS assemblies to exchange messages and to perform audio testing of both inbound and outbound channels. This report describes each test performed, defines the test, the data, and provides conclusions and recommendations.
Development of land based radar polarimeter processor system
NASA Technical Reports Server (NTRS)
Kronke, C. W.; Blanchard, A. J.
1983-01-01
The processing subsystem of a land based radar polarimeter was designed and constructed. This subsystem is labeled the remote data acquisition and distribution system (RDADS). The radar polarimeter, an experimental remote sensor, incorporates the RDADS to control all operations of the sensor. The RDADS uses industrial standard components including an 8-bit microprocessor based single board computer, analog input/output boards, a dynamic random access memory board, and power supplies. A high-speed digital electronics board was specially designed and constructed to control range-gating for the radar. A complete system of software programs was developed to operate the RDADS. The software uses a powerful real-time, multi-tasking executive package as an operating system. The hardware and software used in the RDADS are detailed. Future system improvements are recommended.
Future Field Programmable Gate Array (FPGA) Design Methodologies and Tool Flows
2008-07-01
a) that the results are accepted by users, vendors, … and (b) that they can quantitatively explain HPC rules of thumb such as: “OpenMP is easier...in productivity that were demonstrated by traditional software systems. Using advances in software productivity as a guide, we have identified three...of this study we developed a productivity model to guide our investigation (14). Models have limitations and the model we propose is no exception
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
A Java application for tissue section image analysis.
Kamalov, R; Guillaud, M; Haskins, D; Harrison, A; Kemp, R; Chiu, D; Follen, M; MacAulay, C
2005-02-01
The medical industry has taken advantage of Java and Java technologies over the past few years, in large part due to the language's platform-independence and object-oriented structure. As such, Java provides powerful and effective tools for developing tissue section analysis software. The background and execution of this development are discussed in this publication. Object-oriented structure allows for the creation of "Slide", "Unit", and "Cell" objects to simulate the corresponding real-world objects. Different functions may then be created to perform various tasks on these objects, thus facilitating the development of the software package as a whole. At the current time, substantial parts of the initially planned functionality have been implemented. Getafics 1.0 is fully operational and currently supports a variety of research projects; however, there are certain features of the software that currently introduce unnecessary complexity and inefficiency. In the future, we hope to include features that obviate these problems.
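The object decomposition described here, Slide, Unit, and Cell objects mirroring the physical specimen, can be sketched briefly. The example below uses Python for consistency with the other sketches in this collection and invents all attribute names; it does not reflect the actual Getafics class design, which is in Java.

```python
# Hedged sketch of a Slide -> Unit -> Cell object hierarchy for tissue-section
# analysis. All attribute names are invented; the real design is in Java.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Cell:
    area_um2: float
    dna_index: float

@dataclass
class Unit:                       # e.g. one region of interest on the section
    cells: list = field(default_factory=list)

    def mean_dna_index(self):
        return mean(c.dna_index for c in self.cells)

@dataclass
class Slide:
    case_id: str
    units: list = field(default_factory=list)

    def summary(self):
        return {"case": self.case_id,
                "units": len(self.units),
                "cells": sum(len(u.cells) for u in self.units)}

slide = Slide("case-001", units=[
    Unit(cells=[Cell(52.0, 1.0), Cell(48.5, 1.1)]),
    Unit(cells=[Cell(60.2, 2.1)]),
])
print(slide.summary(), round(slide.units[0].mean_dna_index(), 2))
```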
SCOS 2: An object oriented software development approach
NASA Technical Reports Server (NTRS)
Symonds, Martin; Lynenskjold, Steen; Mueller, Christian
1994-01-01
The Spacecraft Control and Operations System 2 (SCOS 2), is intended to provide the generic mission control system infrastructure for future ESA missions. It represents a bold step forward in order to take advantage of state-of-the-art technology and current practices in the area of software engineering. Key features include: (1) use of object oriented analysis and design techniques; (2) use of UNIX, C++ and a distributed architecture as the enabling implementation technology; (3) goal of re-use for development, maintenance and mission specific software implementation; and (4) introduction of the concept of a spacecraft control model. This paper touches upon some of the traditional beliefs surrounding Object Oriented development and describes their relevance to SCOS 2. It gives rationale for why particular approaches were adopted and others not, and describes the impact of these decisions. The development approach followed is discussed, highlighting the evolutionary nature of the overall process and the iterative nature of the various tasks carried out. The emphasis of this paper is on the process of the development with the following being covered: (1) the three phases of the SCOS 2 project - prototyping & analysis, design & implementation and configuration / delivery of mission specific systems; (2) the close cooperation and continual interaction with the users during the development; (3) the management approach - the split between client staff, industry and some of the required project management activities; (4) the lifecycle adopted being an enhancement of the ESA PSS-05 standard with SCOS 2 specific activities and approaches defined; and (5) an examination of some of the difficulties encountered and the solutions adopted. Finally, the lessons learned from the SCOS 2 experience are highlighted, identifying those issues to be used as feedback into future developments of this nature. This paper does not intend to describe the finished product and its operation, but focusing on the journey to arrive there, concentrating therefore on the process and not the products of the SCOS 2 software development.
Review of software tools for design and analysis of large scale MRM proteomic datasets.
Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi
2013-06-15
Selective or Multiple Reaction Monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus, this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
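In scheduled MRM acquisition, each transition is only measured inside a retention-time window around its peptide's expected elution, so a practical design check is how many windows overlap at any instant. The sketch below computes that peak overlap for an invented assay; it is a generic illustration, not part of any of the reviewed packages.

```python
# Toy check of a scheduled MRM assay: given each transition's expected
# retention time (minutes) and an acquisition window around it, find the
# maximum number of transitions that are active at the same instant.
# Transition names, retention times, and the window width are invented.
def max_concurrent(transitions, window_min=2.0):
    events = []
    for _name, rt in transitions:
        events.append((rt - window_min / 2, +1))   # window opens
        events.append((rt + window_min / 2, -1))   # window closes
    events.sort()                                  # at ties, closes sort before opens
    load = peak = 0
    for _, delta in events:
        load += delta
        peak = max(peak, load)
    return peak

assay = [("PEPTIDEA.y7", 12.1), ("PEPTIDEA.y5", 12.1),
         ("PEPTIDEB.y6", 12.8), ("PEPTIDEC.y4", 18.3)]
print(max_concurrent(assay))   # 3 transitions overlap around 12-13 minutes
```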
NASA Technical Reports Server (NTRS)
Liaw, Morris; Evesson, Donna
1988-01-01
Software Engineering and Ada Database (SEAD) was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities which are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce duplication of effort while improving quality in the development of future software systems. SEAD data is organized into five major areas: information regarding education and training resources which are relevant to the life cycle of Ada-based software engineering projects such as those in the Space Station program; research publications relevant to NASA projects such as the Space Station Program and conferences relating to Ada technology; the latest progress reports on Ada projects completed or in progress both within NASA and throughout the free world; Ada compilers and other commercial products that support Ada software development; and reusable Ada components generated both within NASA and from elsewhere in the free world. This classified listing of reusable components shall include descriptions of tools, libraries, and other components of interest to NASA. Sources for the data include technical newsletters and periodicals, conference proceedings, the Ada Information Clearinghouse, product vendors, and project sponsors and contractors.
Bigger Data, Collaborative Tools and the Future of Predictive Drug Discovery
Clark, Alex M.; Swamidass, S. Joshua; Litterman, Nadia; Williams, Antony J.
2014-01-01
Over the past decade we have seen a growth in the provision of chemistry data and cheminformatics tools as either free websites or software as a service (SaaS) commercial offerings. These have transformed how we find molecule-related data and use such tools in our research. There have also been efforts to improve collaboration between researchers, either openly or through secure transactions using commercial tools. A major challenge in the future will be how such databases and software approaches handle larger amounts of data as they accumulate from high-throughput screening, while enabling the user to draw insights, make predictions, and move projects forward. We discuss how information from some drug discovery datasets can be made more accessible and how privacy of data should not overwhelm the desire to share it at an appropriate time with collaborators. We also discuss additional software tools that could be made available and provide our thoughts on the future of predictive drug discovery in this age of big data. We use examples from our own research on neglected diseases, collaborations, mobile apps, and algorithm development to illustrate these ideas. PMID:24943138
Future GOES-R global ground receivers
NASA Astrophysics Data System (ADS)
Dafesh, P. A.; Grayver, E.
2006-08-01
The Aerospace Corporation has developed an end-to-end testbed to demonstrate a wide range of modern modulation and coding alternatives for future broadcast by the GOES-R Global Rebroadcast (GRB) system. In particular, this paper describes the development of a compact, low-cost, flexible GRB digital receiver that was designed, implemented, fabricated, and tested as part of this effort. This receiver demonstrates a 10-fold increase in data rate compared to the rate achievable by the current GOES generation, without a major impact on either cost or size. The digital receiver is integrated on a single PCI card with an FPGA device and analog-to-digital converters. It supports a wide range of modulations (including 8-PSK and 16-QAM) and turbo coding. With appropriate FPGA firmware and software changes, it can also be configured to receive the current (legacy) GOES signals. The receiver has been validated by sending large image files over a high-fidelity satellite channel emulator, including a space-qualified power amplifier and a white noise source. The receiver is a key component of a future GOES-R weather receiver system (also called a user terminal) that includes the antenna, low-noise amplifier, downconverter, filters, digital receiver, and receiver system software. This work describes this receiver proof of concept and its application to providing a very credible estimate of the impact of using modern modulation and coding techniques in the future GOES-R system.
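As a rough illustration of the 'modern modulation' options evaluated here (and not of the receiver's FPGA implementation), the following Python/NumPy sketch maps 3-bit groups onto an 8-PSK constellation, passes the symbols through a toy additive-noise channel, and detects them by nearest phase; all values are illustrative.

# Minimal 8-PSK mapping/detection illustration (not the GOES-R receiver design):
# each group of 3 bits selects one of 8 equally spaced phases on the unit circle.
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=3 * 1000)               # random payload bits
symbols = bits.reshape(-1, 3) @ np.array([4, 2, 1])     # 3 bits -> integer 0..7
tx = np.exp(1j * 2 * np.pi * symbols / 8)                # transmitted constellation points

noise = 0.05 * (rng.normal(size=tx.size) + 1j * rng.normal(size=tx.size))
rx = tx + noise                                          # toy additive-noise channel

detected = np.round(np.angle(rx) / (2 * np.pi / 8)).astype(int) % 8
print("symbol error rate:", np.mean(detected != symbols))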
ILLiad: Customer-Focused Interlibrary Loan Automation.
ERIC Educational Resources Information Center
Kriz, Harry M.; Glover, M. Jason; Ford, Kevin C.
1998-01-01
Describes ILLiad (Interlibrary Loan Internet Accessible Database), software that examines the current state of interlibrary loan borrowing requests at Virginia Polytechnic Institute and State University. Topics include system requirements, user procedures, staff procedures, copyright clearance, OCLC, and future developments. (LRW)
Global Assimilative Ionospheric Model
2002-09-30
CHAMP) and Satelite de Aplicaciones Cientificas-C (SAC-C), as well as from the Ionospheric Occultation Experiment (IOX) instrument developed by... strong interest in future collaborative research. TRANSITIONS: Our project is still in its initial stage. No software has been transitioned to...
Recognition of handprinted characters for automated cartography: A progress report
NASA Technical Reports Server (NTRS)
Lybanon, M.; Brown, R. M.; Gronmeyer, L. K.
1980-01-01
A research program for developing handwritten character recognition techniques is reported. The generation of cartographic/hydrographic manuscripts is overviewed. The performance of hardware/software systems is discussed, along with future research problem areas and planned approaches.
Analysis of past NBI ratings to determine future bridge preservation needs.
DOT National Transportation Integrated Search
2004-01-01
Bridge Management System (BMS) needs an analytical tool that can predict bridge element deterioration and answer questions related to bridge preservation. PONTIS, a comprehensive BMS software, was developed to serve this purpose. However, the intensi...
Launch Control Systems: Moving Towards a Scalable, Universal Platform for Future Space Endeavors
NASA Technical Reports Server (NTRS)
Sun, Jonathan
2011-01-01
The redirection of NASA away from the Constellation program calls for heavy reliance on commercial launch vehicles for the near future in order to reduce costs and shift focus to research and long-term space exploration. To support them, NASA will renovate Kennedy Space Center's launch facilities and make them available for commercial use. However, NASA's current launch software is deeply tied to the now-retired Space Shuttle and is not readily compatible with other vehicles. Therefore, a new Launch Control System must be designed that is adaptable to a variety of different launch protocols and vehicles. This paper presents some of the features and advantages of the new system, both from the perspective of the software developers and from that of the launch engineers.
CONTACT: An Air Force technical report on military satellite control technology
NASA Astrophysics Data System (ADS)
Weakley, Christopher K.
1993-07-01
This technical report focuses on Military Satellite Control Technologies and their application to the Air Force Satellite Control Network (AFSCN). This report is a compilation of articles that provide an overview of the AFSCN and the Advanced Technology Program, and discusses relevant technical issues and developments applicable to the AFSCN. Among the topics covered are articles on Future Technology Projections; Future AFSCN Topologies; Modeling of the AFSCN; Wide Area Communications Technology Evolution; Automating AFSCN Resource Scheduling; Health & Status Monitoring at Remote Tracking Stations; Software Metrics and Tools for Measuring AFSCN Software Performance; Human-Computer Interface Working Group; Trusted Systems Workshop; and the University Technical Interaction Program. In addition, Key Technology Area points of contact are listed in the report.
Student project of optical system analysis API-library development
NASA Astrophysics Data System (ADS)
Ivanova, Tatiana; Zhukova, Tatiana; Dantcaranov, Ruslan; Romanova, Maria; Zhadin, Alexander; Ivanov, Vyacheslav; Kalinkina, Olga
2017-08-01
In this paper we present an API library developed by students of the Applied and Computer Optics Department (ITMO University) for optical system design. The library performs paraxial and real ray tracing, calculates third-order (Seidel) aberrations and real-ray aberrations of axial and off-axis beams (wave, lateral, longitudinal, coma, distortion, etc.), and finally approximates the wave aberration by Zernike polynomials. The real aperture can be determined by detecting real-ray tracing failures on each surface. So far we assume the optical system is centered, with spherical or second-order aspherical surfaces. Optical glasses can be specified either directly by refractive index or by dispersion coefficients. The library can be used for education or research purposes in the area of optical system design. It provides ready-to-use software functions for optical system simulation and analysis that developers can simply plug into their own software for different purposes, for example specific synthesis tasks or the investigation of new optimization modes. In the paper we present an example of using the library to develop cemented-doublet synthesis software based on Slusarev's methodology. The library is also used in an optical system optimization course for in-depth study of the optimization model and its application to optical system design. Developing such software is an excellent experience for students and helps them understand optical image modeling and quality analysis. The development is organized as a joint student group project, structured like a real research and development project: each student has a role in the project and then uses the full library functionality in his or her master's or bachelor's thesis. Working in such a group gives students useful experience and the opportunity to work as research and development engineers of scientific software in the future.
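To give a flavor of the paraxial ray tracing such a library performs, a minimal Python example using 2x2 ray-transfer matrices follows. This is not the students' API; the doublet radii, thicknesses, and refractive indices below are invented for illustration only.

# Paraxial ray trace through a hypothetical cemented doublet using 2x2 matrices
# acting on the ray vector [height y, angle u]; surface data are invented.
import numpy as np

def refraction(n1, n2, R):
    # Paraxial refraction at a spherical surface of radius R: n2*u' = n1*u - y*(n2 - n1)/R.
    return np.array([[1.0, 0.0],
                     [-(n2 - n1) / (n2 * R), n1 / n2]])

def transfer(d):
    # Free propagation over an axial thickness d within one medium.
    return np.array([[1.0, d],
                     [0.0, 1.0]])

M = (refraction(1.6200, 1.0, -140.0) @ transfer(2.0) @    # flint element, exit surface
     refraction(1.5168, 1.6200, -40.0) @ transfer(6.0) @  # cemented interface
     refraction(1.0, 1.5168, 60.0))                        # crown element, first surface

y0, u0 = 10.0, 0.0                        # marginal ray from an object at infinity
y1, u1 = M @ np.array([y0, u0])
print("effective focal length [mm]:", -y0 / u1)
print("back focal distance   [mm]:", -y1 / u1)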
Process Improvement in a Radically Changing Organization
NASA Technical Reports Server (NTRS)
Varga, Denise M.; Wilson, Barbara M.
2007-01-01
This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.
Sociotechnical Challenges of Developing an Interoperable Personal Health Record
Gaskin, G.L.; Longhurst, C.A.; Slayton, R.; Das, A.K.
2011-01-01
Objectives To analyze sociotechnical issues involved in the process of developing an interoperable commercial Personal Health Record (PHR) in a hospital setting, and to create guidelines for future PHR implementations. Methods This qualitative study utilized observational research and semi-structured interviews with 8 members of the hospital team, as gathered over a 28 week period of developing and adapting a vendor-based PHR at Lucile Packard Children’s Hospital at Stanford University. A grounded theory approach was utilized to code and analyze over 100 pages of typewritten field notes and interview transcripts. This grounded analysis allowed themes to surface during the data collection process which were subsequently explored in greater detail in the observations and interviews. Results Four major themes emerged: (1) Multidisciplinary teamwork helped team members identify crucial features of the PHR; (2) Divergent goals for the PHR existed even within the hospital team; (3) Differing organizational conceptions of the end-user between the hospital and software company differentially shaped expectations for the final product; (4) Difficulties with coordination and accountability between the hospital and software company caused major delays and expenses and strained the relationship between hospital and software vendor. Conclusions Though commercial interoperable PHRs have great potential to improve healthcare, the process of designing and developing such systems is an inherently sociotechnical process with many complex issues and barriers. This paper offers recommendations based on the lessons learned to guide future development of such PHRs. PMID:22003373
E-Control: First Public Release of Remote Control Software for VLBI Telescopes
NASA Technical Reports Server (NTRS)
Neidhardt, Alexander; Ettl, Martin; Rottmann, Helge; Ploetz, Christian; Muehlbauer, Matthias; Hase, Hayo; Alef, Walter; Sobarzo, Sergio; Herrera, Cristian; Himwich, Ed
2010-01-01
Automating and remotely controlling observations are important for future operations in a Global Geodetic Observing System (GGOS). At the Geodetic Observatory Wettzell, in cooperation with the Max-Planck-Institute for Radio Astronomy in Bonn, a software extension to the existing NASA Field System has been developed for remote control. It uses the principle of a remotely accessible, autonomous process cell as a server extension for the Field System. The communication is realized for low transfer rates using Remote Procedure Calls (RPC). It uses generative programming with the interface software generator idl2rpc.pl developed at Wettzell. The user interacts with this system over a modern graphical user interface created with wxWidgets. For security reasons the communication is automatically tunneled through a Secure Shell (SSH) session to the telescope. There have already been successful test observations with the telescopes at O'Higgins, Concepcion, and Wettzell. At Wettzell the software is already used routinely for weekend observations. Therefore the first public release of the software is now available, which will also be useful for other telescopes.
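The remote-control pattern described above, a remotely accessible process cell exposing telescope actions to a GUI through RPC over an SSH tunnel, can be sketched conceptually as follows. The real system uses ONC RPC stubs generated by idl2rpc.pl against the NASA Field System; this Python fragment only illustrates the same client/server idea with the standard xmlrpc module, and the command names are hypothetical.

# Conceptual server-side sketch of a remotely accessible process cell; the method
# names are hypothetical and do not correspond to actual Field System commands.
from xmlrpc.server import SimpleXMLRPCServer

class TelescopeProcessCell:
    def status(self):
        return {"antenna": "idle", "schedule": "none loaded"}

    def load_schedule(self, name):
        # The real extension would hand the schedule to the Field System here.
        return "schedule %s accepted" % name

server = SimpleXMLRPCServer(("localhost", 8765), allow_none=True)
server.register_instance(TelescopeProcessCell())
server.serve_forever()   # a remote GUI would reach this port through an SSH tunnel

An operator-side client would forward the port (for example with ssh -L 8765:localhost:8765) and call the same methods through xmlrpc.client.ServerProxy("http://localhost:8765").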
Software Engineering Laboratory (SEL) cleanroom process model
NASA Technical Reports Server (NTRS)
Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon
1991-01-01
The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.
Software Development Processes Applied to Computational Icing Simulation
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.
1999-01-01
The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.
Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.
2001-01-01
The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.
[Current status and future of telemonitoring: Scenarios for telemedical care in 2025].
Zippel-Schultz, Bettina; Schultz, Carsten; Helms, Thomas M
2017-09-01
Telemonitoring is an already realized implementation of digital transformation in the healthcare system. It has the potential to support and secure a sustainable and comprehensive provision of healthcare for a rising number of chronically ill patients, e.g. patients with chronic heart failure. Remote regions in particular can profit from the benefits of telemonitoring; however, so far telemonitoring services have not become truly established in the German healthcare market. Together with experts from politics, science, and practice, a scenario analysis "Health Care System 2025 - A Place for Telemonitoring?" was carried out with the aim of examining the future development of the healthcare market and drawing conclusions for providers of telemonitoring services or devices. The scenario analysis comprised two workshops and an expert survey and was supported by scenario software. The current drivers of and barriers to the diffusion of telemonitoring were identified, and the most relevant factors that influence the future development of the healthcare market were discussed. Based on those influencing factors, three different scenarios were determined: (1) administrating rather than shaping, (2) safely into the future, and (3) interconnected and digital world. In the subsequent consequence analysis, activities were defined which describe the necessary infrastructure, software instruments, organizational structures, and provision of services, and which prepare telemonitoring solutions for the future.
Three Object-Oriented Enhancements for EPICS
NASA Astrophysics Data System (ADS)
Osberg, E. A.; Dohan, D. A.; Richter, R.; Biggs, R.; Chillara, K.; Wade, D.; Bossom, J.
1994-12-01
In line with our group's intention of producing software using, where possible, Object-Oriented methodologies and techniques in the development of RF control systems, we have undertaken three projects to enhance the EPICS software environment. Two of the projects involve interfaces to EPICS Channel Access from Object-Oriented languages. The third is an enhancement to the EPICS State Notation Language to better support the Shlaer-Mellor Object-Oriented Analysis and Design Methodology. This paper discusses the motivation, approaches, results, and future directions of these three projects.
Intelligent Systems Technologies for Ops
NASA Technical Reports Server (NTRS)
Smith, Ernest E.; Korsmeyer, David J.
2012-01-01
As NASA supports International Space Station assembly complete operations through 2020 (or later) and prepares for future human exploration programs, there is additional emphasis in the manned spaceflight program on finding more efficient and effective ways of providing ground-based mission support. Since 2006 this search for improvement has led to a significant cross-fertilization between the NASA advanced software development community and the manned spaceflight operations community. A variety of mission operations systems and tools have been developed over the past decades as NASA has operated the Mars robotic missions, the Space Shuttle, and the International Space Station. NASA Ames Research Center has been developing and applying its advanced intelligent systems research to mission operations tools for unmanned Mars mission operations since 2001 and for manned operations with NASA Johnson Space Center since 2006. In particular, the fundamental advanced software development work under the Exploration Technology Program, and the experience and capabilities developed for mission operations systems for the Mars surface missions (Spirit/Opportunity, Phoenix Lander, and MSL), have enhanced the development and application of advanced mission operation systems for the International Space Station and future spacecraft. This paper provides an update on the status of the development and deployment of a variety of intelligent systems technologies adopted for manned mission operations, and some discussion of the planned work for Autonomous Mission Operations in future human exploration. We discuss several specific projects between the Ames Research Center and the Johnson Space Center's Mission Operations Directorate, and how these technologies and projects are enhancing mission operations support for the International Space Station and supporting the current Autonomous Mission Operations Project for mission operation support of the future human exploration programs.
An Information Highway to the Future.
ERIC Educational Resources Information Center
Duderstadt, James J.
1992-01-01
Discussion of the evolution of a postindustrial, knowledge-based society addresses the importance of intellectual power and information technology as strategic resources, communications technology, development of the National Research and Education Network (NREN), the need for creative software applications, implications of advanced information…
Continuous integration for concurrent MOOSE framework and application development on GitHub
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...
2015-11-20
For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.
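As one concrete illustration of the GitHub API integration mentioned above (this is not the MOOSE team's actual CI code; repository name, commit SHA, and token are placeholders), an automated regression-test job could report its outcome back to GitHub as a commit status:

# Illustrative only: post a regression-test result as a GitHub commit status.
# Owner, repository, SHA, and token below are placeholders, not MOOSE project values.
import json
import urllib.request

def set_commit_status(owner, repo, sha, state, description, token):
    url = "https://api.github.com/repos/%s/%s/statuses/%s" % (owner, repo, sha)
    body = json.dumps({"state": state,              # "pending", "success", or "failure"
                       "context": "regression-tests",
                       "description": description}).encode()
    request = urllib.request.Request(url, data=body, method="POST", headers={
        "Authorization": "token " + token,
        "Accept": "application/vnd.github+json",
    })
    with urllib.request.urlopen(request) as response:
        return response.status

# Example call with placeholder values:
# set_commit_status("someorg", "somerepo", "abc123", "success",
#                   "1342 tests passed", token="<personal access token>")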
The maturing of the quality improvement paradigm in the SEL
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1993-01-01
The Software Engineering Laboratory uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and product. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. The quality improvement paradigm, as it is currently defined, can be broken up into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects and save it in an experience base to be reused on future projects.
NASA Astrophysics Data System (ADS)
Valencia, J.; Muñoz-Nieto, A.; Rodriguez-Gonzalvez, P.
2015-02-01
3D virtual modeling, visualization, dissemination, and management of urban areas is one of the most exciting challenges that geomatics must face in the coming years. This paper aims to review, compare, and analyze the new technologies, policies, and software tools being developed to manage urban 3D information. It is assumed that the third dimension increases the quality of the model provided, allowing new approaches to urban planning and to the conservation and management of architectural and archaeological areas. Although displaying 3D urban environments is a problem that is largely solved today, there are still challenges for geomatics in the coming future. Displaying georeferenced linked information can be considered the first challenge. Another challenge is to improve the technical requirements if this georeferenced information must be shown in real time. Are there software tools ready for this challenge? Are they useful for providing the services required in smart cities? Throughout this paper, many practical examples that require 3D georeferenced information and linked data are shown. Computer advances related to 3D spatial databases, and the software being developed to turn the rendered virtual environment into an environment enriched with linked information, are also analyzed. Finally, the different standards that the Open Geospatial Consortium has adopted and developed for three-dimensional geographic information are reviewed, with particular emphasis on KML, LandXML, CityGML, and the new IndoorGML.
Looking toward the Future: A Case Study of Open Source Software in the Humanities
ERIC Educational Resources Information Center
Quamen, Harvey
2006-01-01
In this article Harvey Quamen examines how the philosophy of open source software might be of particular benefit to humanities scholars in the near future--particularly for academic journals with limited financial resources. To this end he provides a case study in which he describes his use of open source technology (MySQL database software and…
SIMA: Python software for analysis of dynamic fluorescence imaging data.
Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila
2014-01-01
Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
Kushniruk, Andre; Borycki, Elizabeth; Kuo, Mu-Hsing; Parapini, Eric; Wang, Shu Lin; Ho, Kendall
2014-01-01
Electronic health records and related technologies are being increasingly deployed throughout the world. It is expected that upon graduation health professionals will be able to use these technologies in effective and efficient ways. However, educating health professional students about such technologies has lagged behind. There is a need for software that gives medical, nursing, and health informatics students access to such systems so that they can learn how they work and how to use them effectively. Furthermore, educational electronic health record software should provide a range of functions, including allowing instructors to build patient cases. Such software should also allow for simulation of the course of a patient's stay and the ability to let instructors monitor student use of electronic health records. In this paper we describe our work in developing the requirements for an educational electronic health record to support education about this important technology. We also describe a prototype system being developed based on the requirements gathered.
CellProfiler and KNIME: open source tools for high content screening.
Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc
2013-01-01
High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two of the limiting factors in the establishment of HCS in academia are flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.
Software used with the flux mapper at the solar parabolic dish test site
NASA Technical Reports Server (NTRS)
Miyazono, C.
1984-01-01
Software for data archiving and data display was developed on a Digital Equipment Corporation (DEC) PDP-11/34A minicomputer for use with the JPL-designed flux mapper. The flux mapper is a two-dimensional, high radiant energy scanning device designed to measure the radiant flux energies expected at the focal point of solar parabolic dish concentrators. Interfacing to the DEC equipment was accomplished by standard RS-232C serial lines. The design of the software was dictated by design constraints of the flux-mapper controller. Early attempts at data acquisition from the flux-mapper controller were not without difficulty. Time and personnel limitations resulted in an alternative method of data recording at the test site, with subsequent analysis accomplished at a data evaluation location at some later time. Software for plotting was also written to better visualize the flux patterns. Recommendations for future alternative development are discussed. A listing of the programs used in the analysis is included in an appendix.
The Software Correlator of the Chinese VLBI Network
NASA Technical Reports Server (NTRS)
Zheng, Weimin; Quan, Ying; Shu, Fengchun; Chen, Zhong; Chen, Shanshan; Wang, Weihua; Wang, Guangli
2010-01-01
The software correlator of the Chinese VLBI Network (CVN) has played an irreplaceable role in the CVN routine data processing, e.g., in the Chinese lunar exploration project. This correlator will be upgraded to process geodetic and astronomical observation data. In the future, with several new stations joining the network, CVN will carry out crustal movement observations, quick UT1 measurements, astrophysical observations, and deep space exploration activities. For the geodetic or astronomical observations, we need a wide-band 10-station correlator. For spacecraft tracking, a real-time and highly reliable correlator is essential. To meet the scientific and navigation requirements of CVN, two parallel software correlators in multiprocessor environments are under development. A high-speed, 10-station prototype correlator using a mixed Pthreads and MPI (Message Passing Interface) parallel algorithm on a computer cluster platform is being developed. Another real-time software correlator for spacecraft tracking adopts thread-parallel technology, and it runs on SMP (Symmetric Multiple Processor) servers. Both correlators have the characteristics of flexible structure and scalability.
Structural Analysis and Design Software
NASA Technical Reports Server (NTRS)
1997-01-01
Collier Research and Development Corporation received a one-of-a-kind computer code for designing exotic hypersonic aircraft called ST-SIZE in the first-ever Langley Research Center software copyright license agreement. Collier transformed the NASA computer code into a commercial software package called HyperSizer, which integrates with other private-sector Finite Element Modeling and Finite Element Analysis structural analysis programs. ST-SIZE was chiefly conceived as a means to improve and speed the structural design of a future aerospace plane for Langley's Hypersonic Vehicles Office. Including the NASA computer code in HyperSizer has enabled the company to also apply the software to applications other than aerospace, including improved design and construction for offices, marine structures, cargo containers, commercial and military aircraft, rail cars, and a host of everyday consumer products.
ACRF Instrumentation Status: New, Current, and Future - January 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
AS Koontz; S Choudhury; BD Ermold
2008-01-31
The purpose of this report is to provide the status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement Program Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning in “xxx” indicate cases where ingests run at multiple ACRF sites, which results in a datastream for each location.
Faster Aerodynamic Simulation With Cart3D
NASA Technical Reports Server (NTRS)
2003-01-01
A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.
Collaboration Between NASA Centers of Excellence on Autonomous System Software Development
NASA Technical Reports Server (NTRS)
Goodrich, Charles H.; Larson, William E.; Delgado, H. (Technical Monitor)
2001-01-01
Software for space systems flight operations has its roots in the early days of the space program, when computer systems were incapable of supporting highly complex and flexible control logic. Control systems relied on fast data acquisition and supervisory control from a roomful of systems engineers on the ground. Even though computer hardware and software have become many orders of magnitude more capable, space systems have largely adhered to this original paradigm. In an effort to break this mold, Kennedy Space Center (KSC) has invested in the development of model-based diagnosis and control applications for ten years, gaining broad experience in both ground and spacecraft systems and software. KSC has now partnered with Ames Research Center (ARC), NASA's Center of Excellence in Information Technology, to create a new paradigm for the control of dynamic space systems. ARC has developed model-based diagnosis and intelligent planning software that enables spacecraft to handle most routine problems automatically and allocate resources in a flexible way to realize mission objectives. ARC demonstrated the utility of onboard diagnosis and planning with an experiment aboard Deep Space 1 in 1999. This paper highlights the software control system collaboration between KSC and ARC. KSC has developed a Mars In-situ Resource Utilization testbed based on the Reverse Water Gas Shift (RWGS) reaction. This plant, built in KSC's Applied Chemistry Laboratory, is capable of producing the large amount of oxygen that would be needed to support a human Mars mission. KSC and ARC are cooperating to develop an autonomous, fault-tolerant control system for RWGS to meet the need for autonomy on deep space missions. The paper also describes how the new system software paradigm will be applied to Vehicle Health Monitoring, tested on the new X vehicles, and integrated into future launch processing systems.
The Strategic Organization of Skill
NASA Technical Reports Server (NTRS)
Roberts, Ralph
1996-01-01
Eye-movement software was developed in addition to several studies that focused on expert-novice differences in the acquisition and organization of skill. These studies focused on how increasingly complex strategies utilize and incorporate visual look-ahead to calibrate action. Software for collecting, calibrating, and scoring eye-movements was refined and updated. Some new algorithms were developed for analyzing corneal-reflection eye movement data that detect the location of saccadic eye movements in space and time. Two full-scale studies were carried out which examined how experts use foveal and peripheral vision to acquire information about upcoming environmental circumstances in order to plan future action(s) accordingly.
NASA Astrophysics Data System (ADS)
Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei
2017-08-01
Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon - Spatial Light Modulator (LCoS-SLM) which includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, also called the Education Kit (EDK), we provide a training platform which includes a series of optical theory lessons and experiments for university students. This EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGH) that generate basic diffraction images or holographic images, but also provides simulation software to verify the experimental results simultaneously. However, we believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students to study fundamental optics, wave optics, and Fourier optics more easily. Based on this fundamental knowledge, they can develop their own skills and create new innovations in optoelectronic applications in the future.
Reengineering legacy software to object-oriented systems
NASA Technical Reports Server (NTRS)
Pitman, C.; Braley, D.; Fridge, E.; Plumb, A.; Izygon, M.; Mears, B.
1994-01-01
NASA has a legacy of complex software systems that are becoming increasingly expensive to maintain. Reengineering is one approach to modernizing these systems. Object-oriented technology, other modern software engineering principles, and automated tools can be used to reengineer the systems and will help to keep maintenance costs of the modernized systems down. The Software Technology Branch at the NASA/Johnson Space Center has been developing and testing reengineering methods and tools for several years. The Software Technology Branch is currently providing training and consulting support to several large reengineering projects at JSC, including the Reusable Objects Software Environment (ROSE) project, which is reengineering the flight analysis and design system (over 2 million lines of FORTRAN code) into object-oriented C++. Many important lessons have been learned during the past years; one of these is that the design must never be allowed to diverge from the code during maintenance and enhancement. Future work on open, integrated environments to support reengineering is being actively planned.
DAQ: Software Architecture for Data Acquisition in Sounding Rockets
NASA Technical Reports Server (NTRS)
Ahmad, Mohammad; Tran, Thanh; Nichols, Heidi; Bowles-Martinez, Jessica N.
2011-01-01
A multithreaded software application was developed by the Jet Propulsion Lab (JPL) to collect a set of correlated imagery, Inertial Measurement Unit (IMU), and GPS data for a Wallops Flight Facility (WFF) sounding rocket flight. The data set will be used to advance Terrain Relative Navigation (TRN) technology algorithms being researched at JPL. This paper describes the software architecture and the tests used to meet the timing and data rate requirements for the software used to collect the dataset. Also discussed are the challenges of using commercial off-the-shelf (COTS) flight hardware and open source software. This includes multiple Camera Link (C-link) based cameras, a Pentium-M based computer, and the Linux Fedora 11 operating system. Additionally, the paper discusses the history of the software architecture's usage in other JPL projects and its applicability to future missions, such as cubesats, UAVs, and research planes/balloons. Also discussed are the human aspects of the project, especially JPL's Phaeton program, and the results of the launch.
Case Study: Audio-Guided Learning, with Computer Graphics.
ERIC Educational Resources Information Center
Koumi, Jack; Daniels, Judith
1994-01-01
Describes teaching packages which involve the use of audiotape recordings with personal computers in Open University (United Kingdom) mathematics courses. Topics addressed include software development; computer graphics; pedagogic principles for distance education; feedback, including course evaluations and student surveys; and future plans.…
1994-07-18
[Work breakdown structure fragment: Software Product Training; Physical Cues Segment Development and Mission Planning Subsystem Development, each comprising Technical Management, SW Requirements Analysis, Preliminary Design, Detailed Design, and Code & CSU elements.]
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
ERIC Educational Resources Information Center
Rutherford, Teomara; Kibrick, Melissa; Burchinal, Margaret; Richland, Lindsey; Conley, AnneMarie; Osborne, Keara; Schneider, Stephanie; Duran, Lauren; Coulson, Andrew; Antenore, Fran; Daniels, Abby; Martinez, Michael E.
2010-01-01
This paper describes the background, methodology, preliminary findings, and anticipated future directions of a large-scale multi-year randomized field experiment addressing the efficacy of ST Math [Spatial-Temporal Math], a fully-developed math curriculum that uses interactive animated software. ST Math's unique approach minimizes the use of…
ERIC Educational Resources Information Center
Henard, Ralph E.
Possible future developments in artificial intelligence (AI) as well as its limitations are considered that have implications for institutional research in higher education, and especially decision making and decision support systems. It is noted that computer software programs have been developed that store knowledge and mimic the decision-making…
Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Lux, James P.
2014-01-01
The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (Very High Speed Integrated Circuit Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant to NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc. and the interconnections among different radio components.
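The separation that STRS enforces between a waveform application and platform-specific hardware can be sketched schematically. The flight radios implement this in C and VHDL against the STRS operating environment; the Python fragment below is only an analogy, and the class and method names are invented rather than taken from the STRS specification.

# Schematic analogy for the waveform/platform separation; names are invented,
# not the actual STRS interfaces.
from abc import ABC, abstractmethod

class PlatformServices(ABC):
    # Stand-in for the platform-provided operating environment.
    @abstractmethod
    def set_carrier_frequency(self, hz): ...
    @abstractmethod
    def write_samples(self, samples): ...

class BenchTestPlatform(PlatformServices):
    def set_carrier_frequency(self, hz):
        print("synthesizer tuned to %.1f MHz" % (hz / 1e6))
    def write_samples(self, samples):
        print("DAC <- %d samples" % len(samples))

class BpskWaveform:
    # Waveform application: depends only on the abstract platform interface,
    # so the same logic could run unchanged on a different platform implementation.
    def __init__(self, platform):
        self.platform = platform
    def transmit(self, bits):
        self.platform.set_carrier_frequency(2.2e9)
        self.platform.write_samples([1.0 if b else -1.0 for b in bits])

BpskWaveform(BenchTestPlatform()).transmit([1, 0, 1, 1])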
NASA Technical Reports Server (NTRS)
Uber, James G.
1988-01-01
Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.
Playbook Data Analysis Tool: Collecting Interaction Data from Extremely Remote Users
NASA Technical Reports Server (NTRS)
Kanefsky, Bob; Zheng, Jimin; Deliz, Ivonne; Marquez, Jessica J.; Hillenius, Steven
2017-01-01
Typically, user tests for software tools are conducted in person. At NASA, the users may be located at the bottom of the ocean in a pressurized habitat, above the atmosphere in the International Space Station, or in an isolated capsule on a simulated asteroid mission. The Playbook Data Analysis Tool (P-DAT) is a human-computer interaction (HCI) evaluation tool that the NASA Ames HCI Group has developed to record user interactions with Playbook, the group's existing planning-and-execution software application. Once the remotely collected user interaction data makes its way back to Earth, researchers can use P-DAT for in-depth analysis. Since a critical component of the Playbook project is to understand how to develop more intuitive software tools for astronauts to plan in space, P-DAT helps guide us in the development of additional easy-to-use features for Playbook, informing the design of future crew autonomy tools. P-DAT has demonstrated the capability of discreetly capturing usability data in a manner that is transparent to Playbook's end-users. In our experience, P-DAT data has already shown its utility, revealing potential usability patterns, helping diagnose software bugs, and identifying metrics and events that are pertinent to Playbook usage as well as spaceflight operations. As we continue to develop this analysis tool, P-DAT may yet provide a method for long-duration, unobtrusive human performance collection and evaluation for mission controllers back on Earth and researchers investigating the effects and mitigations related to future human spaceflight performance.
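The abstract does not spell out P-DAT's data format; the general pattern of unobtrusively capturing timestamped interaction events locally for later downlink and analysis can nevertheless be sketched as follows, with invented event names and fields.

# Generic illustration of unobtrusive interaction logging for later analysis;
# the field names and events are invented, not P-DAT's actual format.
import json
import time

class InteractionLogger:
    def __init__(self, path):
        self.path = path

    def record(self, user, event, **details):
        entry = {"t": time.time(), "user": user, "event": event, **details}
        with open(self.path, "a") as f:      # append-only, one JSON object per line
            f.write(json.dumps(entry) + "\n")

log = InteractionLogger("interactions.jsonl")
log.record("crew1", "activity_dragged", activity="EVA prep", minutes_moved=30)
log.record("crew1", "plan_saved")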
Past missions - the best way to train future planetary researchers
NASA Astrophysics Data System (ADS)
Kozlova, Natalia; Solodovnikova, Anastasiya; Zubarev, Anatoly; Garov, Andrey; Patraty, Vyacheslav; Kokhanov, Alexander; Karachevtseva, Irina; Nadezhdina, Irina; Konopikhin, Anatoly; Oberst, Juergen
2015-04-01
Practice shows that it is much more interesting and useful to learn from real examples than from imaginary tasks in exercise books. As technologies and software improve and develop, more information and new products can be obtained by reprocessing archive information collected by past planetary missions. At MIIGAiK we therefore carry out modern processing of lunar panoramic images obtained by the Soviet Lunokhod missions (1970-1973). During two years of the study, which is part of the PRoViDE project (http://www.provide-space.eu/), many students, PhD students, and young scientists, as well as professors, have taken part in this research. Processing of data obtained so long ago requires the development of specific methods, techniques, special software, and an unconventional approach. All of these points help to interest young people in planetary science and develop their skills as researchers. Another advantage of data from previous missions is that you can compare your results with the ones obtained during the mission. This also helps to test the developed techniques and software on real data and adjust them for implementation in future missions. The work on Lunokhod data processing became the basis of master's and PhD theses of MIIGAiK students and scientists at MExLab. Acknowledgments: The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement No 312377 PRoViDE.
NASA Technical Reports Server (NTRS)
Stensrud, Kjell C.; Hamm, Dustin
2007-01-01
NASA's Johnson Space Center (JSC) / Flight Design and Dynamics Division (DM) has prototyped the use of Open Source middleware technology for building its next generation spacecraft mission support system. This is part of a larger initiative to use open standards and open source software as building blocks for future mission and safety critical systems. JSC is hoping to leverage standardized enterprise architectures, such as Java EE, so that its internal software development efforts can be focused on the core aspects of its problem domain. This presentation will outline the design and implementation of the Trajectory system and the lessons learned during the exercise.
A digital flight control system verification laboratory
NASA Technical Reports Server (NTRS)
De Feo, P.; Saib, S.
1982-01-01
A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus particularly on increasing the number of software test tools and on cost-effectiveness assessment.
Key Questions in Building Defect Prediction Models in Practice
NASA Astrophysics Data System (ADS)
Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas
The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.
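A minimal sketch of the kind of model the paper discusses, built on synthetic data rather than the studied project's metrics, is the following: fit a classifier on module metrics from earlier versions and rank the modules of the next version by predicted defect-proneness.

# Toy defect-prediction sketch with synthetic data (not the studied project's data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Columns: lines of code, cyclomatic complexity, number of past changes.
X_train = rng.normal(size=(200, 3)) * [400, 10, 5] + [500, 12, 6]
y_train = (X_train[:, 1] + 0.01 * X_train[:, 0]
           + rng.normal(scale=5, size=200) > 20).astype(int)   # 1 = defect-prone module

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

X_next = rng.normal(size=(10, 3)) * [400, 10, 5] + [500, 12, 6]  # next version's modules
risk = model.predict_proba(X_next)[:, 1]
for module, p in sorted(enumerate(risk), key=lambda pair: -pair[1]):
    print("module %2d: predicted defect-proneness %.2f" % (module, p))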
A Data-Driven Solution for Performance Improvement
NASA Technical Reports Server (NTRS)
2002-01-01
Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
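The statistical process control idea underlying such tools can be reduced to a few lines: compute control limits from in-control historical data and flag excursions. The sketch below is generic and not the P.I. EXPERT implementation; all measurements are invented.

# Minimal statistical process control sketch (not the P.I. EXPERT implementation):
# derive 3-sigma control limits from a historical baseline and flag excursions.
import statistics

history = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]   # in-control baseline
new_points = [10.1, 10.4, 9.6, 11.2, 10.0]

mean = statistics.fmean(history)
sigma = statistics.stdev(history)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

for i, x in enumerate(new_points, start=1):
    flag = "OUT OF CONTROL" if not (lcl <= x <= ucl) else "ok"
    print("sample %d: %5.2f  [%s]  limits=(%.2f, %.2f)" % (i, x, flag, lcl, ucl))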
Overview and Software Architecture of the Copernicus Trajectory Design and Optimization System
NASA Technical Reports Server (NTRS)
Williams, Jacob; Senent, Juan S.; Ocampo, Cesar; Mathur, Ravi; Davis, Elizabeth C.
2010-01-01
The Copernicus Trajectory Design and Optimization System represents an innovative and comprehensive approach to on-orbit mission design, trajectory analysis and optimization. Copernicus integrates state-of-the-art algorithms in optimization, interactive visualization, spacecraft state propagation, and data input-output interfaces, allowing the analyst to design spacecraft missions to all possible Solar System destinations. All of these features are incorporated within a single architecture that can be used interactively via a comprehensive graphical user interface (GUI), or passively via external interfaces that execute batch processes. This paper describes the Copernicus software architecture together with the challenges associated with its implementation. Additionally, future development and planned new capabilities are discussed. Key words: Copernicus, Spacecraft Trajectory Optimization Software.
Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle
NASA Astrophysics Data System (ADS)
Vinay, S.; Downs, R. R.
2012-12-01
Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data lifecycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions for using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by various community members who are addressing similar issues. Likewise, an active community that is maintaining open source software can be a valuable source of help, providing an opportunity to collaborate to address common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence have been identified that exist during various times of the data lifecycle. The identification of these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and should improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.
Hilde, Thomas; Paterson, Robert
2014-12-15
Scenario planning continues to gain momentum in the United States as an effective process for building consensus on long-range community plans and creating regional visions for the future. However, efforts to integrate more sophisticated information into the analytical framework to help identify important ecosystem services have lagged in practice. This is problematic because understanding the tradeoffs of land consumption patterns on ecological integrity is central to mitigating the environmental degradation caused by land use change and new development. In this paper we describe how an ecosystem services valuation model, i-Tree, was integrated into a mainstream scenario planning software tool, Envision Tomorrow, to assess the benefits of public street trees for alternative future development scenarios. The tool is then applied to development scenarios from the City of Hutto, TX, a Central Texas Sustainable Places Project demonstration community. The integrated tool represents a methodological improvement for scenario planning practice, offers a way to incorporate ecosystem services analysis into mainstream planning processes, and serves as an example of how open source software tools can expand the range of issues available for community and regional planning consideration, even in cases where community resources are limited. The tool also offers room for future improvements; feasible options include canopy analysis of various future land use typologies, as well as a generalized street tree model for broader U.S. application.
Guidance, Navigation, and Control Technology Assessment for Future Planetary Science Missions
NASA Technical Reports Server (NTRS)
Beauchamp, Pat; Cutts, James; Quadrelli, Marco B.; Wood, Lincoln J.; Riedel, Joseph E.; McHenry, Mike; Aung, MiMi; Cangahuala, Laureano A.; Volpe, Rich
2013-01-01
Future planetary explorations envisioned by the National Research Council's (NRC's) report titled Vision and Voyages for Planetary Science in the Decade 2013-2022, developed for NASA Science Mission Directorate (SMD) Planetary Science Division (PSD), seek to reach targets of broad scientific interest across the solar system. This goal requires new capabilities such as innovative interplanetary trajectories, precision landing, operation in close proximity to targets, precision pointing, multiple collaborating spacecraft, multiple target tours, and advanced robotic surface exploration. Advancements in Guidance, Navigation, and Control (GN&C) and Mission Design in the areas of software, algorithm development and sensors will be necessary to accomplish these future missions. This paper summarizes the key GN&C and mission design capabilities and technologies needed for future missions pursuing SMD PSD's scientific goals.
Atmospheric and Oceanographic Information Processing System (AOIPS) system description
NASA Technical Reports Server (NTRS)
Bracken, P. A.; Dalton, J. T.; Billingsley, J. B.; Quann, J. J.
1977-01-01
The development of hardware and software for an interactive, minicomputer based processing and display system for atmospheric and oceanographic information extraction and image data analysis is described. The major applications of the system are discussed as well as enhancements planned for the future.
ERIC Educational Resources Information Center
Pressman, Israel; Rosenbloom, Bruce
1984-01-01
Describes and evaluates costs of hardware, software, training, and maintenance for computer assisted instruction (CAI) as they relate to total system cost. An example of an educational system provides an illustration of CAI cost analysis. Future developments, cost effectiveness, affordability, and applications in public and private environments…
Desktop Publishing: Things Gutenberg Never Taught You.
ERIC Educational Resources Information Center
Bowman, Joel P.; Renshaw, Debbie A.
1989-01-01
Provides a desktop publishing (DTP) overview, including: advantages and disadvantages; hardware and software requirements; and future development. Discusses cost-effectiveness, confidentiality, credibility, effects on volume of paper-based communication, and the need for training in layout and design which DTP creates. Includes a glossary of DTP…
Hagerstown-Jefferson Township Public Library Internet Web Site.
ERIC Educational Resources Information Center
Albertson, Marie
1997-01-01
Describes the development of the Hagerstown (Indiana) public library's Web site. Highlights include writing successful grant proposals for funding; software from Microsoft; community support; free community access to the Internet from home computers as well as at the library; problems encountered; and future plans. (LRW)
Fundamental Travel Demand Model Example
NASA Technical Reports Server (NTRS)
Hanssen, Joel
2010-01-01
Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.
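For readers unfamiliar with travel demand modeling, the sketch below shows one classic step (gravity-model trip distribution) of the kind such a model might contain. The zone data and impedance function are invented for illustration and are not drawn from the paper's example model.

```python
# Illustrative sketch of gravity-model trip distribution; zone data and the
# impedance (friction) function are made-up examples, not the paper's model.
productions = {"zone_A": 1200, "zone_B": 800}             # trips produced per zone
attractions = {"zone_A": 500, "zone_B": 900, "zone_C": 600}
travel_time = {                                            # minutes between zones
    ("zone_A", "zone_A"): 5, ("zone_A", "zone_B"): 15, ("zone_A", "zone_C"): 25,
    ("zone_B", "zone_A"): 15, ("zone_B", "zone_B"): 5, ("zone_B", "zone_C"): 10,
}

def impedance(minutes):
    return minutes ** -2.0          # simple inverse-square friction factor

trips = {}
for origin, produced in productions.items():
    weights = {dest: attractions[dest] * impedance(travel_time[(origin, dest)])
               for dest in attractions if (origin, dest) in travel_time}
    total = sum(weights.values())
    for dest, w in weights.items():
        trips[(origin, dest)] = produced * w / total

for (o, d), t in sorted(trips.items()):
    print(f"{o} -> {d}: {t:.0f} trips")
```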
Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, Martin J.
This project was part of a coordinated software development effort which the nuclear physics lattice QCD community pursues in order to ensure that lattice calculations can make optimal use of present, and forthcoming leadership-class and dedicated hardware, including those of the national laboratories, and prepares for the exploitation of future computational resources in the exascale era. The UW team improved and extended software libraries used in lattice QCD calculations related to multi-nucleon systems, enhanced production running codes related to load balancing multi-nucleon production on large-scale computing platforms, developed SQLite (addressable database) interfaces to efficiently archive and analyze multi-nucleon data, and developed a Mathematica interface for the SQLite databases.
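The report does not publish the UW team's SQLite interface, but the general pattern of archiving lattice QCD measurements in an addressable database can be sketched as follows. The table layout, field names, and example data are hypothetical and only illustrate the approach.

```python
# Generic sketch of archiving lattice QCD correlator measurements in SQLite;
# the table layout and field names are hypothetical, not the project's schema.
import sqlite3

conn = sqlite3.connect("multi_nucleon.sqlite")
conn.execute("""
    CREATE TABLE IF NOT EXISTS correlators (
        ensemble   TEXT,     -- gauge ensemble identifier
        channel    TEXT,     -- e.g. two-nucleon interpolating operator
        config_id  INTEGER,  -- gauge configuration number
        t          INTEGER,  -- Euclidean time slice
        re_value   REAL,
        im_value   REAL
    )""")

def archive(ensemble, channel, config_id, values):
    """Store one correlator measurement (a list of complex numbers over t)."""
    rows = [(ensemble, channel, config_id, t, v.real, v.imag)
            for t, v in enumerate(values)]
    conn.executemany("INSERT INTO correlators VALUES (?,?,?,?,?,?)", rows)
    conn.commit()

archive("a09m310", "NN_1S0", 1024, [complex(1.0e-3 / (t + 1), 0.0) for t in range(16)])

# Analysis tools (or a Mathematica front end) can then query the archive, e.g.,
# the configuration-averaged correlator for one channel.
for row in conn.execute("SELECT t, AVG(re_value) FROM correlators "
                        "WHERE channel='NN_1S0' GROUP BY t"):
    print(row)
```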
CAD/CAM approach to improving industry productivity gathers momentum
NASA Technical Reports Server (NTRS)
Fulton, R. E.
1982-01-01
Recent results and planning for the NASA/industry Integrated Programs for Aerospace-Vehicle Design (IPAD) program for improving productivity with CAD/CAM methods are outlined. The industrial group work is being done mainly by Boeing, and progress has been made in defining the designer work environment, developing requirements and a preliminary design for a future CAD/CAM system, and developing CAD/CAM technology. The work environment was defined by conducting a detailed study of a reference design process, and key software elements for a CAD/CAM system have been defined, specifically for interactive design or experiment control processes. Further work is proceeding on executive, data management, geometry and graphics, and general utility software, and dynamic aspects of the programs being developed are outlined.
The CMS High Level Trigger System: Experience and Future Development
NASA Astrophysics Data System (ADS)
Bauer, G.; Behrens, U.; Bowen, M.; Branson, J.; Bukowiec, S.; Cittolin, S.; Coarasa, J. A.; Deldicque, C.; Dobson, M.; Dupont, A.; Erhan, S.; Flossdorf, A.; Gigi, D.; Glege, F.; Gomez-Reino, R.; Hartl, C.; Hegeman, J.; Holzner, A.; Hwong, Y. L.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, R. K.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Polese, G.; Racz, A.; Raginel, O.; Sakulin, H.; Sani, M.; Schwick, C.; Shpakov, D.; Simon, S.; Spataru, A. C.; Sumorok, K.
2012-12-01
The CMS experiment at the LHC features a two-level trigger system. Events accepted by the first level trigger, at a maximum rate of 100 kHz, are read out by the Data Acquisition system (DAQ), and subsequently assembled in memory in a farm of computers running a software high-level trigger (HLT), which selects interesting events for offline storage and analysis at a rate of the order of a few hundred Hz. The HLT algorithms consist of sequences of offline-style reconstruction and filtering modules, executed on a farm of O(10000) CPU cores built from commodity hardware. Experience from the operation of the HLT system in the collider run 2010/2011 is reported. The current architecture of the CMS HLT, its integration with the CMS reconstruction framework and the CMS DAQ, are discussed in the light of future development. The possible short- and medium-term evolution of the HLT software infrastructure to support extensions of the HLT computing power, and to address remaining performance and maintenance issues, are discussed.
Remote access laboratories in Australia and Europe
NASA Astrophysics Data System (ADS)
Ku, H.; Ahfock, T.; Yusaf, T.
2011-06-01
Remote access laboratories (RALs) were first developed in 1994 in Australia and Switzerland. The main purposes of developing them are to enable students to do their experiments at their own pace, time and locations and to enable students and teaching staff to get access to facilities beyond their institutions. Currently, most of the experiments carried out through RALs in Australia are heavily biased towards electrical, electronic and computer engineering disciplines. However, the experiments carried out through RALs in Europe had more variety: in addition to the traditional electrical, electronic and computer engineering disciplines, there were experiments in the mechanical and mechatronic disciplines. It was found that RALs are now being developed aggressively in Australia and Europe and it can be argued that RALs will develop further and faster in the future with improving Internet technology. The rising costs of real experimental equipment will also speed up their development because by making the equipment remotely accessible, the cost can be shared by more universities or institutions and this will improve their cost-effectiveness. Their development would be particularly rapid in large countries with small populations such as Australia, Canada and Russia, because of economies of scale. Reusability of software, interoperability in software implementation, computer supported collaborative learning and convergence with learning management systems are the required developments for future RALs.
Peridigm summary report : lessons learned in development with agile components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John
2011-09-01
This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.
RAY-UI: A powerful and extensible user interface for RAY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumgärtel, P., E-mail: peter.baumgaertel@helmholtz-berlin.de; Erko, A.; Schäfers, F.
2016-07-27
The RAY-UI project started as a proof-of-concept for an interactive and graphical user interface (UI) for the well-known ray tracing software RAY [1]. In the meantime, it has evolved into a powerful enhanced version of RAY that will serve as the platform for future development and improvement of associated tools. The software as of today supports nearly all sophisticated simulation features of RAY. Furthermore, it delivers very significant usability and work efficiency improvements. Beamline elements can be quickly added or removed in the interactive sequence view. Parameters of any selected element can be accessed directly and in arbitrary order. With a single click, parameter changes can be tested and new simulation results can be obtained. All analysis results can be explored interactively right after ray tracing by means of powerful integrated image viewing and graphing tools. Unlimited image planes can be positioned anywhere in the beamline, and bundles of image planes can be created for moving the plane along the beam to identify the focus position with live updates of the simulated results. In addition to showing the features and workflow of RAY-UI, we will give an overview of the underlying software architecture as well as examples for use and an outlook for future developments.
Data Integration: Charting a Path Forward to 2035
2011-02-14
An Introduction to Flight Software Development: FSW Today, FSW 2010
NASA Technical Reports Server (NTRS)
Gouvela, John
2004-01-01
Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects including Cockpit Avionics Upgrade are applied to projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high quality software. It will propose what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, modeling and simulation software. Specific challenges that have been met include the introduction and integration of a Commercial Off the Shelf (COTS) Real Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self-healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object oriented UML design, iterative development using independent components, as well as rapid prototyping. In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by automated office assistants. The infrastructure in use today includes strict software development and configuration management procedures, including strong control of resource management and critical skills coverage. This will evolve to a fully integrated staff organization with efficient and effective communication throughout all levels guided by a Mission-Systems Architecture framework with focus on risk management and attention toward inevitable product obsolescence. This infrastructure of computing equipment, software and processes will itself be subject to technological change and to the need for management of change and improvement.
C++ Coding Standards for the AMP Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Thomas M; Clarno, Kevin T
2009-09-01
This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project and a part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].
Valjevac, Salih; Ridjanovic, Zoran; Masic, Izet
2009-01-01
Introduction: The Agency for Healthcare Quality and Accreditation in the Federation of Bosnia and Herzegovina (AKAZ) is the authorized body in the field of healthcare quality and safety improvement and accreditation of healthcare institutions. Besides accreditation standards for hospitals and primary health care centers, AKAZ has also developed accreditation standards for family medicine teams. Methods: Software development was primarily based on the Accreditation Standards for Family Medicine Teams. Seven chapters/topics (1. Physical factors; 2. Equipment; 3. Organization and Management; 4. Health promotion and illness prevention; 5. Clinical services; 6. Patient survey; and 7. Patient's rights and obligations) contain 35 standards describing the expected level of a family medicine team's quality. Based on the structure of the accreditation standards and the needs of different potential users, it was concluded that the software backbone should be a database containing all accreditation standards, self-assessment and external assessment details. In this article we present the development of standardized software for self and external evaluation of quality of service in family medicine, as well as plans for the future development of this software package. Conclusion: Electronic data gathering and storage enhance the management, access and overall use of information. During this project we came to the conclusion that software for self assessment and external assessment is ideal for accreditation standards distribution, their overview by the family medicine team members, their self assessment and external assessment. PMID:24109157
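The article describes the software backbone as a database holding the accreditation standards together with self-assessment and external-assessment details. A minimal sketch of such a backbone is shown below; the table and column names are hypothetical and do not reproduce AKAZ's actual schema.

```python
# Sketch of a database backbone of the kind described above: accreditation
# standards plus self- and external-assessment records. Names are hypothetical.
import sqlite3

db = sqlite3.connect("accreditation.sqlite")
db.executescript("""
CREATE TABLE IF NOT EXISTS standard (
    id       INTEGER PRIMARY KEY,
    chapter  TEXT NOT NULL,      -- e.g. 'Clinical services'
    text     TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS assessment (
    id          INTEGER PRIMARY KEY,
    standard_id INTEGER NOT NULL REFERENCES standard(id),
    team        TEXT NOT NULL,   -- family medicine team identifier
    kind        TEXT CHECK (kind IN ('self', 'external')),
    score       INTEGER,         -- compliance level
    comment     TEXT
);
""")

db.execute("INSERT INTO standard (chapter, text) VALUES (?, ?)",
           ("Equipment", "Essential diagnostic equipment is available and maintained."))
db.execute("INSERT INTO assessment (standard_id, team, kind, score, comment) "
           "VALUES (1, 'FM-team-07', 'self', 4, 'Maintenance log up to date')")
db.commit()

# Compare self vs. external scores per standard for one team.
for row in db.execute("""SELECT standard_id, kind, AVG(score)
                         FROM assessment WHERE team='FM-team-07'
                         GROUP BY standard_id, kind"""):
    print(row)
```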
Status and Plans for the Vienna VLBI and Satellite Software (VieVS 3.0)
NASA Astrophysics Data System (ADS)
Gruber, Jakob; Böhm, Johannes; Böhm, Sigrid; Girdiuk, Anastasiia; Hellerschmied, Andreas; Hofmeister, Armin; Krásná, Hana; Kwak, Younghee; Landskron, Daniel; Madzak, Matthias; Mayer, David; McCallum, Jamie; Plank, Lucia; Schartner, Matthias; Shabala, Stas; Teke, Kamil; Sun, Jing
2017-04-01
The Vienna VLBI and Satellite Software (VieVS) is a geodetic analysis software developed and maintained at Technische Universität Wien (TU Wien) with contributions from groups all over the world. It is used for both academic purposes in university courses as well as for providing Very Long Baseline Interferometry (VLBI) analysis results to the geodetic community. Written in a modular structure in Matlab, VieVS offers easy access to the source code and the possibility to adapt the programs for particular purposes. The new version 3.0, released in early 2017, includes several new features, e.g., improved scheduling capabilities for observing quasars and satellites. This poster gives an overview of all VLBI-related activities in Vienna and provides an outlook to future plans concerning the Vienna VLBI and Satellite Software (VieVS).
Flight Planning Branch NASA Co-op Tour
NASA Technical Reports Server (NTRS)
Marr, Aja M.
2013-01-01
This semester I worked with the Flight Planning Branch at the NASA Johnson Space Center. I learned about the different aspects of flight planning for the International Space Station as well as the software that is used internally and ISSLive! which is used to help educate the public on the space program. I had the opportunity to do on the job training in the Mission Control Center with the planning team. I transferred old timeline records from the planning team's old software to the new software in order to preserve the data for the future when the software is retired. I learned about the operations of the International Space Station, the importance of good communication between the different parts of the planning team, and enrolled in professional development classes as well as technical classes to learn about the space station.
The SCEC/UseIT Intern Program: Creating Open-Source Visualization Software Using Diverse Resources
NASA Astrophysics Data System (ADS)
Francoeur, H.; Callaghan, S.; Perry, S.; Jordan, T.
2004-12-01
The Southern California Earthquake Center undergraduate IT intern program (SCEC UseIT) conducts IT research to benefit collaborative earth science research. Through this program, interns have developed real-time, interactive, 3D visualization software using open-source tools. Dubbed LA3D, a distribution of this software is now in use by the seismic community. LA3D enables the user to interactively view Southern California datasets and models of importance to earthquake scientists, such as faults, earthquakes, fault blocks, digital elevation models, and seismic hazard maps. LA3D is now being extended to support visualizations anywhere on the planet. The new software, called SCEC-VIDEO (Virtual Interactive Display of Earth Objects), makes use of a modular, plugin-based software architecture which supports easy development and integration of new data sets. Currently SCEC-VIDEO is in beta testing, with a full open-source release slated for the future. Both LA3D and SCEC-VIDEO were developed using a wide variety of software technologies. These, which included relational databases, web services, software management technologies, and 3-D graphics in Java, were necessary to integrate the heterogeneous array of data sources which comprise our software. Currently the interns are working to integrate new technologies and larger data sets to increase software functionality and value. In addition, both LA3D and SCEC-VIDEO allow the user to script and create movies. Thus program interns with computer science backgrounds have been writing software while interns with other interests, such as cinema, geology, and education, have been making movies that have proved of great use in scientific talks, media interviews, and education. Thus, SCEC UseIT incorporates a wide variety of scientific and human resources to create products of value to the scientific and outreach communities. The program plans to continue with its interdisciplinary approach, increasing the relevance of the software and expanding its use in the scientific community.
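SCEC-VIDEO itself is written in Java with 3-D graphics, and its plugin interfaces are not given in the abstract; the short Python sketch below only illustrates the general idea of a plugin-based architecture in which each data set registers a loader behind a common interface. All class and method names are invented for illustration.

```python
# Minimal sketch of a plugin-based data-set architecture of the sort described
# for SCEC-VIDEO; names and behavior are invented, not the actual software.
from abc import ABC, abstractmethod

class DataSetPlugin(ABC):
    """Each data source (faults, earthquakes, hazard maps, ...) implements this."""
    name = "unnamed"

    @abstractmethod
    def load(self, region):
        """Return renderable objects for the requested geographic region."""

class EarthquakeCatalogPlugin(DataSetPlugin):
    name = "earthquakes"
    def load(self, region):
        return [f"event in {region}"]          # placeholder for real catalog access

class FaultModelPlugin(DataSetPlugin):
    name = "faults"
    def load(self, region):
        return [f"fault trace in {region}"]    # placeholder for real fault model

REGISTRY = {p.name: p() for p in (EarthquakeCatalogPlugin, FaultModelPlugin)}

def render_scene(region, layers):
    """Assemble a scene from whichever plugins the user enabled."""
    scene = []
    for layer in layers:
        scene.extend(REGISTRY[layer].load(region))
    return scene

print(render_scene("Southern California", ["faults", "earthquakes"]))
```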
Archiving Software Systems: Approaches to Preserve Computational Capabilities
NASA Astrophysics Data System (ADS)
King, T. A.
2014-12-01
A great deal of effort is made to preserve scientific data. Not only is data knowledge, but it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating systems and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and capability to reproduce the processing and analysis used to generate past scientific results.
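As a concrete illustration of the container-based preservation approach argued for above, the sketch below packages an analysis environment as a container image and records provenance metadata alongside it. The image name, commands, repository URL, and metadata fields are assumptions for illustration, not a prescribed archival format.

```python
# Sketch of packaging an analysis environment as a container image plus a
# metadata record; image name, versions, and metadata fields are hypothetical.
import json
import subprocess
from datetime import date

IMAGE = "archive/legacy-analysis:1.0"

# Build the container image from a Dockerfile (assumed to exist in the current
# directory) that pins the full technology stack needed to rerun the processing.
subprocess.run(["docker", "build", "-t", IMAGE, "."], check=True)
subprocess.run(["docker", "save", "-o", "legacy-analysis.tar", IMAGE], check=True)

# Record provenance metadata alongside the saved image so a future archivist
# knows what the preserved stack contains and how to invoke it.
metadata = {
    "image": IMAGE,
    "saved_as": "legacy-analysis.tar",
    "archived_on": date.today().isoformat(),
    "base_os": "ubuntu:20.04",
    "entrypoint": "python /opt/pipeline/process.py <input> <output>",
    "source_code": "https://example.org/repo/legacy-analysis",
}
with open("legacy-analysis.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```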
Increasing productivity through Total Reuse Management (TRM)
NASA Technical Reports Server (NTRS)
Schuler, M. P.
1991-01-01
Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.
WIFIP: a web-based user interface for automated synchrotron beamlines.
Sallaz-Damaz, Yoann; Ferrer, Jean Luc
2017-09-01
The beamline control software, through the associated graphical user interface (GUI), is the user access point to the experiment, interacting with synchrotron beamline components and providing automated routines. FIP, the French beamline for the Investigation of Proteins, is a highly automated macromolecular crystallography (MX) beamline at the European Synchrotron Radiation Facility. On such a beamline, a significant number of users choose to control their experiment remotely. This is often performed with a limited bandwidth and from a large choice of computers and operating systems. Furthermore, this has to be possible in a rapidly evolving experimental environment, where new developments have to be easily integrated. To face these challenges, a light, platform-independent, control software and associated GUI are required. Here, WIFIP, a web-based user interface developed at FIP, is described. Beyond being the present FIP control interface, WIFIP is also a proof of concept for future MX control software.
A streamlined Python framework for AT-TPC data analysis
NASA Astrophysics Data System (ADS)
Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.
2017-09-01
User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.
Segmenting Images for a Better Diagnosis
NASA Technical Reports Server (NTRS)
2004-01-01
NASA's Hierarchical Segmentation (HSEG) software has been adapted by Bartron Medical Imaging, LLC, for use in segmentation, feature extraction, pattern recognition, and classification of medical images. Bartron acquired licenses from NASA Goddard Space Flight Center for application of the HSEG concept to medical imaging, from the California Institute of Technology/Jet Propulsion Laboratory to incorporate pattern-matching software, and from Kennedy Space Center for data-mining and edge-detection programs. The Med-Seg[TM] unit developed by Bartron provides improved diagnoses for a wide range of medical images, including computed tomography scans, positron emission tomography scans, magnetic resonance imaging, ultrasound, digitized X-ray, digitized mammography, dental X-ray, soft tissue analysis, and moving object analysis. It also can be used in analysis of soft-tissue slides. Bartron's future plans include the application of HSEG technology to drug development. NASA is advancing its HSEG software to learn more about the Earth's magnetosphere.
CymeR: cytometry analysis using KNIME, docker and R
Muchmore, B.; Alarcón-Riquelme, M.E.
2017-01-01
Summary: Here we present open-source software for the analysis of high-dimensional cytometry data using state of the art algorithms. Importantly, use of the software requires no programming ability, and output files can either be interrogated directly in CymeR or they can be used downstream with any other cytometric data analysis platform. Also, because we use Docker to integrate the multitude of components that form the basis of CymeR, we have additionally developed a proof-of-concept of how future open-source bioinformatic programs with graphical user interfaces could be developed. Availability and Implementation: CymeR is open-source software that ties several components into a single program that is perhaps best thought of as a self-contained data analysis operating system. Please see https://github.com/bmuchmore/CymeR/wiki for detailed installation instructions. Contact: brian.muchmore@genyo.es or marta.alarcon@genyo.es PMID:27998935
Near-Infrared Neuroimaging with NinPy
Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas
2009-01-01
There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449
NASA Technical Reports Server (NTRS)
Biegel, Bryan A.
1999-01-01
We are on the path to meet the major challenges ahead for TCAD (technology computer aided design). The emerging computational grid will ultimately solve the challenge of limited computational power. The Modular TCAD Framework will solve the TCAD software challenge once TCAD software developers realize that there is no other way to meet industry's needs. The modular TCAD framework (MTF) also provides the ideal platform for solving the TCAD model challenge by rapid implementation of models in a partial differential equation solver.
Design of Mariner 9 Science Sequences using Interactive Graphics Software
NASA Technical Reports Server (NTRS)
Freeman, J. E.; Sturms, F. M, Jr.; Webb, W. A.
1973-01-01
This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.
U.S. Geological Survey Groundwater Modeling Software: Making Sense of a Complex Natural Resource
Provost, Alden M.; Reilly, Thomas E.; Harbaugh, Arlen W.; Pollock, David W.
2009-01-01
Computer models of groundwater systems simulate the flow of groundwater, including water levels, and the transport of chemical constituents and thermal energy. Groundwater models afford hydrologists a framework on which to organize their knowledge and understanding of groundwater systems, and they provide insights water-resources managers need to plan effectively for future water demands. Building on decades of experience, the U.S. Geological Survey (USGS) continues to lead in the development and application of computer software that allows groundwater models to address scientific and management questions of increasing complexity.
The Collaborative Information Portal and NASA's Mars Exploration Rover Mission
NASA Technical Reports Server (NTRS)
Mak, Ronald; Walton, Joan
2005-01-01
The Collaborative Information Portal was enterprise software developed jointly by the NASA Ames Research Center and the Jet Propulsion Laboratory for NASA's Mars Exploration Rover mission. Mission managers, engineers, scientists, and researchers used this Internet application to view current staffing and event schedules, download data and image files generated by the rovers, receive broadcast messages, and get accurate times in various Mars and Earth time zones. This article describes the features, architecture, and implementation of this software, and concludes with lessons we learned from its deployment and a look towards future missions.
System Risk Balancing Profiles: Software Component
NASA Technical Reports Server (NTRS)
Kelly, John C.; Sigal, Burton C.; Gindorf, Tom
2000-01-01
The Software QA / V&V guide will be reviewed and updated based on feedback from NASA organizations and others with a vested interest in this area. Hardware, EEE Parts, Reliability, and Systems Safety are a sample of the future guides that will be developed. Cost Estimates, Lessons Learned, Probability of Failure and PACTS (Prevention, Avoidance, Control or Test) are needed to provide a more complete risk management strategy. This approach to risk management is designed to help balance the resources and program content for risk reduction for NASA's changing environment.
Spacecraft Internal Acoustic Environment Modeling
NASA Technical Reports Server (NTRS)
Allen, Christopher; Chu, S. Reynold
2008-01-01
The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles to ensure compliance with acoustic requirements and thus provide a safe and habitable acoustic environment for the crews, and to validate developed models via building physical mockups and conducting acoustic measurements.
ERIC Educational Resources Information Center
Far West Lab. for Educational Research and Development, San Francisco, CA.
This report is intended as a guide for local comprehensive integrated school-linked services sites and software vendors in developing and implementing case management information systems for the exchange and management of client data. The report is also intended to influence new development and future revisions of data systems, databases, and…
Surface and borehole neutron probes for the Construction and Resource Utilization eXplorer (CRUX)
NASA Technical Reports Server (NTRS)
Elphic, Richard C.; Hahn, Sangkoo; Lawrence, David J.; Feldman, William C.; Johnson, Jerome B.; Haldemann, Albert F. C.
2006-01-01
The Construction and Resource Utilization eXplorer (CRUX) project aims to develop an integrated, flexible suite of instruments with data fusion software and an executive controller for the purpose of in situ resource assessment and characterization for future space exploration.
ERIC Educational Resources Information Center
Burton, Adrian P.
1995-01-01
Discusses accessing online electronic documents at the European Telecommunications Satellite Organization (EUTELSAT). Highlights include off-site paper document storage, the document management system, benefits, the EUTELSAT Standard IBM Access software, implementation, the development process, and future enhancements. (AEF)
Human activities involving significant terrain alteration (e.g., earthworks operations associated with mines, urban development, landslides) can lead to broad-ranging changes in the surrounding terrestrial and aquatic environments. Potential aesthetic impacts can be associated wi...
Intranet-Based Learning: A One-Year Study of Student Utilisation.
ERIC Educational Resources Information Center
Herson, Katie; Sosabowski, M. H.; Lloyd, A. W.
1999-01-01
Reports on the undergraduate utilization and evaluation of an Intranet learning resource developed at the School of Pharmacy of the University of Brighton (United Kingdom). Topics include advantages of Intranets over the Internet, including software licensing and confidentiality; barriers to implementation; and future proposals. (LRW)
48 CFR 1352.209-71 - Limitation of future contracting.
Code of Federal Regulations, 2013 CFR
2013-10-01
... feasibility, proof of design and test, or engineering of programs not yet approved for acquisition or... computer software; and may appear in cost and pricing data or involve classified information. (iv) “System...'s development, production, or support. (vi) “Systems Engineering” means preparing specifications...
48 CFR 1352.209-71 - Limitation of future contracting.
Code of Federal Regulations, 2012 CFR
2012-10-01
... feasibility, proof of design and test, or engineering of programs not yet approved for acquisition or... computer software; and may appear in cost and pricing data or involve classified information. (iv) “System...'s development, production, or support. (vi) “Systems Engineering” means preparing specifications...
48 CFR 1352.209-71 - Limitation of future contracting.
Code of Federal Regulations, 2014 CFR
2014-10-01
... feasibility, proof of design and test, or engineering of programs not yet approved for acquisition or... computer software; and may appear in cost and pricing data or involve classified information. (iv) “System...'s development, production, or support. (vi) “Systems Engineering” means preparing specifications...
Interoperability of Neuroscience Modeling Software
Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik
2009-01-01
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
Software Analyzes Complex Systems in Real Time
NASA Technical Reports Server (NTRS)
2008-01-01
Expert system software programs, also known as knowledge-based systems, are computer programs that emulate the knowledge and analytical skills of one or more human experts, related to a specific subject. SHINE (Spacecraft Health Inference Engine) is one such program, a software inference engine (expert system) designed by NASA for the purpose of monitoring, analyzing, and diagnosing both real-time and non-real-time systems. It was developed to meet many of the Agency's demanding and rigorous artificial intelligence goals for current and future needs. NASA developed the sophisticated and reusable software based on the experience and requirements of its Jet Propulsion Laboratory's (JPL) Artificial Intelligence Research Group in developing expert systems for space flight operations, specifically the diagnosis of spacecraft health. It was designed to be efficient enough to operate in demanding real time and in limited hardware environments, and to be utilized by non-expert systems applications written in conventional programming languages. The technology is currently used in several ongoing NASA applications, including the Mars Exploration Rovers and the Spacecraft Health Automatic Reasoning Pilot (SHARP) program for the diagnosis of telecommunication anomalies during the Neptune Voyager Encounter. It is also finding applications outside of the Space Agency.
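SHINE's internal design is not described in the abstract, so the sketch below only illustrates, generically, what a forward-chaining knowledge-based system does: rules fire on known facts until no new conclusions appear. The telemetry facts and rules are invented examples and do not represent SHINE's rule base or implementation.

```python
# Tiny forward-chaining rule engine illustrating, conceptually, how a
# knowledge-based monitoring system derives diagnoses; facts/rules are invented.
facts = {"battery_voltage_low", "transmitter_on"}

# Each rule: if all antecedent facts hold, assert the consequent fact.
rules = [
    ({"battery_voltage_low", "transmitter_on"}, "power_budget_violation"),
    ({"power_budget_violation"}, "recommend_safe_mode"),
]

changed = True
while changed:                      # keep firing rules until no new facts appear
    changed = False
    for antecedents, consequent in rules:
        if antecedents <= facts and consequent not in facts:
            facts.add(consequent)
            changed = True

print(sorted(facts))
# -> includes 'power_budget_violation' and 'recommend_safe_mode'
```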
The Implementation of Satellite Control System Software Using Object Oriented Design
NASA Technical Reports Server (NTRS)
Anderson, Mark O.; Reid, Mark; Drury, Derek; Hansell, William; Phillips, Tom
1998-01-01
NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions that can be launched into low earth orbit by small expendable vehicles. The development schedule for each SMEX spacecraft was three years from start to launch. The SMEX program has produced five satellites; Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX), Fast Auroral Snapshot Explorer (FAST), Submillimeter Wave Astronomy Satellite (SWAS), Transition Region and Coronal Explorer (TRACE) and Wide-Field Infrared Explorer (WIRE). SAMPEX and FAST are on-orbit, TRACE is scheduled to be launched in April of 1998, WIRE is scheduled to be launched in September of 1998, and SWAS is scheduled to be launched in January of 1999. In each of these missions, the Attitude Control System (ACS) software was written using a modular procedural design. Current program goals require complete spacecraft development within 18 months. This requirement has increased pressure to write reusable flight software. Object-Oriented Design (OOD) offers the constructs for developing an application that only needs modification for mission unique requirements. This paper describes the OOD that was used to develop the SMEX-Lite ACS software. The SMEX-Lite ACS is three-axis controlled, momentum stabilized, and is capable of performing sub-arc-minute pointing. The paper first describes the high level requirements which governed the architecture of the SMEX-Lite ACS software. Next, the context in which the software resides is explained. The paper describes the benefits of encapsulation, inheritance and polymorphism with respect to the implementation of an ACS software system. This paper will discuss the design of several software components that comprise the ACS software. Specifically, Object-Oriented designs are presented for sensor data processing, attitude control, attitude determination and failure detection. The paper addresses the benefits of the OOD versus a conventional procedural design. The final discussion in this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects, saving production time and costs.
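The SMEX-Lite flight code itself is not reproduced in the abstract; the following short sketch merely illustrates, in Python, how the inheritance and polymorphism benefits described in the paper support reusable ACS software: mission-unique sensors subclass a common interface, and the control loop is written once against that interface. All class names and values are invented.

```python
# Illustrative sketch (not the SMEX-Lite flight code) of object-oriented ACS
# sensor processing: concrete sensors subclass a common interface, and the
# controller is written once against it (polymorphism).
from abc import ABC, abstractmethod

class AttitudeSensor(ABC):
    @abstractmethod
    def read(self) -> tuple[float, float, float]:
        """Return a body-frame measurement (placeholder 3-vector)."""

class SunSensor(AttitudeSensor):
    def read(self):
        return (0.9, 0.1, 0.0)      # stand-in for real sun-vector telemetry

class Magnetometer(AttitudeSensor):
    def read(self):
        return (0.2, -0.4, 0.3)     # stand-in for real magnetic-field telemetry

class AttitudeController:
    """Generic control step; unaware of which concrete sensors are flown."""
    def __init__(self, sensors: list[AttitudeSensor]):
        self.sensors = sensors

    def step(self):
        measurements = [s.read() for s in self.sensors]   # polymorphic dispatch
        # ... attitude determination and torque commands would go here ...
        return measurements

controller = AttitudeController([SunSensor(), Magnetometer()])
print(controller.step())
```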
Research in software allocation for advanced manned mission communications and tracking systems
NASA Technical Reports Server (NTRS)
Warnagiris, Tom; Wolff, Bill; Kusmanoff, Antone
1990-01-01
An assessment of the planned processing hardware and software/firmware for the Communications and Tracking System of the Space Station Freedom (SSF) was performed. The intent of the assessment was to determine the optimum distribution of software/firmware in the processing hardware for maximum throughput with minimum required memory. As a product of the assessment process, an assessment methodology was to be developed that could be used for similar assessments of future manned spacecraft system designs. The assessment process was hampered by changing requirements for the Space Station. As a result, the initial objective of determining the optimum software/firmware allocation was not fulfilled, but several useful conclusions and recommendations resulted from the assessment. It was concluded that the assessment process would not be completely successful for a system with changing requirements. It was also concluded that memory requirements and hardware requirements were being modified to fit as a consequence of the change process, and although throughput could not be quantified, potential problem areas could be identified. Finally, inherent flexibility of the system design was essential for the success of a system design with changing requirements. Recommendations resulting from the assessment included development of common software for some embedded controller functions, reduction of embedded processor requirements by hardwiring some Orbital Replacement Units (ORUs) to make better use of processor capabilities, and improvement in communications between software development personnel to enhance the integration process. Lastly, a critical observation was made that the software integration tasks did not appear to be addressed in the design process to the degree necessary for successful satisfaction of the system requirements.
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo W.; Roche, Rigoberto
2017-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in its Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx(Trademark) ML605 Virtex(Trademark)-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek(Trademark) eBox 620-110-FL) running the Ubuntu 12.04 operating system. The result of this development is a very low-cost, STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform and the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code and the STRS implementation on the Axiomtek processor.
Terra Harvest software architecture
NASA Astrophysics Data System (ADS)
Humeniuk, Dave; Klawon, Kevin
2012-06-01
Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.
Future Directions for Astronomical Image Display
NASA Technical Reports Server (NTRS)
Mandel, Eric
2000-01-01
In the "Future Directions for Astronomical Image Displav" project, the Smithsonian Astrophysical Observatory (SAO) and the National Optical Astronomy Observatories (NOAO) evolved our existing image display program into fully extensible. cross-platform image display software. We also devised messaging software to support integration of image display into astronomical analysis systems. Finally, we migrated our software from reliance on Unix and the X Window System to a platform-independent architecture that utilizes the cross-platform Tcl/Tk technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
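As an illustration of the kind of translation such a converter performs, the sketch below maps a toy, OpenDSS-like line description into a flat branch table that a phasor-domain solver could ingest. The input syntax and output columns are simplified placeholders, not the actual OpenDSS or ePHASORSIM formats.

```python
import csv, re

# Toy input records loosely echoing OpenDSS syntax (simplified placeholders).
dss_lines = [
    "New Line.L1 Bus1=sourcebus Bus2=busA R1=0.01 X1=0.05",
    "New Line.L2 Bus1=busA Bus2=busB R1=0.02 X1=0.08",
]

def parse_line(text):
    """Pull name, terminal buses, and positive-sequence impedance out of one record."""
    fields = dict(re.findall(r"(\w+)=(\S+)", text))
    name = re.search(r"Line\.(\w+)", text).group(1)
    return {"name": name, "from": fields["Bus1"], "to": fields["Bus2"],
            "r1_ohm": float(fields["R1"]), "x1_ohm": float(fields["X1"])}

branches = [parse_line(record) for record in dss_lines]

# Write a flat branch table such as a dynamic phasor simulator might import.
with open("branches.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "from", "to", "r1_ohm", "x1_ohm"])
    writer.writeheader()
    writer.writerows(branches)
```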
University Software Ownership and Litigation: A First Examination*
Rai, Arti K.; Allison, John R.; Sampat, Bhaven N.
2013-01-01
Software patents and university-owned patents represent two of the most controversial intellectual property developments of the last twenty-five years. Despite this reality, and concerns that universities act as “patent trolls” when they assert software patents in litigation against successful commercializers, no scholar has systematically examined the ownership and litigation of university software patents. In this Article, we present the first such examination. Our empirical research reveals that software patents represent a significant and growing proportion of university patent holdings. Additionally, the most important determinant of the number of software patents a university owns is not its research and development (“R&D”) expenditures (whether computer science-related or otherwise) but, rather, its tendency to seek patents in other areas. In other words, universities appear to take a “one size fits all” approach to patenting their inventions. This one size fits all approach is problematic given the empirical evidence that software is likely to follow a different commercialization path than other types of invention. Thus, it is perhaps not surprising that we see a number of lawsuits in which university software patents have been used not for purposes of fostering commercialization, but instead, to extract rents in apparent holdup litigation. The Article concludes by examining whether this trend is likely to continue in the future, particularly given a 2006 Supreme Court decision that appears to diminish the holdup threat by recognizing the possibility of liability rules in patent suits, as well as recent case law that may call into question certain types of software patents. PMID:23750052
NASA Technical Reports Server (NTRS)
Lange, R. Connor
2012-01-01
Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common. All of the components of these spacecraft had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively and can validate SMAP FSW modules more quickly and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
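A minimal sketch of that pattern, launching a scripted test, distilling its log down to the lines that matter, and emitting a short validation summary, might look like the following. The command, log format, and file names are hypothetical, not the SMAP test infrastructure.

```python
import re, subprocess, pathlib

def run_test(test_script: str, log_path: str) -> None:
    """Launch one testbed procedure and capture everything it prints (hypothetical command)."""
    with open(log_path, "w") as log:
        subprocess.run(["python", test_script], stdout=log,
                       stderr=subprocess.STDOUT, check=False)

def distill(log_path: str) -> list:
    """Keep only pass/fail verdicts and requirement identifiers from a verbose log."""
    keep = re.compile(r"(PASS|FAIL|REQ-\d+)")
    return [line.rstrip() for line in open(log_path) if keep.search(line)]

def write_report(findings: list, report_path: str) -> None:
    """Summarize the distilled findings for delivery to management."""
    failures = [f for f in findings if "FAIL" in f]
    pathlib.Path(report_path).write_text(
        "\n".join(findings) + f"\n\nSummary: {len(failures)} failure(s)\n")

run_test("fsw_mode_transition_test.py", "run.log")   # hypothetical test script name
write_report(distill("run.log"), "validation_report.txt")
```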
NASA Technical Reports Server (NTRS)
Bremmer, D. A.
1986-01-01
The feasibility of some off-the-shelf microprocessors and state-of-the-art software is assessed (1) as a development system for the principal investigator (PI) in the design of the experiment model, (2) as an example of available technology application for future PIs' experiments, (3) as a system capable of being interactive in the PCTC's simulation of the dedicated experiment processor (DEP), preferably by bringing the PI's DEP software directly into the simulation model, (4) as a system having bus compatibility with host VAX simulation computers, (5) as a system readily interfaced with mock-up panels and information displays, and (6) as a functional system for post-mission data analysis.
Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T
2015-01-01
Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool, based on a text-mining algorithm, that extracts specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies technology trends based on user-defined parameters. In other words, IPAT combined with manual categorization acts as an excellent technology assessment tool in competitive intelligence and due diligence for forecasting future R&D.
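The underlying mechanics, pattern-matching fields out of patent text and tallying them for a landscape chart, can be sketched briefly. The regular expressions, field names, and toy records below are illustrative assumptions, not IPAT's actual algorithm.

```python
import csv, re
from collections import Counter

# Toy stand-ins for text scraped from public patent pages.
records = [
    "Publication number US1234567B2 ... Assignee: Acme Pharma ... Priority date: 2012-03-15",
    "Publication number US7654321B2 ... Assignee: Beta Labs ... Priority date: 2014-07-01",
]

def extract(text):
    """Pull a few structured fields out of free text with simple patterns."""
    return {
        "pub_no": re.search(r"Publication number (\S+)", text).group(1),
        "assignee": re.search(r"Assignee:\s*(.+?)\s*\.\.\.", text).group(1),
        "year": re.search(r"Priority date:\s*(\d{4})", text).group(1),
    }

rows = [extract(r) for r in records]
with open("landscape.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["pub_no", "assignee", "year"])
    writer.writeheader()
    writer.writerows(rows)

print(Counter(r["year"] for r in rows))  # filings per year, the seed of a trend chart
```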
Generic Software Architecture for Launchers
NASA Astrophysics Data System (ADS)
Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre
2015-09-01
The definition and reuse of generic software architecture for launchers is not so usual, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega over the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development of the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On-Board Control Procedures, etc.). While some of these reasons remain valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of generic software architecture which could be envisaged for future launchers, based on the principles described above and supported by model-driven engineering and automatic code generation.
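The determinism argument rests on a time-triggered execution model: every task runs in a pre-assigned slot of a fixed minor cycle, so behaviour does not depend on event ordering. The sketch below illustrates only that scheduling pattern; the frame length, slot layout, and task names are hypothetical, not the architecture described in the paper.

```python
import time

MINOR_CYCLE_S = 0.010  # fixed 10 ms frame (assumed value for illustration)

def read_sensors(): pass        # placeholder tasks standing in for GNC functions
def navigation(): pass
def control_law(): pass
def mission_management(): pass

# Static schedule fixed at design time: slot index within the major cycle -> task list.
SCHEDULE = {
    0: [read_sensors, navigation, control_law],
    1: [read_sensors, control_law],
    2: [read_sensors, navigation, control_law, mission_management],
    3: [read_sensors, control_law],
}

def run(major_cycles: int = 2) -> None:
    next_tick = time.monotonic()
    for _ in range(major_cycles):
        for slot in sorted(SCHEDULE):
            for task in SCHEDULE[slot]:
                task()                        # each slot always runs the same task list
            next_tick += MINOR_CYCLE_S
            time.sleep(max(0.0, next_tick - time.monotonic()))

run()
```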
NASA Astrophysics Data System (ADS)
Broten, Gregory S.; Monckton, Simon P.; Collier, Jack; Giesbrecht, Jared
2006-05-01
In 2002 Defence R&D Canada changed research direction from pure tele-operated land vehicles to general autonomy for land, air, and sea craft. The unique constraints of the military environment coupled with the complexity of autonomous systems drove DRDC to carefully plan a research and development infrastructure that would provide state of the art tools without restricting research scope. DRDC's long term objectives for its autonomy program address disparate unmanned ground vehicle (UGV), unattended ground sensor (UGS), air (UAV), and subsea and surface (UUV and USV) vehicles operating together with minimal human oversight. Individually, these systems will range in complexity from simple reconnaissance mini-UAVs streaming video to sophisticated autonomous combat UGVs exploiting embedded and remote sensing. Together, these systems can provide low risk, long endurance, battlefield services assuming they can communicate and cooperate with manned and unmanned systems. A key enabling technology for this new research is a software architecture capable of meeting both DRDC's current and future requirements. DRDC built upon recent advances in the computing science field while developing its software architecture know as the Architecture for Autonomy (AFA). Although a well established practice in computing science, frameworks have only recently entered common use by unmanned vehicles. For industry and government, the complexity, cost, and time to re-implement stable systems often exceeds the perceived benefits of adopting a modern software infrastructure. Thus, most persevere with legacy software, adapting and modifying software when and wherever possible or necessary -- adopting strategic software frameworks only when no justifiable legacy exists. Conversely, academic programs with short one or two year projects frequently exploit strategic software frameworks but with little enduring impact. The open-source movement radically changes this picture. Academic frameworks, open to public scrutiny and modification, now rival commercial frameworks in both quality and economic impact. Further, industry now realizes that open source frameworks can reduce cost and risk of systems engineering. This paper describes the Architecture for Autonomy implemented by DRDC and how this architecture meets DRDC's current needs. It also presents an argument for why this architecture should also satisfy DRDC's future requirements as well.
Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.
2014-01-01
Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single- and multi-joint movements using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410
Saul, Katherine R; Hu, Xiao; Goehler, Craig M; Vidt, Meghan E; Daly, Melissa; Velisar, Anca; Murray, Wendy M
2015-01-01
Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single- and multi-joint movements using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle model and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms.
The Microcomputer in the Library: VI. Implementation and Future Development.
ERIC Educational Resources Information Center
Leggate, Peter; Dyer, Hilary
1986-01-01
This sixth article in a series discusses planning for the installation and implementation of automated systems in the library, workstation design and location, scheduling of software implementation, security, data input, staff and reader training, job design, impact of automation on library procedures, evaluation of system performance, and future…
ERIC Educational Resources Information Center
Bertot, John Carlo; McClure, Charles R.
This report describes the results of an assessment of Sailor, Maryland's Online Public Information Network, which provides statewide Internet connection to 100% of Maryland public libraries. The concept of a "statewide networked environment" includes information services, products, hardware and software, telecommunications…
Microcomputers in Education. Report No. 4798.
ERIC Educational Resources Information Center
Feurzeig, W.; And Others
A brief review of the history of computer-assisted instruction and discussion of the current and potential roles of microcomputers in education introduce this review of the capabilities of state-of-the-art microcomputers and currently available software for them, and some speculations about future trends and developments. A survey of current…
Summary of 1971 pattern recognition program development
NASA Technical Reports Server (NTRS)
Whitley, S. L.
1972-01-01
Eight areas related to pattern recognition analysis at the Earth Resources Laboratory are discussed: (1) background; (2) Earth Resources Laboratory goals; (3) software problems/limitations; (4) operational problems/limitations; (5) immediate future capabilities; (6) Earth Resources Laboratory data analysis system; (7) general program needs and recommendations; and (8) schedule and milestones.
Low-Budget, Cost-Effective OCR: Optical Character Recognition for MS-DOS Micros.
ERIC Educational Resources Information Center
Perez, Ernest
1990-01-01
Discusses optical character recognition (OCR) for use with MS-DOS microcomputers. Cost effectiveness is considered, three types of software approaches to character recognition are explained, hardware and operation requirements are described, possible library applications are discussed, future OCR developments are suggested, and a list of OCR…
E-Learning Experiences and Future
ERIC Educational Resources Information Center
Soomro, Safeeullah, Ed.
2010-01-01
Chapters in this book include: (1) E-Learning Indicators: A Multidimensional Model for Planning Developing and Evaluating E-Learning Software Solutions (Bekim Fetaji and Majlinda Fetaji); (2) Barriers to Effective use of Information Technology in Science Education at Yanbu Kingdom of Saudi Arabia (Abdulkareem Eid S. Alwani and Safeeullah Soomro);…
Point-and-Click Pedagogy: Is It Effective for Teaching Information Technology?
ERIC Educational Resources Information Center
Angolia, Mark G.; Pagliari, Leslie R.
2016-01-01
This paper assesses the effectiveness of the adoption of curriculum content developed and supported by a global academic university-industry alliance sponsored by one of the world's largest information technology software providers. Academic alliances promote practical and future-oriented education while providing access to proprietary software…
Preparation of Teachers for Computer and Multimedia-Based Instruction in Literacy.
ERIC Educational Resources Information Center
Balajthy, Ernest
Recent developments in computer and multimedia technologies bring about the need to reconsider the education of today's teachers and future teachers and to update the technology-related content of literacy education coursework. "Application" software receives the most attention from researchers and theorists in literacy education. Use of…
Computer-Aided Authoring of Programmed Instruction for Teaching Symbol Recognition. Final Report.
ERIC Educational Resources Information Center
Braby, Richard; And Others
This description of AUTHOR, a computer program for the automated authoring of programmed texts designed to teach symbol recognition, includes discussions of the learning strategies incorporated in the design of the instructional materials, hardware description and the algorithm for the software, and current and future developments. Appendices…
Collecting and Using Networked Statistics: Current Status, Future Goals
ERIC Educational Resources Information Center
Hiott, Judith
2004-01-01
For more than five years the Houston Public Library has collected statistics for measuring networked collections and services based on emerging guidelines. While the guidelines have provided authority and stability to the process, the clarification process continues. The development of information discovery software, such as federated search tools…
Future Development of Instructional Television.
ERIC Educational Resources Information Center
Barnett, H. J.; Denzau, A. T.
Instructional television (ITV) has been little used in the nation's schools because ITV hardware and software has been unreliable and expensive and teachers have yet to learn to use ITV. The perfection of inexpensive videotape recorders/players (VTR) and inexpensive tapes and cameras could remedy the problem. A package consisting of 10 mobile…
Software Engineering Techniques for Computer-Aided Learning.
ERIC Educational Resources Information Center
Ibrahim, Bertrand
1989-01-01
Describes the process for developing tutorials for computer-aided learning (CAL) using a programing language rather than an authoring system. The workstation used is described, the use of graphics is discussed, the role of a local area network (LAN) is explained, and future plans are discussed. (five references) (LRW)
MOOsburg: Multi-User Domain Support for a Community Network.
ERIC Educational Resources Information Center
Carroll, John M.; Rosson, Mary Beth; Isenhour, Philip L.; Van Metre, Christina; Schafer, Wendy A.; Ganoe, Craig H.
2001-01-01
Explains MOOsburg, a community-oriented MOO that models the geography of the town of Blacksburg, Virginia and is designed to be used by local residents. Highlights include the software architecture; client-server communication; spatial database; user interface; interaction; map-based navigation; application development; and future plans. (LRW)
The Future of Digital Working: Knowledge Migration and Learning
ERIC Educational Resources Information Center
Malcolm, Irene
2014-01-01
Against the backdrop of intensified migration linked to globalisation, this article considers the implications of knowledge migration for future digital workers. It draws empirically on a socio-material analysis of the international software localisation industry. Localisers' work requires linguistic, cultural and software engineering skills to…
Design on intelligent gateway technique in home network
NASA Astrophysics Data System (ADS)
Hu, Zhonggong; Feng, Xiancheng
2008-12-01
Home networks, characterized by digitization, multimedia, mobility, broadband and real-time interaction, are attracting increasing market attention because they can provide diverse and personalized services in information, communication, entertainment, education, health care and related areas. Home network product development has therefore become a focus of the related industries. This paper first introduces the concept of the home network and its overall reference model, then presents the core techniques and communication standards associated with it. Key analyses address the functions of the home gateway, its software function modules, the key technologies of the client-side software architecture, and development trends in home-network audio-visual entertainment services. The current state of home gateway products, future development trends, and application solutions for digital home services are also introduced. Finally, the paper notes that the growth of home network products drives the digital home network industry, stimulating development in related software sectors such as communications, consumer electronics, computing and gaming, as well as in the real-estate industry.
Fault tolerant testbed evaluation, phase 1
NASA Technical Reports Server (NTRS)
Caluori, V., Jr.; Newberry, T.
1993-01-01
In recent years, avionics systems development costs have become the driving factor in the development of space systems, military aircraft, and commercial aircraft. A method of reducing avionics development costs is to utilize state-of-the-art software application generator (autocode) tools and methods. The recent maturity of application generator technology has the potential to dramatically reduce development costs by eliminating software development steps that have historically introduced errors and the need for rework. Application generator tools have been demonstrated to be an effective method for autocoding non-redundant, relatively low-rate input/output (I/O) applications on the Space Station Freedom (SSF) program; however, they have not been demonstrated for fault-tolerant, high-rate I/O, flight-critical environments. This contract will evaluate the use of application generators in these harsh environments. Using Boeing's quad-redundant avionics system controller as the target system, Space Shuttle Guidance, Navigation, and Control (GN&C) software will be autocoded, tested, and evaluated in the Johnson (Space Center) Avionics Engineering Laboratory (JAEL). The response of the autocoded system will be shown to match the response of the existing Shuttle General Purpose Computers (GPCs), thereby demonstrating the viability of using autocode techniques in the development of future avionics systems.
On the use and the performance of software reliability growth models
NASA Technical Reports Server (NTRS)
Keiller, Peter A.; Miller, Douglas R.
1991-01-01
We address the problem of predicting future failures for a piece of software. The number of failures occurring during a finite future time interval is predicted from the number of failures observed during an initial period of usage by using software reliability growth models. Two different methods for using the models are considered: straightforward use of individual models, and dynamic selection among models based on goodness-of-fit and quality-of-prediction criteria. Performance is judged by the error of the predicted number of failures over future finite time intervals relative to the number of failures eventually observed during those intervals. Six of the former models and eight of the latter are evaluated, based on their performance on twenty data sets. Many open questions remain regarding the use and performance of software reliability growth models.
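As a concrete instance of the first method (straightforward use of an individual model), the sketch below fits one classical growth model, the Goel-Okumoto non-homogeneous Poisson process, to observed failure times by maximum likelihood and then predicts the failure count in a future interval. The model choice and the failure-time data are illustrative assumptions, not necessarily those evaluated in the study.

```python
import numpy as np
from scipy.optimize import minimize

t = np.array([8., 21., 40., 65., 97., 140., 190., 255., 330., 420.])  # failure times (hours)
T = 450.0                                                              # end of observation

def neg_log_lik(params):
    """Goel-Okumoto NHPP: intensity a*b*exp(-b*t), mean value m(t) = a*(1 - exp(-b*t))."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -(len(t) * np.log(a * b) - b * t.sum() - a * (1.0 - np.exp(-b * T)))

a_hat, b_hat = minimize(neg_log_lik, x0=[20.0, 0.005], method="Nelder-Mead").x

def m(x):
    """Expected cumulative number of failures by time x under the fitted model."""
    return a_hat * (1.0 - np.exp(-b_hat * x))

horizon = 200.0
print(f"predicted failures in ({T}, {T + horizon}]: {m(T + horizon) - m(T):.2f}")
```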
NASA Astrophysics Data System (ADS)
Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola
2014-07-01
ESO is currently in the final phase of the standardization process for PC-based Programmable Logical Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system having a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions but now are being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables a seamless integration into the VLT control system in order to be ready to support upcoming instruments like ESPRESSO and ERIS, that will be the first fully VLT compliant instruments using the new standard. The technology evaluation and standardization process has been a long and combined effort of various engineering disciplines like electronics, control and software, working together to define a solution that meets the requirements and minimizes the impact on the observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards like EtherCAT, OPC-UA, PLCOpen MC and TwinCAT can be used to replace LCU features in various areas like software engineering and programming languages, motion control, time synchronization and astronomical tracking.
A Hardware-in-the-Loop Testbed for Spacecraft Formation Flying Applications
NASA Technical Reports Server (NTRS)
Leitner, Jesse; Bauer, Frank H. (Technical Monitor)
2001-01-01
The Formation Flying Test Bed (FFTB) at NASA Goddard Space Flight Center (GSFC) is being developed as a modular, hybrid dynamic simulation facility employed for end-to-end guidance, navigation, and control (GN&C) analysis and design for formation flying clusters and constellations of satellites. The FFTB will support critical hardware and software technology development to enable current and future missions for NASA, other government agencies, and external customers for a wide range of missions, particularly those involving distributed spacecraft operations. The initial capabilities of the FFTB are based upon an integration of high-fidelity hardware and software simulation, emulation, and test platforms developed at GSFC in recent years, including a high-fidelity GPS simulator which has been a fundamental component of the Guidance, Navigation, and Control Center's GPS Test Facility. The FFTB will be continuously evolving over the next several years from a tool with initial capabilities in GPS navigation hardware/software-in-the-loop analysis and closed-loop GPS-based orbit control algorithm assessment to one with cross-link communications and relative navigation analysis and simulation capability. Eventually the FFTB will provide full capability to support all aspects of multi-sensor, absolute and relative position determination and control, in all (attitude and orbit) degrees of freedom, as well as information management for satellite clusters and constellations. In this paper we focus on the architecture for the FFTB as a general GN&C analysis environment for the spacecraft formation flying community inside and outside of NASA GSFC and we briefly reference some current and future activities which will drive the requirements and development.
Software engineering the mixed model for genome-wide association studies on large samples.
Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J
2009-11-01
Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
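For reference, the mixed model at the core of these packages can be written in the standard form (the notation is generic, not tied to any particular package):
\[
\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{u} + \boldsymbol{\varepsilon},
\qquad
\mathbf{u} \sim \mathcal{N}\!\left(\mathbf{0},\, \mathbf{K}\sigma_g^{2}\right),
\qquad
\boldsymbol{\varepsilon} \sim \mathcal{N}\!\left(\mathbf{0},\, \mathbf{I}\sigma_e^{2}\right),
\]
where \(\mathbf{y}\) holds the phenotypes, \(\mathbf{X}\boldsymbol{\beta}\) the fixed effects (including the tested marker and population-structure covariates), \(\mathbf{Z}\mathbf{u}\) the random polygenic effects whose covariance is proportional to the kinship matrix \(\mathbf{K}\), and the variance components \(\sigma_g^{2}\) and \(\sigma_e^{2}\) are estimated before or during the marker scan.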
Modi, Riddhi A; Mugavero, Michael J; Amico, Rivet K; Keruly, Jeanne; Quinlivan, Evelyn Byrd; Crane, Heidi M; Guzman, Alfredo; Zinski, Anne; Montue, Solange; Roytburd, Katya; Church, Anna; Willig, James H
2017-06-16
Meticulous tracking of study data must begin early in the study recruitment phase and must account for regulatory compliance, minimize missing data, and provide high information integrity and/or reduction of errors. In behavioral intervention trials, participants typically complete several study procedures at different time points. Among HIV-infected patients, behavioral interventions can favorably affect health outcomes. In order to empower newly diagnosed HIV positive individuals to learn skills to enhance retention in HIV care, we developed the behavioral health intervention Integrating ENGagement and Adherence Goals upon Entry (iENGAGE) funded by the National Institute of Allergy and Infectious Diseases (NIAID), where we deployed an in-clinic behavioral health intervention in 4 urban HIV outpatient clinics in the United States. To scale our intervention strategy homogenously across sites, we developed software that would function as a behavioral sciences research platform. This manuscript aimed to: (1) describe the design and implementation of a Web-based software application to facilitate deployment of a multisite behavioral science intervention; and (2) report on results of a survey to capture end-user perspectives of the impact of this platform on the conduct of a behavioral intervention trial. In order to support the implementation of the NIAID-funded trial iENGAGE, we developed software to deploy a 4-site behavioral intervention for new clinic patients with HIV/AIDS. We integrated the study coordinator into the informatics team to participate in the software development process. Here, we report the key software features and the results of the 25-item survey to evaluate user perspectives on research and intervention activities specific to the iENGAGE trial (N=13). The key features addressed are study enrollment, participant randomization, real-time data collection, facilitation of longitudinal workflow, reporting, and reusability. We found 100% user agreement (13/13) that participation in the database design and/or testing phase made it easier to understand user roles and responsibilities and recommended participation of research teams in developing databases for future studies. Users acknowledged ease of use, color flags, longitudinal work flow, and data storage in one location as the most useful features of the software platform and issues related to saving participant forms, security restrictions, and worklist layout as least useful features. The successful development of the iENGAGE behavioral science research platform validated an approach of early and continuous involvement of the study team in design development. In addition, we recommend post-hoc collection of data from users as this led to important insights on how to enhance future software and inform standard clinical practices. Clinicaltrials.gov NCT01900236; (https://clinicaltrials.gov/ct2/show/NCT01900236 (Archived by WebCite at http://www.webcitation.org/6qAa8ld7v). ©Riddhi A Modi, Michael J Mugavero, Rivet K Amico, Jeanne Keruly, Evelyn Byrd Quinlivan, Heidi M Crane, Alfredo Guzman, Anne Zinski, Solange Montue, Katya Roytburd, Anna Church, James H Willig. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 16.06.2017.
NASA Technical Reports Server (NTRS)
Hinchey, Mike
2006-01-01
The explosion of capabilities and new products within ICT (Information and Communication Technology) has fostered widespread, overly optimistic opinions regarding the industry, based on common but unjustified assumptions of quality and correctness of software. These assumptions are encouraged by software producers and vendors, who have not succeeded in finding a way to overcome the lack of an automated, mathematically sound way to develop correct systems from requirements. NASA faces this dilemma as it envisages advanced mission concepts in future exploration missions, which may well be the most ambitious computer-based systems ever developed. Such missions entail levels of complexity that beg for new methods for system development. NASA-led research in such areas as sensor networks, formal methods, autonomic computing, and requirements-based programming (to name but a few) will offer some innovative approaches to achieving correctness in complex system development.
Open high-level data formats and software for gamma-ray astronomy
NASA Astrophysics Data System (ADS)
Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio
2017-01-01
In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.
NASA Technical Reports Server (NTRS)
Marnock, M. J.
1971-01-01
The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future use of computers and software are discussed, along with the governmental uses of software. The popularity of contractual agreements for sale or lease of computer programs and software services is also summarized.
ELISA, a demonstrator environment for information systems architecture design
NASA Technical Reports Server (NTRS)
Panem, Chantal
1994-01-01
This paper describes an approach to reusing software engineering technology in the area of ground space system design. System engineers have many needs similar to those of software developers: sharing of a common data base, capitalization of knowledge, definition of a common design process, and communication between different technical domains. Moreover, system designers need to simulate their system dynamically as early as possible. Software development environments, methods and tools have now become operational and widely used. Their architecture is based on a unique object base and a set of common management services, and they host a family of tools for each life cycle activity. In late 1992, CNES decided to develop a demonstrative software environment supporting some system activities. The design of ground space data processing systems was chosen as the application domain. ELISA (Integrated Software Environment for Architectures Specification) was specified as a 'demonstrator', i.e. a sufficient basis for demonstrations, evaluation and future operational enhancements. A process with three phases was implemented: system requirements definition, design of system architecture models, and selection of physical architectures. Each phase is composed of several activities that can be performed in parallel, with the provision of commercial off-the-shelf tools. ELISA was delivered to CNES in January 1994 and is currently used for demonstrations and evaluations on real projects (e.g. the SPOT4 Satellite Control Center). New evolutions are under way.
An Object Oriented Extensible Architecture for Affordable Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Follen, Gregory J.
2003-01-01
Driven by a need to explore and develop propulsion systems that exceeded current computing capabilities, NASA Glenn embarked on a novel strategy leading to the development of an architecture that enables propulsion simulations never thought possible before. Full-engine, three-dimensional computational fluid dynamics (CFD) propulsion system simulations were deemed impossible due to the impracticality of the hardware and software computing systems required. However, with a software paradigm shift and an embracing of parallel and distributed processing, an architecture was designed to meet the needs of future propulsion system modeling. The author suggests that the architecture designed at the NASA Glenn Research Center for propulsion system modeling has potential for impacting the direction of development of affordable weapons systems currently under consideration by the Applied Vehicle Technology Panel (AVT).
NASA Technical Reports Server (NTRS)
Lunsford, Myrtis Leigh
1998-01-01
The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.
Development of a prototype multi-processing interactive software invocation system
NASA Technical Reports Server (NTRS)
Berman, W. J.
1983-01-01
The Interactive Software Invocation System (NASA-ISIS) was first transported to the M68000 microcomputer, and then rewritten in the programming language Path Pascal. Path Pascal is a significantly enhanced derivative of Pascal, allowing concurrent algorithms to be expressed using the simple and elegant concept of Path Expressions. The primary result of this contract was to verify the viability of Path Pascal as a systems development language. The NASA-ISIS implementation using Path Pascal is a prototype of a large, interactive system in Path Pascal. As such, it is an excellent demonstration of the feasibility of using Path Pascal to write even more extensive systems. It is hoped that future efforts will build upon this research and, ultimately, that a full Path Pascal/ISIS Operating System (PPIOS) might be developed.
ATLAS event display: Virtual Point-1 visualization software
NASA Astrophysics Data System (ADS)
Seeley, Kaelyn; Dimond, David; Bianchi, R. M.; Boudreau, Joseph; Hong, Tae Min; Atlas Collaboration
2017-01-01
Virtual Point-1 (VP1) is an event display visualization software for the ATLAS Experiment. VP1 is a software framework that makes use of ATHENA, the ATLAS software infrastructure, to access the complete detector geometry. This information is used to draw graphics representing the components of the detector at any scale. Two new features are added to VP1. The first is a traditional "lego" plot, displaying the calorimeter energy deposits in eta-phi space. The second is another lego plot focusing on the forward endcap region, displaying the energy deposits in r-phi space. Currently, these new additions display the energy deposits based on the granularity of the middle layer of the liquid-Argon electromagnetic calorimeter. Since VP1 accesses the complete detector geometry and all experimental data, future developments are outlined for a more detailed display involving multiple layers of the calorimeter along with their distinct granularities.
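The essence of such a lego plot is a two-dimensional histogram of deposited energy over the eta-phi grid of a calorimeter layer. A minimal stand-alone sketch is shown below; the synthetic deposits and the 0.025 x 0.025 granularity are assumptions for illustration, not ATLAS data or the VP1 implementation.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 500
eta = rng.uniform(-2.5, 2.5, n)           # synthetic cell coordinates
phi = rng.uniform(-np.pi, np.pi, n)
energy = rng.exponential(2.0, n)          # synthetic energy deposits (GeV)

# Bin energy into an eta-phi grid; the cell size is only an assumed granularity.
eta_edges = np.linspace(-2.5, 2.5, 201)
phi_edges = np.linspace(-np.pi, np.pi, 257)
grid, _, _ = np.histogram2d(eta, phi, bins=[eta_edges, phi_edges], weights=energy)

plt.pcolormesh(phi_edges, eta_edges, grid)   # flat view of the lego histogram
plt.xlabel("phi")
plt.ylabel("eta")
plt.colorbar(label="summed energy [GeV]")
plt.show()
```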
Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio
1997-01-01
In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
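The modeling step can be reproduced in outline with any decision-tree learner. The sketch below uses scikit-learn's entropy-based tree as a modern stand-in for C4.5, with made-up component metrics rather than the Generalized Support Software data.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Made-up internal product attributes per reused component:
# [lines of code, cyclomatic complexity, number of modules changed]
X = [[120, 4, 1], [900, 22, 5], [300, 9, 2], [1500, 35, 8],
     [200, 6, 1], [1100, 28, 6], [450, 12, 3], [80, 3, 1]]
# Label: was the rework needed to reuse the component costly (1) or cheap (0)?
y = [0, 1, 0, 1, 0, 1, 0, 0]

# The entropy criterion approximates the information-gain splitting used by C4.5.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=2).fit(X, y)

print(export_text(tree, feature_names=["sloc", "complexity", "modules_changed"]))
print("predicted rework class:", tree.predict([[700, 18, 4]])[0])
```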
Parallel software for lattice N = 4 supersymmetric Yang-Mills theory
NASA Astrophysics Data System (ADS)
Schaich, David; DeGrand, Thomas
2015-05-01
We present new parallel software, SUSY LATTICE, for lattice studies of four-dimensional N = 4 supersymmetric Yang-Mills theory with gauge group SU(N). The lattice action is constructed to exactly preserve a single supersymmetry charge at non-zero lattice spacing, up to additional potential terms included to stabilize numerical simulations. The software evolved from the MILC code for lattice QCD, and retains a similar large-scale framework despite the different target theory. Many routines are adapted from an existing serial code (Catterall and Joseph, 2012), which SUSY LATTICE supersedes. This paper provides an overview of the new parallel software, summarizing the lattice system, describing the applications that are currently provided and explaining their basic workflow for non-experts in lattice gauge theory. We discuss the parallel performance of the code, and highlight some notable aspects of the documentation for those interested in contributing to its future development.
NASA Technical Reports Server (NTRS)
Srinivasan, J.; Farrington, A.; Gray, A.
2001-01-01
The authors present an overview of long-life reconfigurable processor technologies and of a specific architecture for implementing a software-reconfigurable (software-defined) network processor for space applications.
Spacecraft Internal Acoustic Environment Modeling
NASA Technical Reports Server (NTRS)
Chu, S. Reynold; Allen, Chris
2009-01-01
The objective of the project is to develop an acoustic modeling capability, based on commercial off-the-shelf software, to be used as a tool for oversight of the future manned Constellation vehicles. The use of such a model will help ensure compliance with acoustic requirements. Also, this project includes modeling validation and development feedback via building physical mockups and conducting acoustic measurements to compare with the predictions.
Geometry-Based Observability Metric
NASA Technical Reports Server (NTRS)
Eaton, Colin; Naasz, Bo
2012-01-01
The Satellite Servicing Capabilities Office (SSCO) is currently developing and testing Goddard's Natural Feature Image Recognition (GNFIR) software for autonomous rendezvous and docking missions. GNFIR has flight heritage and is still being developed and tailored for future missions with non-cooperative targets: (1) DEXTRE Pointing Package System on the International Space Station, (2) Relative Navigation System (RNS) on the Space Shuttle for the fourth Hubble Servicing Mission.
Recommended system of application and development
NASA Astrophysics Data System (ADS)
Wang, Wei
2018-04-01
A recommender system helps users identify items that match their needs and preferences. Recommender systems have been successfully applied in many e-commerce settings, such as news, film, music and book recommendation. This paper mainly discusses the application of recommendation technology in software engineering, data and knowledge engineering, configurable projects and persuasion technology, and summarizes future development trends for recommendation technology.
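As a reminder of the mechanics being applied in those domains, a compact item-based collaborative-filtering sketch (toy ratings, cosine similarity) is shown below. It illustrates the general technique only, not any system discussed in the paper.

```python
import numpy as np

# Rows = users, columns = items; 0 means "not yet rated".
ratings = np.array([[5, 4, 0, 1],
                    [4, 5, 1, 0],
                    [1, 0, 5, 4],
                    [0, 1, 4, 5]], dtype=float)

# Cosine similarity between item columns.
norms = np.linalg.norm(ratings, axis=0)
sim = (ratings.T @ ratings) / np.outer(norms, norms)

def recommend(user: int, top_n: int = 1):
    """Score unseen items by a similarity-weighted average of the user's own ratings."""
    seen = ratings[user] > 0
    scores = sim[:, seen] @ ratings[user, seen] / (sim[:, seen].sum(axis=1) + 1e-9)
    scores[seen] = -np.inf                      # never re-recommend already-rated items
    return np.argsort(scores)[::-1][:top_n]

print("item recommended to user 0:", recommend(0))
```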
The use of Graphic User Interface for development of a user-friendly CRS-Stack software
NASA Astrophysics Data System (ADS)
Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah
2017-04-01
The development of a user-friendly Common Reflection Surface (CRS) Stack software package built around a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the Unix/Linux environment and is not user-friendly: the user must write the commands and parameters manually in a script file. Because of this limitation the CRS-Stack has not been a popular method, although applying it is a promising way to obtain better seismic sections with improved reflector continuity and S/N ratio. After successful results had been obtained with several seismic data sets belonging to oil companies in Indonesia, the idea arose to develop a user-friendly software package in our own laboratory. A Graphical User Interface (GUI) allows people to interact with computer programs more easily: rather than typing commands and module parameters, users work with graphical icons and visual indicators, so the use of complicated Seismic Unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing an interactive GUI for CRS-Stack processing, the CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is organized around input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack supergather. These operations are visualized in an informative, self-explanatory flowchart that guides the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is thus preserved in the software, which is easy and efficient to learn. The software will continue to be developed, and any new innovative seismic processing workflow can also be added to this GUI software.
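The workflow wrapping is essentially sequential invocation of external processing commands with per-step parameters. The GUI itself is Java/Swing, but the chaining idea can be sketched language-neutrally; the Python sketch below uses hypothetical command and parameter-file names standing in for the Seismic-Unix-style scripts, and is not the actual CRS-Stack GUI code.

```python
import subprocess

# Hypothetical per-step commands and parameter files for the four-stage workflow.
WORKFLOW = [
    ("automatic CMP stack", ["sh", "cmp_stack.sh", "params/cmp.par"]),
    ("initial CRS stack",   ["sh", "crs_initial.sh", "params/crs_init.par"]),
    ("optimized CRS stack", ["sh", "crs_optimized.sh", "params/crs_opt.par"]),
    ("CRS supergather",     ["sh", "crs_supergather.sh", "params/supergather.par"]),
]

def run_workflow(input_dir: str, output_dir: str) -> None:
    """Run each stage in order, stopping and reporting on the first failure."""
    for label, cmd in WORKFLOW:
        print(f"running {label} ...")
        result = subprocess.run(cmd + [input_dir, output_dir],
                                capture_output=True, text=True)
        if result.returncode != 0:
            print(f"{label} failed:\n{result.stderr}")
            break

run_workflow("project/raw", "project/stacked")
```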
NASA Astrophysics Data System (ADS)
1991-08-01
Consideration is given to operational characteristics of future launch vehicles, trends in propulsion technology, technology challenges in the development of cryogenic propulsion systems for future reusable space-launch vehicles, estimation of the overall drag coefficient of an aerospace plane, and self-reliance in aerospace structures. Attention is also given to basic design concepts for smart actuators for aerospace plane control, a software package for the preliminary design of a helicopter, and multiconstraint wing optimization.
Symbolic Processing Combined with Model-Based Reasoning
NASA Technical Reports Server (NTRS)
James, Mark
2009-01-01
A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.
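One way to picture the combination is a numerical residual check that flags unmodeled behaviour, feeding a small symbolic rule base that labels the modeled states. The thresholds, rules, and signal names below are invented for illustration and do not reflect the actual software or the BEAM methodology.

```python
# Numerical side: compare a measurement against a simple physics-based prediction.
def unmodeled_event(measured: float, predicted: float, threshold: float = 0.15) -> bool:
    """Flag the sample when the relative residual exceeds what the model can explain."""
    return abs(measured - predicted) > threshold * max(abs(predicted), 1e-6)

# Symbolic side: rules mapping qualitative sensor conditions to expected discrete states.
RULES = [
    (lambda s: s["valve_cmd"] == "open" and s["flow"] > 0.5, "FLOWING"),
    (lambda s: s["valve_cmd"] == "open" and s["flow"] <= 0.5, "BLOCKED"),
    (lambda s: s["valve_cmd"] == "close", "ISOLATED"),
]

def classify(sample: dict, predicted_flow: float) -> str:
    if unmodeled_event(sample["flow"], predicted_flow):
        return "UNMODELED"                  # numeric check catches what the rules miss
    for condition, state in RULES:
        if condition(sample):
            return state
    return "UNKNOWN"

print(classify({"valve_cmd": "open", "flow": 0.9}, predicted_flow=1.0))  # FLOWING
print(classify({"valve_cmd": "open", "flow": 0.9}, predicted_flow=3.0))  # UNMODELED
```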
NASA TSRV essential flight control system requirements via object oriented analysis
NASA Technical Reports Server (NTRS)
Duffy, Keith S.; Hoza, Bradley J.
1992-01-01
The objective was to analyze the baseline flight control system of the Transport Systems Research Vehicle (TSRV) and to develop a system specification that offers high visibility of the essential system requirements in order to facilitate the future development of alternate, more advanced software architectures. The flight control system is defined to be the baseline software for the TSRV research flight deck, including all navigation, guidance, and control functions, and primary pilot displays. The Object Oriented Analysis (OOA) methodology developed is used to develop a system requirement definition. The scope of the requirements definition contained herein is limited to a portion of the Flight Management/Flight Control computer functionality. The development of a partial system requirements definition is documented, and includes a discussion of the tasks required to increase the scope of the requirements definition and recommendations for follow-on research.
Smith, M.; Murphy, D.; Laxmisan, A.; Sittig, D.; Reis, B.; Esquivel, A.; Singh, H.
2013-01-01
Background: Abnormal test results do not always receive timely follow-up, even when providers are notified through electronic health record (EHR)-based alerts. High workload, alert fatigue, and other demands on attention disrupt a provider's prospective memory for tasks required to initiate follow-up. Thus, EHR-based tracking and reminding functionalities are needed to improve follow-up. Objectives: The purpose of this study was to develop a decision-support software prototype enabling individual and system-wide tracking of abnormal test result alerts lacking follow-up, and to conduct formative evaluations, including usability testing. Methods: We developed a working prototype software system, the Alert Watch And Response Engine (AWARE), to detect abnormal test result alerts lacking documented follow-up, and to present context-specific reminders to providers. Development and testing took place within the VA's EHR and focused on four cancer-related abnormal test results. Design concepts emphasized mitigating the effects of high workload and alert fatigue while being minimally intrusive. We conducted a multifaceted formative evaluation of the software, addressing fit within the larger socio-technical system. Evaluations included usability testing with the prototype and interview questions about organizational and workflow factors. Participants included 23 physicians, 9 clinical information technology specialists, and 8 quality/safety managers. Results: Evaluation results indicated that our software prototype fit within the technical environment and clinical workflow, and physicians were able to use it successfully. Quality/safety managers reported that the tool would be useful in future quality assurance activities to detect patients who lack documented follow-up. Additionally, we successfully installed the software on the local facility's "test" EHR system, thus demonstrating technical compatibility. Conclusion: To address the factors involved in missed test results, we developed a software prototype to account for technical, usability, organizational, and workflow needs. Our evaluation has shown the feasibility of the prototype as a means of facilitating better follow-up for cancer-related abnormal test results. PMID:24155789
Smith, M; Murphy, D; Laxmisan, A; Sittig, D; Reis, B; Esquivel, A; Singh, H
2013-01-01
Abnormal test results do not always receive timely follow-up, even when providers are notified through electronic health record (EHR)-based alerts. High workload, alert fatigue, and other demands on attention disrupt a provider's prospective memory for tasks required to initiate follow-up. Thus, EHR-based tracking and reminding functionalities are needed to improve follow-up. The purpose of this study was to develop a decision-support software prototype enabling individual and system-wide tracking of abnormal test result alerts lacking follow-up, and to conduct formative evaluations, including usability testing. We developed a working prototype software system, the Alert Watch And Response Engine (AWARE), to detect abnormal test result alerts lacking documented follow-up, and to present context-specific reminders to providers. Development and testing took place within the VA's EHR and focused on four cancer-related abnormal test results. Design concepts emphasized mitigating the effects of high workload and alert fatigue while being minimally intrusive. We conducted a multifaceted formative evaluation of the software, addressing fit within the larger socio-technical system. Evaluations included usability testing with the prototype and interview questions about organizational and workflow factors. Participants included 23 physicians, 9 clinical information technology specialists, and 8 quality/safety managers. Evaluation results indicated that our software prototype fit within the technical environment and clinical workflow, and physicians were able to use it successfully. Quality/safety managers reported that the tool would be useful in future quality assurance activities to detect patients who lack documented follow-up. Additionally, we successfully installed the software on the local facility's "test" EHR system, thus demonstrating technical compatibility. To address the factors involved in missed test results, we developed a software prototype to account for technical, usability, organizational, and workflow needs. Our evaluation has shown the feasibility of the prototype as a means of facilitating better follow-up for cancer-related abnormal test results.
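The core detection logic, finding alerted abnormal results with no documented follow-up action inside a defined window, can be expressed very compactly. The field names and the 30-day window below are assumptions for illustration, not the AWARE implementation or VA policy.

```python
from datetime import date, timedelta

FOLLOW_UP_WINDOW = timedelta(days=30)      # assumed policy window, for illustration only

alerts = [   # abnormal-result alerts delivered to providers (toy records)
    {"patient": "A", "test": "chest CT", "alerted": date(2024, 1, 3)},
    {"patient": "B", "test": "hemoglobin", "alerted": date(2024, 1, 10)},
]
follow_ups = [   # documented follow-up actions pulled from the record
    {"patient": "A", "test": "chest CT", "documented": date(2024, 1, 20)},
]

def lacks_follow_up(alert, actions, today):
    """True when the window has elapsed with no matching documented action."""
    matched = any(a["patient"] == alert["patient"] and a["test"] == alert["test"]
                  and alert["alerted"] <= a["documented"] <= alert["alerted"] + FOLLOW_UP_WINDOW
                  for a in actions)
    return not matched and today > alert["alerted"] + FOLLOW_UP_WINDOW

today = date(2024, 3, 1)
for a in [a for a in alerts if lacks_follow_up(a, follow_ups, today)]:
    print(f"reminder: patient {a['patient']}, {a['test']} alerted {a['alerted']} "
          "has no documented follow-up")
```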
GCS component development cycle
NASA Astrophysics Data System (ADS)
Rodríguez, Jose A.; Macias, Rosa; Molgo, Jordi; Guerra, Dailos; Pi, Marti
2012-09-01
The GTC is an optical-infrared 10-meter segmented-mirror telescope at the ORM observatory in the Canary Islands (Spain). First light was on 13/07/2007, and the telescope has since been in the operation phase. The GTC control system (GCS) is a distributed object- and component-oriented system based on RT-CORBA and is responsible for the management and operation of the telescope, including its instrumentation. GCS has used the Rational Unified Process (RUP) in its development. RUP is an iterative software development process framework. After analysing (use cases) and designing (UML) any of the GCS subsystems, an initial component description of its interface is obtained, and from that information a component specification is written. To improve code productivity, GCS has adopted code generation to transform this component specification into the skeleton of component classes based on a software framework, called the Device Component Framework. Using the GCS development tools, based on javadoc and gcc, the component is generated, compiled and deployed in only one step, to be tested for the first time through our GUI inspector. The main advantages of this approach are the following: it reduces the learning curve of new developers and the development error rate, allows a systematic use of design patterns in the development and software reuse, speeds up the deliverables of the software product while improving timescale, design consistency and design quality, and eliminates the future refactoring process otherwise required for the code.
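The generation step itself amounts to expanding a component's interface description into framework-conformant skeleton code that developers then fill in. The toy sketch below illustrates that transformation only; the specification format and the generated skeleton are made up for illustration and are not GCS's Device Component Framework.

```python
SPEC = {                      # made-up component specification
    "component": "M2Positioner",
    "properties": {"position_mm": "float", "tracking": "bool"},
    "commands": ["park", "move_absolute", "stop"],
}

def generate_skeleton(spec: dict) -> str:
    """Expand an interface description into a class skeleton to be completed by hand."""
    lines = [f"class {spec['component']}Component:"]
    for name, typ in spec["properties"].items():
        lines.append(f"    {name}: {typ} = None   # monitored property")
    for cmd in spec["commands"]:
        lines += [f"    def {cmd}(self):",
                  f"        raise NotImplementedError('{cmd} not implemented yet')"]
    return "\n".join(lines)

print(generate_skeleton(SPEC))   # compile/deploy/test would follow in the real tool chain
```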
Software and the future of programming languages.
Aho, Alfred V
2004-02-27
Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.
An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency
NASA Astrophysics Data System (ADS)
Phillips, Dewanne Marie
Software-intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software-intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, systems engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation will identify knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. The data illustrate that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time".
To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.
Near real-time, on-the-move software PED using VPEF
NASA Astrophysics Data System (ADS)
Green, Kevin; Geyer, Chris; Burnette, Chris; Agarwal, Sanjeev; Swett, Bruce; Phan, Chung; Deterline, Diane
2015-05-01
The scope of the Micro-Cloud for Operational, Vehicle-Based EO-IR Reconnaissance System (MOVERS) development effort, managed by the Night Vision and Electronic Sensors Directorate (NVESD), is to develop, integrate, and demonstrate new sensor technologies and algorithms that improve improvised explosive device/mine detection using efficient and effective exploitation and fusion of sensor data and target cues from existing and future Route Clearance Package (RCP) sensor systems. Unfortunately, the majority of forward-looking Full Motion Video (FMV) and computer vision processing, exploitation, and dissemination (PED) algorithms are developed using proprietary, incompatible software. This makes the insertion of new algorithms difficult due to the lack of standardized processing chains. In order to overcome these limitations, EOIR developed the Government off-the-shelf (GOTS) Video Processing and Exploitation Framework (VPEF) to provide standardized interfaces (e.g., input/output video formats, sensor metadata, and detected objects) for exploitation software and to rapidly integrate and test computer vision algorithms. EOIR developed a vehicle-based computing framework within the MOVERS and integrated it with VPEF. VPEF was further enhanced for automated processing, detection, and publishing of detections in near real-time, thus improving the efficiency and effectiveness of RCP sensor systems.
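The abstract emphasizes standardized interfaces so exploitation algorithms can be swapped into a processing chain. The Python sketch below illustrates that plug-in idea only; VPEF's real interfaces are not described here, so every class and field name is an assumption.

from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from typing import List

@dataclass
class Frame:
    pixels: bytes                      # raw image data
    timestamp: float                   # seconds since epoch
    sensor_metadata: dict = field(default_factory=dict)

@dataclass
class Detection:
    label: str
    bbox: tuple                        # (x, y, width, height) in pixels
    confidence: float

class Detector(ABC):
    """Standardized plug-in interface so algorithms can be swapped freely."""
    @abstractmethod
    def process(self, frame: Frame) -> List[Detection]: ...

class ThresholdDetector(Detector):
    def process(self, frame: Frame) -> List[Detection]:
        # Placeholder logic; a real algorithm would analyse frame.pixels.
        return [Detection("candidate", (0, 0, 16, 16), 0.5)]

def run_chain(frames, detectors):
    """Run every registered detector on every frame and publish the results."""
    for frame in frames:
        for detector in detectors:
            for det in detector.process(frame):
                print(frame.timestamp, det.label, det.confidence)

run_chain([Frame(b"\x00" * 16, 0.0)], [ThresholdDetector()])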
Yu, Xuefei; Lin, Liangzhuo; Shen, Jie; Chen, Zhi; Jian, Jun; Li, Bin; Xin, Sherman Xuegang
2018-01-01
The mean amplitude of glycemic excursions (MAGE) is an essential index for glycemic variability assessment and is treated as a key reference for blood glucose control in the clinic. However, the traditional "ruler and pencil" manual method for calculating MAGE is time-consuming and prone to error due to the huge data size, making the development of a robust computer-aided program an urgent requirement. Although several software products are available as alternatives to manual calculation, poor agreement among them has been reported, so more studies are required in this field. In this paper, we developed a mathematical algorithm based on integer nonlinear programming. Following the proposed mathematical method, an open-code computer program named MAGECAA v1.0 was developed and validated. Statistical analysis indicated that the developed program was robust compared with the manual method. Agreement between the developed program and currently available popular software was satisfactory, indicating that concern about disagreement among different software products is unnecessary. The open-code programmable algorithm is an additional resource for peers interested in related methodological studies in the future.
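For orientation, a minimal Python sketch of the classic MAGE rule (average the excursions between consecutive turning points that exceed one standard deviation) is shown below; it does not reproduce MAGECAA's integer nonlinear programming formulation, and the glucose readings are synthetic.

import numpy as np

def mage(glucose):
    g = np.asarray(glucose, dtype=float)
    sd = g.std(ddof=1)
    # Turning points: local maxima/minima of the glucose trace.
    turning = [g[0]]
    for prev, cur, nxt in zip(g, g[1:], g[2:]):
        if (cur - prev) * (nxt - cur) < 0:
            turning.append(cur)
    turning.append(g[-1])
    # Keep excursions between consecutive turning points that exceed one SD.
    excursions = [abs(b - a) for a, b in zip(turning, turning[1:]) if abs(b - a) > sd]
    return float(np.mean(excursions)) if excursions else 0.0

readings = [5.2, 6.8, 9.5, 7.1, 4.9, 6.0, 10.2, 8.4, 5.5]  # mmol/L, synthetic
print(round(mage(readings), 2))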
Software Component Technologies and Space Applications
NASA Technical Reports Server (NTRS)
Batory, Don
1995-01-01
In the near future, software systems will be more reconfigurable than hardware. This will be possible through the advent of software component technologies which have been prototyped in universities and research labs. In this paper, we outline the foundations for those technologies and suggest how they might impact software for space applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fako, Raluca; Sociu, Florin; Stan, Camelia
Romania is actively engaged in updating the Medium and Long Term National Strategy for Safe Management of Radioactive Waste and in approving the Road Map for Geological Repository Development. According to the relevant documents to be further updated, about 122,000 m³ of SL-LILW are to be disposed of in a near-surface facility that will also have room for quantities of VLLW. The planned date for commissioning is under revision. Taking into account that several actions have been initiated to improve the technical capability for LILW treatment and conditioning, several steps for the possible use of the SAFRAN software were considered. In view of specific data for the Romanian radioactive waste inventory, the authors highlight the expected limitations and unknown data related to the implementation of the SAFRAN software for the foreseen pre-disposal waste management activities. Challenges to be faced in the near future include the clear definition of the properties of each room, area, and waste management activity. This work aims to address several LILW management issues in accordance with the national and international regulatory framework for the assurance of nuclear safety. The authors also intend to develop their institutional capability for the safety demonstration of existing and future radioactive waste management facilities and activities. (authors)
ROSE::FTTransform - A Source-to-Source Translation Framework for Exascale Fault-Tolerance Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lidman, J; Quinlan, D; Liao, C
2012-03-26
Exascale computing systems will require sufficient resilience to tolerate numerous types of hardware faults while still assuring correct program execution. Such extreme-scale machines are expected to be dominated by processors driven at lower voltages (near the minimum 0.5 volts for current transistors). At these voltage levels, the rate of transient errors increases dramatically due to the sensitivity to transient and geographically localized voltage drops on parts of the processor chip. To achieve power efficiency, these processors are likely to be streamlined and minimal, and thus they cannot be expected to handle transient errors entirely in hardware. Here we present an open, compiler-based framework to automate the armoring of High Performance Computing (HPC) software to protect it from these types of transient processor errors. We develop an open infrastructure to support research work in this area, and we define tools that, in the future, may provide more complete automated and/or semi-automated solutions to support software resiliency on future exascale architectures. Results demonstrate that our approach is feasible, pragmatic in how it can be separated from the software development process, and reasonably efficient (0% to 30% overhead for the Jacobi iteration on common hardware; and 20%, 40%, 26%, and 2% overhead for a randomly selected subset of benchmarks from the Livermore Loops [1]).
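The armoring performed by ROSE::FTTransform is a source-to-source rewrite of C/C++ code; purely as a conceptual illustration of the execute-redundantly-and-vote idea, a Python sketch might look like the following (this is not the tool's actual output).

from collections import Counter
from functools import wraps

def tmr(func):
    """Triple-modular-redundancy wrapper: run three times and take a majority vote."""
    @wraps(func)
    def armored(*args, **kwargs):
        results = [func(*args, **kwargs) for _ in range(3)]
        value, votes = Counter(results).most_common(1)[0]
        if votes < 2:
            raise RuntimeError("no majority: possible multi-bit upset")
        return value
    return armored

@tmr
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

print(dot([1, 2, 3], [4, 5, 6]))  # 32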
Technical aspects of telepathology with emphasis on future development.
Schwarzmann, P; Binder, B; Klose, R
2000-01-01
Pathology is presently undergoing changes due to new developments in diagnostic opportunities and cost-saving efforts in health care. From the wide field of telepathology the paper selects three prototype applications: telepathology in tele-education, expert advice for preselected details of a slide, and finally telepathology for remote diagnosis. The most challenging field for remote diagnosis is the frozen section scenario. The paper starts with the thought experiment of mapping conventional procedures to their counterparts in telepathology. Technical opportunities and economic restrictions of telepathology equipment are discussed with respect to the components: electronic cameras, display devices, haptic sensors and displays, available telecommunication channels, and telepathology software. As an example and as an illustration of the state of the art for an advanced telemicroscopy system able to perform remote frozen section diagnosis, the HISTKOM equipment is presented in more detail. The section on future developments considers acceptance by prospective users, legal aspects, costs and affordability of equipment, the market for equipment components, and adequate telecommunication services. It also considers the mutual influence of the properties of existing systems, and the application experience gained with them, on the next generation of equipment and application software. Conclusions and references close the paper.
Bánfai, Balázs; Porció, Roland; Kovács, Tibor
2014-01-01
SNOMED CT is a vital component in the future of semantic interoperability in healthcare, as it provides the meaning to EHRs via its semantically rich, controlled terminology. Communicating the concepts of this terminology to both humans and machines is crucial; therefore, formal guidelines for diagram and expression representations have been developed by the curators of SNOMED CT. This paper presents a novel, model-based approach to implementing these guidelines that allows simultaneous editing of a concept via both diagram and expression editors. The implemented extensible software component can be embedded in both desktop and web applications.
Software Engineering and Swarm-Based Systems
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Sterritt, Roy; Pena, Joaquin; Rouff, Christopher A.
2006-01-01
We discuss two software engineering aspects in the development of complex swarm-based systems. NASA researchers have been investigating various possible concept missions that would greatly advance future space exploration capabilities. The concept mission that we have focused on exploits the principles of autonomic computing as well as being based on the use of intelligent swarms, whereby a (potentially large) number of similar spacecraft collaborate to achieve mission goals. The intent is that such systems not only can be sent to explore remote and harsh environments but also are endowed with greater degrees of protection and longevity to achieve mission goals.
A study of fault prediction and reliability assessment in the SEL environment
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Patnaik, Debabrata
1986-01-01
An empirical study on estimation and prediction of faults, prediction of fault detection and correction effort, and reliability assessment in the Software Engineering Laboratory environment (SEL) is presented. Fault estimation using empirical relationships and fault prediction using curve fitting method are investigated. Relationships between debugging efforts (fault detection and correction effort) in different test phases are provided, in order to make an early estimate of future debugging effort. This study concludes with the fault analysis, application of a reliability model, and analysis of a normalized metric for reliability assessment and reliability monitoring during development of software.
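A hedged sketch of the curve-fitting idea (not the SEL data or the exact model used in the study): fit a Goel-Okumoto-style cumulative fault curve m(t) = a(1 - exp(-b t)) to synthetic weekly fault counts and extrapolate to estimate future debugging effort.

import numpy as np
from scipy.optimize import curve_fit

def cumulative_faults(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 11)
observed = np.array([5, 11, 15, 19, 21, 24, 25, 27, 27, 28])  # synthetic fault counts

(a, b), _ = curve_fit(cumulative_faults, weeks, observed, p0=(30.0, 0.3))
print(f"estimated total faults a = {a:.1f}, detection rate b = {b:.2f}")
print("predicted cumulative faults by week 15:", round(cumulative_faults(15, a, b), 1))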
Programs Model the Future of Air Traffic Management
NASA Technical Reports Server (NTRS)
2010-01-01
Through Small Business Innovation Research (SBIR) contracts with Ames Research Center, Intelligent Automation Inc., based in Rockville, Maryland, advanced specialized software the company had begun developing with U.S. Department of Defense funding. The agent-based infrastructure now allows NASA's Airspace Concept Evaluation System to explore ways of improving the utilization of the National Airspace System (NAS), providing flexible modeling of every part of the NAS down to individual planes, airports, control centers, and even weather. The software has been licensed to a number of aerospace and robotics customers, and has even been used to model the behavior of crowds.
The Implementation of Satellite Attitude Control System Software Using Object Oriented Design
NASA Technical Reports Server (NTRS)
Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek
1998-01-01
NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes high-level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper will also discuss the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects.
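To make the encapsulation/inheritance/polymorphism discussion concrete, here is a minimal Python sketch of a sensor class hierarchy feeding an attitude-determination object; the class names and telemetry values are illustrative and are not taken from the SMEX-Lite AFC library.

from abc import ABC, abstractmethod

class Sensor(ABC):
    """Encapsulates raw I/O; subclasses provide device-specific conversion."""
    @abstractmethod
    def read(self) -> tuple:                     # body-frame measurement vector
        ...

class Magnetometer(Sensor):
    def read(self):
        return (12.0e-6, -3.0e-6, 40.0e-6)       # tesla, placeholder telemetry

class SunSensor(Sensor):
    def read(self):
        return (0.71, 0.0, 0.71)                 # unit sun vector, placeholder

class AttitudeDeterminer:
    def __init__(self, sensors):
        self.sensors = sensors                   # polymorphic: any Sensor subclass

    def estimate(self):
        readings = [s.read() for s in self.sensors]
        return readings                          # a real system would fuse these

ad = AttitudeDeterminer([Magnetometer(), SunSensor()])
print(ad.estimate())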
Platform-independent software for medical image processing on the Internet
NASA Astrophysics Data System (ADS)
Mancuso, Michael E.; Pathak, Sayan D.; Kim, Yongmin
1997-05-01
We have developed a software tool for image processing over the Internet. The tool is a general purpose, easy to use, flexible, platform independent image processing software package with functions most commonly used in medical image processing. It provides for processing of medical images located either remotely on the Internet or locally. The software was written in Java - the new programming language developed by Sun Microsystems. It was compiled and tested using Microsoft's Visual Java 1.0 and Microsoft's Just in Time Compiler 1.00.6211. The software is simple and easy to use. In order to use the tool, the user needs to download the software from our site before he/she runs it using any Java interpreter, such as those supplied by Sun, Symantec, Borland or Microsoft. Future versions of the operating systems supplied by Sun, Microsoft, Apple, IBM, and others will include Java interpreters. The software is then able to access and process any image on the Internet or on the local computer. Using a 512 X 512 X 8-bit image, a 3 X 3 convolution took 0.88 seconds on an Intel Pentium Pro PC running at 200 MHz with 64 Mbytes of memory. A window/level operation took 0.38 seconds while a 3 X 3 median filter took 0.71 seconds. These performance numbers demonstrate the feasibility of using this software interactively on desktop computers. Our software tool supports various image processing techniques commonly used in medical image processing and can run without the need of any specialized hardware. It can become an easily accessible resource over the Internet to promote the learning and understanding of image processing algorithms. Also, it could facilitate sharing of medical image databases and collaboration amongst researchers and clinicians, regardless of location.
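The original tool was written in Java; the NumPy sketch below merely illustrates the 3 X 3 convolution operation benchmarked above (a smaller synthetic image is used here to keep the example fast).

import numpy as np

def convolve3x3(image, kernel):
    h, w = image.shape
    padded = np.pad(image, 1, mode="edge")
    out = np.empty_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y + 3, x:x + 3] * kernel)
    return out

img = np.random.randint(0, 256, (64, 64)).astype(float)   # 512 X 512 in the original benchmark
sharpen = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
print(convolve3x3(img, sharpen).shape)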
PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.
Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt
2017-01-24
The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the requisite need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solutions which accommodated high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was the emphasis on extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at the PrimerSuite website (www.primer-suite.com).
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented as well as the guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
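The paper's comment translation is done with a Perl script; the Python sketch below only illustrates the underlying idea of turning MATLAB '%' comment lines into C-style '///' lines that Doxygen can parse. The sample input is invented.

def matlab_to_doxygen(m_source: str) -> str:
    """Convert leading '%' comment lines into '///' Doxygen-style comments."""
    out = []
    for line in m_source.splitlines():
        stripped = line.lstrip()
        if stripped.startswith("%"):
            out.append("/// " + stripped.lstrip("% ").rstrip())
        else:
            out.append(line)           # code lines pass through unchanged
    return "\n".join(out)

sample = "% Computes the wavefront residual.\n% phase : incoming phase screen\nfunction r = wf_residual(phase)"
print(matlab_to_doxygen(sample))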
Evaluation of a Mobile Platform for Proof-of-Concept Autonomous Site Selection and Preparation
NASA Astrophysics Data System (ADS)
Gammell, Jonathan
A mobile robotic platform for Autonomous Site Selection and Preparation (ASSP) was developed for an analogue deployment to Mauna Kea, Hawai`i. A team of rovers performed an autonomous Ground Penetrating Radar (GPR) survey and constructed a level landing pad. They used interchangeable payloads that allowed the GPR and blade to be easily exchanged. Autonomy was accomplished by integrating the individual hardware devices with software based on the ArgoSoft framework previously developed at UTIAS. The rovers were controlled by an on-board netbook. The successes and failures of the devices and software modules are evaluated within. Recommendations are presented to address problems discovered during the deployment and to guide future research on the platform.
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo W.
2017-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. At the conclusion of the development, the software and hardware description language (HDL) code was delivered to JSC for use in their iPAS test bed to get hands-on experience with the STRS standard, and for development of their own STRS waveforms on the now STRS compliant platform. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe the design of the HDL code for the FPGA portion of the iPAS STRS Radio, particularly the design of the FPGA wrapper and the test waveform.
Computational Modeling of Tires
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)
1995-01-01
This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.
BIBLIO: A Computer System Designed to Support the Near-Library User Model of Information Retrieval.
ERIC Educational Resources Information Center
Belew, Richard K.; Holland, Maurita Peterson
1988-01-01
Description of the development of the Information Exchange Facility, a prototype microcomputer-based personal bibliographic facility, covers software selection, user selection, overview of the system, and evaluation. The plan for an integrated system, BIBLIO, and the future role of libraries are discussed. (eight references) (MES)
On the Emergence of New Computer Technologies
ERIC Educational Resources Information Center
Asaolu, Olumuyiwa Sunday
2006-01-01
This work presents a review of the development and application of computers. It traces the highlights of emergent computing technologies shaping our world. Recent trends in hardware and software deployment are chronicled as well as their impact on various segments of the society. The expectations for the future are also discussed along with…
Lin, Zhoumeng; Jaberi-Douraki, Majid; He, Chunla; Jin, Shiqiang; Yang, Raymond S H; Fisher, Jeffrey W; Riviere, Jim E
2017-07-01
Many physiologically based pharmacokinetic (PBPK) models for environmental chemicals, drugs, and nanomaterials have been developed to aid risk and safety assessments using acslX. However, acslX was sunset in November 2015. Alternative modeling tools and tutorials are needed for future PBPK applications. This forum article aimed to: (1) demonstrate the performance of 4 PBPK modeling software packages (acslX, Berkeley Madonna, MATLAB, and R language) tested using 2 existing models (oxytetracycline and gold nanoparticles); (2) provide a tutorial of PBPK model code conversion from acslX to Berkeley Madonna, MATLAB, and R language; (3) discuss the advantages and disadvantages of each software package in the implementation of PBPK models in toxicology, and (4) share our perspective about future direction in this field. Simulation results of plasma/tissue concentrations/amounts of oxytetracycline and gold from different models were compared visually and statistically with linear regression analyses. Simulation results from the original models correlated well with results from the recoded models, with time-concentration/amount curves nearly superimposable and determination coefficients of 0.86-1.00. Step-by-step explanations of the recoding of the models in different software programs are provided in the Supplementary Data. In summary, this article presents a tutorial of PBPK model code conversion for a small molecule and a nanoparticle among 4 software packages, and a performance comparison of these software packages in PBPK model implementation. This tutorial helps beginners learn PBPK modeling, provides suggestions for selecting a suitable tool for future projects, and may lead to the transition from acslX to alternative modeling tools. © The Author 2017. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
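As a hedged illustration of the kind of ODE code being ported between acslX, Berkeley Madonna, MATLAB, and R, here is a toy one-compartment model in Python/SciPy; it is not one of the paper's oxytetracycline or gold-nanoparticle PBPK models, and the parameters are hypothetical.

import numpy as np
from scipy.integrate import solve_ivp

def one_compartment(t, y, ke):
    return [-ke * y[0]]                 # first-order elimination of the absorbed dose

dose_mg, ke, vd_l = 100.0, 0.25, 10.0   # hypothetical dose, elimination rate, volume of distribution
sol = solve_ivp(one_compartment, (0.0, 24.0), [dose_mg], args=(ke,),
                t_eval=np.linspace(0.0, 24.0, 25))
concentration_mg_per_l = sol.y[0] / vd_l
print(concentration_mg_per_l[:5])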
Overview of the Integrated Programs for Aerospace Vehicle Design (IPAD) project
NASA Technical Reports Server (NTRS)
Venneri, S. L.
1983-01-01
To respond to national needs for improved productivity in engineering design and manufacturing, a NASA-supported joint industry/government project, denoted Integrated Programs for Aerospace Vehicle Design (IPAD), is underway. The objective is to improve engineering productivity through better use of computer technology. It focuses on development of database management technology and associated software for integrated, company-wide management of engineering and manufacturing information. Results to date on the IPAD project include an in-depth documentation of a representative design process for a large engineering project, the definition and design of computer-aided design software needed to support that process, and the release of prototype software to manage engineering information. This paper provides an overview of the IPAD project and summarizes progress to date and future plans.
Merlin - Massively parallel heterogeneous computing
NASA Technical Reports Server (NTRS)
Wittie, Larry; Maples, Creve
1989-01-01
Hardware and software for Merlin, a new kind of massively parallel computing system, are described. Eight computers are linked as a 300-MIPS prototype to develop system software for a larger Merlin network with 16 to 64 nodes, totaling 600 to 3000 MIPS. These working prototypes help refine a mapped reflective memory technique that offers a new, very general way of linking many types of computer to form supercomputers. Processors share data selectively and rapidly on a word-by-word basis. Fast firmware virtual circuits are reconfigured to match topological needs of individual application programs. Merlin's low-latency memory-sharing interfaces solve many problems in the design of high-performance computing systems. The Merlin prototypes are intended to run parallel programs for scientific applications and to determine hardware and software needs for a future Teraflops Merlin network.
Prediction of ball and roller bearing thermal and kinematic performance by computer analysis
NASA Technical Reports Server (NTRS)
Pirvics, J.; Kleckner, R. J.
1983-01-01
Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.
Absorbing Software Testing into the Scrum Method
NASA Astrophysics Data System (ADS)
Tuomikoski, Janne; Tervonen, Ilkka
In this paper we study how to absorb software testing into the Scrum method. We conducted the research as an action research during the years 2007-2008, with three iterations. The results showed that testing can, and even should, be absorbed into the Scrum method. The testing team was merged into the Scrum teams. The teams can now deliver better working software in a shorter time, because testing keeps track of the progress of the development. Team spirit is also higher, because the Scrum team members are committed to the same goal. The biggest change from the test manager's point of view was the organized Product Owner Team. The test manager no longer has a dedicated testing team, and in the future all testing tasks have to be assigned through the Product Backlog.
NASA Technical Reports Server (NTRS)
Lu, George C.
2003-01-01
The purpose of the EXPRESS (Expedite the PRocessing of Experiments to Space Station) rack project is to provide a set of predefined interfaces for scientific payloads which allow rapid integration into a payload rack on International Space Station (ISS). VxWorks was selected as the operating system for the rack and payload resource controller, primarily based on the proliferation of VME (Versa Module Eurocard) products. These products provide needed flexibility for future hardware upgrades to meet ever-changing science research rack configuration requirements. On the International Space Station, there are multiple science research rack configurations, including: 1) Human Research Facility (HRF); 2) EXPRESS ARIS (Active Rack Isolation System); 3) WORF (Window Observational Research Facility); and 4) HHR (Habitat Holding Rack). The RIC (Rack Interface Controller) connects payloads to the ISS bus architecture for data transfer between the payload and ground control. The RIC is a general purpose embedded computer which supports multiple communication protocols, including fiber optic communication buses, Ethernet buses, EIA-422, Mil-Std-1553 buses, SMPTE (Society Motion Picture Television Engineers)-170M video, and audio interfaces to payloads and the ISS. As a cost saving and software reliability strategy, the Boeing Payload Software Organization developed reusable common software where appropriate. These reusable modules included a set of low-level driver software interfaces to 1553B, RS232, RS422, Ethernet buses, HRDL (High Rate Data Link), video switch functionality, telemetry processing, and executive software hosted on the RIC computer. These drivers formed the basis for software development of the HRF, EXPRESS, EXPRESS ARIS, WORF, and HHR RIC executable modules. The reusable RIC common software has provided extensive benefits, including: 1) Significant reduction in development flow time; 2) Minimal rework and maintenance; 3) Improved reliability; and 4) Overall reduction in software life cycle cost. Due to the limited number of crew hours available on ISS for science research, operational efficiency is a critical customer concern. The current method of upgrading RIC software is a time consuming process; thus, an improved methodology for uploading RIC software is currently under evaluation.
Intelligence Applied to Air Vehicles
NASA Technical Reports Server (NTRS)
Rosen, Robert; Gross, Anthony R.; Fletcher, L. Skip; Zornetzer, Steven (Technical Monitor)
2000-01-01
The exponential growth in information technology has provided the potential for air vehicle capabilities that were previously unavailable to mission and vehicle designers. The increasing capabilities of computer hardware and software, including new developments such as neural networks, provide a new balance of work between humans and machines. This paper will describe several NASA projects, and review results and conclusions from ground and flight investigations where vehicle intelligence was developed and applied to aeronautical and space systems. In the first example, flight results from a neural network flight control demonstration will be reviewed. Using a highly modified F-15 aircraft, a NASA/Dryden experimental flight test program has demonstrated how the neural network software can correctly identify and respond to changes in aircraft stability and control characteristics. Using its on-line learning capability, the neural net software would identify that something in the vehicle has changed, then reconfigure the flight control computer system to adapt to those changes. The results of the Remote Agent software project will be presented. This capability will reduce the cost of future spacecraft operations as computers become "thinking" partners along with humans. In addition, the paper will describe the objectives and plans for the autonomous airplane program and the autonomous rotorcraft project. Technologies will also be developed.
UltraPse: A Universal and Extensible Software Platform for Representing Biological Sequences.
Du, Pu-Feng; Zhao, Wei; Miao, Yang-Yang; Wei, Le-Yi; Wang, Likun
2017-11-14
With the avalanche of biological sequences in public databases, one of the most challenging problems in computational biology is to predict their biological functions and cellular attributes. Most of the existing prediction algorithms can only handle fixed-length numerical vectors. Therefore, it is important to be able to represent biological sequences with various lengths using fixed-length numerical vectors. Although several algorithms, as well as software implementations, have been developed to address this problem, these existing programs can only provide a fixed number of representation modes. Every time a new sequence representation mode is developed, a new program will be needed. In this paper, we propose the UltraPse as a universal software platform for this problem. The function of the UltraPse is not only to generate various existing sequence representation modes, but also to simplify all future programming works in developing novel representation modes. The extensibility of UltraPse is particularly enhanced. It allows the users to define their own representation mode, their own physicochemical properties, or even their own types of biological sequences. Moreover, UltraPse is also the fastest software of its kind. The source code package, as well as the executables for both Linux and Windows platforms, can be downloaded from the GitHub repository.
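A minimal sketch of one simple fixed-length representation mode (amino-acid composition) is shown below for orientation; UltraPse itself supports many modes, including user-defined ones, none of which are reproduced here.

from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(sequence: str) -> list:
    """Map a protein sequence of any length to a fixed-length 20-dimensional vector."""
    counts = Counter(sequence.upper())
    n = max(len(sequence), 1)
    return [counts.get(aa, 0) / n for aa in AMINO_ACIDS]

print(aa_composition("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"))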
NASA Technical Reports Server (NTRS)
Strekalov, Dmitry V.
2012-01-01
Ring Image Analyzer software analyzes images to recognize elliptical patterns. It determines the ellipse parameters (axes ratio, centroid coordinate, tilt angle). The program attempts to recognize elliptical fringes (e.g., Newton Rings) on a photograph and determine their centroid position, the short-to-long-axis ratio, and the angle of rotation of the long axis relative to the horizontal direction on the photograph. These capabilities are important in interferometric imaging and control of surfaces. In particular, this program has been developed and applied for determining the rim shape of precision-machined optical whispering gallery mode resonators. The program relies on a unique image recognition algorithm aimed at recognizing elliptical shapes, but can be easily adapted to other geometric shapes. It is robust against non-elliptical details of the image and against noise. Interferometric analysis of precision-machined surfaces remains an important technological instrument in hardware development and quality analysis. This software automates and increases the accuracy of this technique. The software has been developed for the needs of an R&TD-funded project and has become an important asset for the future research proposal to NASA as well as other agencies.
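The Ring Image Analyzer's own algorithm is not given in the abstract; as a hedged stand-in, the sketch below fits an ellipse to synthetic fringe points with OpenCV's generic cv2.fitEllipse and reports the centroid, axis ratio, and tilt angle.

import cv2
import numpy as np

# Synthetic noisy points standing in for detected fringe pixels.
t = np.linspace(0.0, 2.0 * np.pi, 200)
x = 250.0 + 120.0 * np.cos(t) + np.random.normal(0.0, 1.5, t.size)
y = 200.0 + 80.0 * np.sin(t) + np.random.normal(0.0, 1.5, t.size)
points = np.column_stack([x, y]).astype(np.float32)

(cx, cy), (axis_a, axis_b), angle = cv2.fitEllipse(points)
ratio = min(axis_a, axis_b) / max(axis_a, axis_b)
print(f"centroid=({cx:.1f}, {cy:.1f})  axis ratio={ratio:.3f}  tilt={angle:.1f} deg")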
Souza, Eliana Pereira Salles de; Cabrera, Eliana Márcia Sotello; Braile, Domingo Marcolino
2010-01-01
Technological advances and the Internet have contributed to the increased dissemination and updating of knowledge and science. Scientific papers are considered the best form of disseminating information and have been undergoing many changes, not in the way they are developed, but in the structure of their publication. The Future paper, a name for this new structure, uses hypermedia resources, allowing quick, easy and organized online access to these items. The exchange of information, comments and criticisms can be performed in real time, providing agility in the dissemination of science. The trend for the future of documents, whether from professionals or enterprises, is "cloud computing", in which all documents will be developed and updated using a variety of devices: computer, palmtop, netbook, or iPad, without the need to have the software installed on one's own computer, requiring only an Internet connection.
Automated road marking recognition system
NASA Astrophysics Data System (ADS)
Ziyatdinov, R. R.; Shigabiev, R. R.; Talipov, D. N.
2017-09-01
Development of automated road marking recognition systems for existing and future vehicle control systems is an urgent task. One way to implement such systems is the use of neural networks. To test this possibility, software based on a single-layer perceptron was developed. The resulting neural-network-based system has successfully coped with the task both when driving in the daytime and at night.
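A minimal single-layer perceptron sketch is shown below to make the approach concrete; the two features and the labels are synthetic stand-ins, not the road-marking image data used in the study.

import numpy as np

def train_perceptron(X, y, epochs=50, lr=0.1):
    w = np.zeros(X.shape[1] + 1)              # bias stored in w[0]
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if (w[0] + xi @ w[1:]) > 0 else 0
            err = target - pred
            w[0] += lr * err
            w[1:] += lr * err * xi
    return w

# Toy features: (mean brightness, edge contrast) -> 1 = marking, 0 = background
X = np.array([[0.9, 0.8], [0.8, 0.7], [0.2, 0.1], [0.3, 0.2]])
y = np.array([1, 1, 0, 0])
print("weights:", train_perceptron(X, y))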
General Mission Analysis Tool (GMAT): Mission, Vision, and Business Case
NASA Technical Reports Server (NTRS)
Hughes, Steven P.
2007-01-01
The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system: free for anyone to use in development of new mission concepts or to improve current missions, and freely available in source code form for enhancement or future technology development.
Improving automation standards via semantic modelling: Application to ISA88.
Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès
2017-03-01
Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Object-oriented programming for the biosciences.
Wiechert, W; Joksch, B; Wittig, R; Hartbrich, A; Höner, T; Möllney, M
1995-10-01
The development of software systems for the biosciences is always closely connected to experimental practice. Programs must be able to handle the inherent complexity and heterogeneous structure of biological systems in combination with the measuring equipment. Moreover, a high degree of flexibility is required to treat rapidly changing experimental conditions. Object-oriented methodology seems to be well suited for this purpose. It enables an evolutionary approach to software development that still maintains a high degree of modularity. This paper presents experience with object-oriented technology gathered during several years of programming in the fields of bioprocess development and metabolic engineering. It concentrates on the aspects of experimental support, data analysis, interaction and visualization. Several examples are presented and discussed in the general context of the experimental cycle of knowledge acquisition, thus pointing out the benefits and problems of object-oriented technology in the specific application field of the biosciences. Finally, some strategies for future development are described.
Building confidence and credibility amid growing model and computing complexity
NASA Astrophysics Data System (ADS)
Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.
2017-12-01
As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet remain difficult to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a Python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
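EVE's actual statistical procedure is not detailed in the abstract; as one plausible illustration of testing the equivalence of a climate statistic between a baseline ensemble and a rebuilt-model ensemble, a two-sample Kolmogorov-Smirnov test could be applied as sketched below (the data are synthetic).

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
baseline = rng.normal(288.15, 0.05, size=30)   # e.g., global-mean temperature per member
candidate = rng.normal(288.15, 0.05, size=30)  # same model built with new compiler flags

stat, p_value = ks_2samp(baseline, candidate)
print(f"KS statistic={stat:.3f}, p={p_value:.3f}",
      "-> statistically indistinguishable" if p_value > 0.05 else "-> investigate")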
Developing Software for NASA Missions in the New Millennia
NASA Technical Reports Server (NTRS)
Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike
2004-01-01
NASA is working on new mission concepts for exploration of the solar system. The concepts for these missions include swarms of hundreds of cooperating intelligent spacecraft which will be able to work in teams and gather more data than current single spacecraft missions. These spacecraft will not only have to operate independently for long periods of time on their own and in teams, but will also need to have autonomic properties of self-healing, self-configuring, self-optimizing and self-protecting for them to survive in the harsh space environment. Software for these types of missions has never been developed before and represents some of the challenges of software development in the new millennia. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm missions NASA is considering. The ANTS mission will use a swarm of one thousand pico-spacecraft that weigh less than five pounds. Using an insect colony analog, ANTS will explore the asteroid belt and catalog the mass, density, morphology, and chemical composition of the asteroids. Due to the size of the spacecraft, each will only carry a single miniaturized science instrument, which will require them to cooperate in searching for asteroids that are of scientific interest. This article also discusses the ANTS mission, the properties the spacecraft will need, and how that will affect future software development.
Inside a VAMDC data node—putting standards into practical software
NASA Astrophysics Data System (ADS)
Regandell, Samuel; Marquart, Thomas; Piskunov, Nikolai
2018-03-01
Access to molecular and atomic data is critical for many forms of remote sensing analysis across different fields. Many atomic and molecular databases are however highly specialised for their intended application, complicating querying and combining data between sources. The Virtual Atomic and Molecular Data Centre, VAMDC, is an electronic infrastructure that allows each database to register as a ‘node’. Through services such as VAMDC’s portal website, users can then access and query all nodes in a homogenised way. Today, all major atomic and molecular databases are attached to VAMDC. This article describes the software tools we developed to help data providers create and manage a VAMDC node. It gives an overview of the VAMDC infrastructure and of the various standards it uses. The article then discusses the development choices made and how the standards are implemented in practice. It concludes with a full example of implementing a VAMDC node using a real-life case as well as future plans for the node software.
Spaceport Command and Control System Software Development
NASA Technical Reports Server (NTRS)
Kessluk, Jonathan
2017-01-01
During the course of this internship, software was developed to ensure the efficient management and allocation of an employee to an area of their expertise or experience that may otherwise have gone unnoticed. This directly affects any future missions prescribed to the SLS by allowing management to easily see where employees can be re-allocated to different mission specific projects or assist a project that may be lacking in a specific field. This software is intended to provide proof of NASA's diligence and deliberation in hiring new employees and in providing training and guidance to employees who may have fallen short of expectations. Allowing management to more easily statistically track and monitor the supply and demand of employees with specific experience will help introduce a beneficial culture where employees are given the ability to grow and hone skills which might otherwise atrophy over time. With this new system in place, NASA can prove the employees they hire and already have are exemplary and will remain exemplary to serve the nation as a whole.
2009-04-23
How Evolving Trends in Systems and Software Technologies Bode Well for Advancing the Precision of Software Engineering (briefing; only fragments recovered). Recoverable points: the size of software-intensive systems, measured in lines of code, is increasing, and the need for increased functionality will be a forcing function to bring the fields of software and systems engineering into continued partnership.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutti, V; Morrow, A; Kim, S
Purpose: Stereotactic radiosurgery (SRS) treatments using conical collimators can potentially result in gantry collision with the treatment table due to limited collision-clear spaces. In-house software was developed to help the SRS treatment planner mitigate potential SRS conical collimator (Varian Medical System, Palo Alto, CA) collisions with the treatment table. This software was designed to remove treatment re-planning secondary to unexpected collisions. Methods: A BrainLAB SRS ICT Frameless Extension used for SRS treatments in our clinic was mathematically modelled using surface points registered to the 3D co-ordinate space of the couch extension. The surface points are transformed based on the treatment isocenter point, and potential collisions are determined in 3D space for couch and gantry angle combinations. The distance between the SRS conical collimators and the LINAC isocenter is known. The collision detection model was programmed in MATLAB (Mathworks, Natick, MA) to display graphical plots of the calculations, and the plotted data is used to avoid the gantry and couch angle combinations that would likely result in a collision. We have utilized the cone collision tool for 23 SRS cone treatment plans (8 retrospective and 15 prospective, for 10 patients). Results: Twenty-one plans strongly agreed with the software tool prediction for collision. However, in two plans, a collision was observed with a 0.5 cm margin when the software predicted no collision. Therefore, additional margins were added to the clearance criteria in the program to achieve a lower risk of actual collisions. Conclusion: Our in-house developed collision check software successfully avoided SRS cone re-planning in 91.3% of cases due to a reduction in cone collisions with the treatment table. Future developments to our software will include a CT image data set based collision prediction model as well as a beam angle optimization tool to avoid normal critical tissues as well as previously treated lesions.
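A highly simplified geometric sketch of the collision test is shown below: couch-top points are rotated by the couch angle, the cone face is placed by the gantry angle, and the minimum clearance is checked against a required distance. The coordinate conventions, distances, and points are hypothetical; the clinical tool models the BrainLAB couch extension surface in detail and validates its margins against measurement.

import numpy as np

def rot_about_z(deg):
    r = np.radians(deg); c, s = np.cos(r), np.sin(r)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_about_y(deg):
    r = np.radians(deg); c, s = np.cos(r), np.sin(r)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def collision_free(couch_points_cm, couch_deg, gantry_deg,
                   cone_face_dist_cm=35.0, clearance_cm=40.0):
    """True if every rotated couch point keeps the required clearance from the cone face (point model)."""
    pts = couch_points_cm @ rot_about_z(couch_deg).T            # couch kick about the vertical axis
    cone_face = rot_about_y(gantry_deg) @ np.array([0.0, 0.0, cone_face_dist_cm])
    return bool(np.all(np.linalg.norm(pts - cone_face, axis=1) > clearance_cm))

# Hypothetical couch-top corner points in cm, relative to the isocenter.
couch = np.array([[-25.0, 10.0, -5.0], [25.0, 10.0, -5.0],
                  [-25.0, 90.0, -5.0], [25.0, 90.0, -5.0]])
for g in (0, 90, 180, 270):
    print(g, collision_free(couch, couch_deg=45.0, gantry_deg=g))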
Unified Geophysical Cloud Platform (UGCP) for Seismic Monitoring and other Geophysical Applications.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Starovoit, Y. O.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.
2016-12-01
We present the Unified Geophysical Cloud Platform (UGCP), or UniGeoCloud, an innovative approach to geophysical data processing in the cloud with the ability to run any type of data-processing software in an isolated environment within a single cloud platform. We have developed a simple and quick method for installing several widely known open-source seismic software packages (SeisComp3, Earthworm, Geotool, MSNoise) that does not require knowledge of system administration, configuration, or OS compatibility issues, and avoids other often annoying details that waste time on system configuration work. The installation process is simplified to a mouse click on the selected software package from the cloud marketplace. The main objective of the developed capability was a software tool concept with which users are able to quickly design and install their own highly reliable and highly available virtual IT infrastructure for the organization of seismic (and, in the future, other geophysical) data processing for either research or monitoring purposes. These tools provide access to any seismic station data available in open IP configuration from the different networks affiliated with different institutions and organizations. They also allow users to set up their own network by selecting either regionally deployed stations or a worldwide global network based on station selection from the global map. The processing software, products, and research results can be easily monitored from anywhere using a variety of user devices, from desktop computers to mobile gadgets. Current efforts of the development team are directed at achieving Scalability, Reliability and Sustainability (SRS) of the proposed solutions, allowing any user to run their applications with confidence of no data loss and no failure of the monitoring or research software components. The system is suitable for quick rollout of the NDC-in-Box software package developed for State Signatories and aimed at promoting the processing of data collected by the IMS Network.
NASA Technical Reports Server (NTRS)
Siamidis, John; Yuko, Jim
2014-01-01
The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It comprises three different SDRs: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, SDR Space Telecommunications Radio System (STRS)-based facility to conduct a suite of experiments to advance the Software Defined Radio and Space Telecommunications Radio System (STRS) standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate future Constellation space flight hardware and software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC located on the inboard RAM P3 site. The daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).
The Instrumental Genesis Process in Future Primary Teachers Using Dynamic Geometry Software
ERIC Educational Resources Information Center
Ruiz-López, Natalia
2018-01-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis…
Telescience Resource Kit (TReK)
NASA Technical Reports Server (NTRS)
Lippincott, Jeff
2015-01-01
Telescience Resource Kit (TReK) is one of the Huntsville Operations Support Center (HOSC) remote operations solutions. It can be used to monitor and control International Space Station (ISS) payloads from anywhere in the world. It comprises a suite of software applications and libraries that provide generic data system capabilities and access to HOSC services. The TReK software has been operational since 2000. A new cross-platform version of TReK is under development. The new software is being released in phases during the 2014-2016 timeframe. The TReK Release 3.x series of software is the original TReK software that has been operational since 2000. This software runs on Windows. It contains capabilities to support traditional telemetry and commanding using CCSDS (Consultative Committee for Space Data Systems) packets. The TReK Release 4.x series of software is the new cross-platform software. It runs on Windows and Linux. The new TReK software will support communication using standard IP protocols and traditional telemetry and commanding. All the software listed above is compatible and can be installed and run together on Windows. The new TReK software contains a suite of software that can be used by payload developers on the ground and onboard (TReK Toolkit). TReK Toolkit is a suite of lightweight libraries and utility applications for use onboard and on the ground. TReK Desktop is the full suite of TReK software, most useful on the ground. When TReK Desktop is released, the TReK installation program will provide the option to choose just the TReK Toolkit portion of the software or the full TReK Desktop suite. The ISS program is providing the TReK Toolkit software as a generic flight software capability offered as a standard service to payloads. TReK Software Verification was conducted during the April/May 2015 timeframe. Payload teams using the TReK software onboard can reference the TReK software verification. TReK will be demonstrated on-orbit running on an ISS-provided T61p laptop. Target Timeframe: September 2015-2016. The on-orbit demonstration will collect benchmark metrics, and will be used in the future to provide live demonstrations during ISS Payload Conferences. Benchmark metrics and demonstrations will address the protocols described in SSP 52050-0047 Ku Forward section 3.3.7. (Associated term: CCSDS File Delivery Protocol (CFDP)).
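As background to the CCSDS-based telemetry and commanding that TReK supports, the sketch below decodes the standard 6-byte CCSDS space packet primary header. It is a generic, hedged illustration of the packet format, not TReK code, and the example packet bytes are invented.

```python
import struct

def parse_ccsds_primary_header(packet: bytes) -> dict:
    """Decode the 6-byte CCSDS space packet primary header."""
    if len(packet) < 6:
        raise ValueError("packet shorter than a CCSDS primary header")
    word0, word1, length = struct.unpack(">HHH", packet[:6])
    return {
        "version": (word0 >> 13) & 0x7,
        "type": (word0 >> 12) & 0x1,      # 0 = telemetry, 1 = command
        "sec_hdr_flag": (word0 >> 11) & 0x1,
        "apid": word0 & 0x7FF,
        "seq_flags": (word1 >> 14) & 0x3,
        "seq_count": word1 & 0x3FFF,
        "data_length": length,            # (packet data field length) - 1
    }

# Example: a made-up telemetry packet header (APID 0x042, sequence count 7)
hdr = parse_ccsds_primary_header(bytes.fromhex("0842c007000a") + b"\x00" * 11)
print(hdr)
```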
Open source and healthcare in Europe - time to put leading edge ideas into practice.
Murray, Peter J; Wright, Graham; Karopka, Thomas; Betts, Helen; Orel, Andrej
2009-01-01
Free/Libre and Open Source Software (FLOSS) is a process of software development, a method of licensing and a philosophy. Although FLOSS plays a significant role in several market areas, the impact in the health care arena is still limited. FLOSS is promoted as one of the most effective means for overcoming fragmentation in the health care sector and providing a basis for more efficient, timely and cost effective health care provision. The 2008 European Federation for Medical Informatics (EFMI) Special Topic Conference (STC) explored a range of current and future issues related to FLOSS in healthcare (FLOSS-HC). In particular, there was a focus on health records, ubiquitous computing, knowledge sharing, and current and future applications. Discussions resulted in a list of main barriers and challenges for use of FLOSS-HC. Based on the outputs of this event, the 2004 Open Steps events and subsequent workshops at OSEHC2009 and Med-e-Tel 2009, a four-step strategy has been proposed for FLOSS-HC: 1) a FLOSS-HC inventory; 2) a FLOSS-HC collaboration platform, use case database and knowledge base; 3) a worldwide FLOSS-HC network; and 4) FLOSS-HC dissemination activities. The workshop will further refine this strategy and elaborate avenues for FLOSS-HC from scientific, business and end-user perspectives. To gain acceptance by different stakeholders in the health care industry, different activities have to be conducted in collaboration. The workshop will focus on the scientific challenges in developing methodologies and criteria to support FLOSS-HC in becoming a viable alternative to commercial and proprietary software development and deployment.
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ~10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) to ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, and (2) to assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed.
Detection of possible restriction sites for type II restriction enzymes in DNA sequences.
Gagniuc, P; Cimponeriu, D; Ionescu-Tîrgovişte, C; Mihai, Andrada; Stavarachi, Monica; Mihai, T; Gavrilă, L
2011-01-01
In order to make a step forward in the knowledge of the mechanisms operating in complex polygenic disorders such as diabetes and obesity, this paper proposes a new algorithm (PRSD - possible restriction site detection) and its implementation in the Applied Genetics software. This software can be used for in silico detection of potential (hidden) recognition sites for endonucleases and for the identification of nucleotide repeats. The recognition sites for endonucleases may result from hidden sequences through deletion or insertion of a specific number of nucleotides. Tests were conducted on DNA sequences downloaded from NCBI servers using specific recognition sites for common type II restriction enzymes introduced in the software database (n = 126). Each possible recognition site indicated by the PRSD algorithm implemented in Applied Genetics was checked and confirmed by the NEBcutter V2.0 and Webcutter 2.0 software. In the sequence NG_008724.1 (which includes 63632 nucleotides) we found a high number of potential restriction sites for EcoRI that may be produced by deletion (n = 43 sites) or insertion (n = 591 sites) of one nucleotide. The second module of Applied Genetics has been designed to find simple repeat sizes, which may prove important in understanding the role of SNPs (Single Nucleotide Polymorphisms) in the pathogenesis of complex metabolic disorders. We tested for the presence of simple repetitive sequences in five DNA sequences. The software indicated the exact position of each repeat detected in the tested sequences. Future development of Applied Genetics can provide an alternative to the powerful tools used to search for restriction sites or repetitive sequences, or to improve genotyping methods.
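The core idea of a "possible" restriction site (a locus that becomes an exact recognition site after deletion or insertion of a single nucleotide) can be illustrated with the short sketch below. It is a hedged re-statement of the concept using EcoRI as an example, not the published PRSD implementation, and the test sequence is invented.

```python
# Minimal sketch of the "possible restriction site" idea: report positions
# where deleting or inserting one nucleotide would create an exact
# recognition site (EcoRI, GAATTC, is used as an example). This illustrates
# the concept only; it is not the published PRSD algorithm.

SITE = "GAATTC"  # EcoRI recognition sequence
BASES = "ACGT"

def sites_after_one_deletion(seq: str, site: str = SITE):
    """Positions where removing one nucleotide yields the recognition site."""
    hits = []
    window = len(site) + 1
    for i in range(len(seq) - window + 1):
        chunk = seq[i:i + window]
        for j in range(window):
            if chunk[:j] + chunk[j + 1:] == site:
                hits.append((i, j))
                break
    return hits

def sites_after_one_insertion(seq: str, site: str = SITE):
    """Positions where inserting one nucleotide yields the recognition site."""
    hits = []
    window = len(site) - 1
    for i in range(len(seq) - window + 1):
        chunk = seq[i:i + window]
        for j in range(window + 1):
            for base in BASES:
                if chunk[:j] + base + chunk[j:] == site:
                    hits.append((i, j, base))
    return hits

dna = "TTGAATTTCAAGAATCCGG"
print(sites_after_one_deletion(dna))   # GAATTTC -> delete one T -> GAATTC
print(sites_after_one_insertion(dna))  # GAATC   -> insert T    -> GAATTC
```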
Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification
NASA Technical Reports Server (NTRS)
Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand;
2016-01-01
The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research: improving navigation systems for small satellites, providing useful data for understanding magnetosphere-ionosphere coupling and space weather, and verifying the performance and durability of III-V Nitride-based materials.
The Core Flight System (cFS) Community: Providing Low Cost Solutions for Small Spacecraft
NASA Technical Reports Server (NTRS)
McComas, David; Wilmot, Jonathan; Cudmore, Alan
2016-01-01
In February 2015 the NASA Goddard Space Flight Center (GSFC) completed the open source release of the entire Core Flight Software (cFS) suite. After the open source release a multi-NASA center Configuration Control Board (CCB) was established that has managed multiple cFS product releases. The cFS was developed and is being maintained in compliance with the NASA Class B software development process requirements and the open source release includes all Class B artifacts. The cFS is currently running on three operational science spacecraft and is being used on multiple spacecraft and instrument development efforts. While the cFS itself is a viable flight software (FSW) solution, we have discovered that the cFS community is a continuous source of innovation and growth that provides products and tools that serve the entire FSW lifecycle and future mission needs. This paper summarizes the current state of the cFS community, the key FSW technologies being pursued, the development/verification tools and opportunities for the small satellite community to become engaged. The cFS is a proven high quality and cost-effective solution for small satellites with constrained budgets.
Microcomputer Software for Libraries: A Survey.
ERIC Educational Resources Information Center
Nolan, Jeanne M.
1983-01-01
Reports on findings of research done by Nolan Information Management Services concerning availability of microcomputer software for libraries. Highlights include software categories (specific, generic-database management programs, original); number of programs available in 1982 for 12 applications; projections for 1983; and future software…
Drawert, Brian; Engblom, Stefan; Hellander, Andreas
2012-06-22
Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the reaction-diffusion master equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
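For readers unfamiliar with stochastic reaction kinetics, the sketch below runs a textbook Gillespie (SSA) simulation of a single reversible reaction in a well-mixed volume. It is only a conceptual illustration; URDME itself solves the spatial reaction-diffusion master equation on unstructured meshes and is driven from Matlab/Comsol, and the rate constants here are arbitrary.

```python
# Minimal Gillespie (SSA) sketch for a reversible dimerisation A + A <-> B.
# This well-mixed example only illustrates stochastic reaction kinetics;
# URDME itself solves the spatial RDME on unstructured meshes. Rate
# constants are arbitrary.
import random
import math

def gillespie(a0=100, b0=0, k_fwd=0.005, k_rev=0.1, t_end=50.0, seed=1):
    random.seed(seed)
    t, a, b = 0.0, a0, b0
    trajectory = [(t, a, b)]
    while t < t_end:
        r_fwd = k_fwd * a * (a - 1) / 2.0   # propensity of A + A -> B
        r_rev = k_rev * b                   # propensity of B -> A + A
        total = r_fwd + r_rev
        if total == 0.0:
            break
        t += -math.log(random.random()) / total   # exponential waiting time
        if random.random() * total < r_fwd:
            a, b = a - 2, b + 1
        else:
            a, b = a + 2, b - 1
        trajectory.append((t, a, b))
    return trajectory

for t, a, b in gillespie()[-3:]:
    print(f"t={t:6.2f}  A={a:3d}  B={b:3d}")
```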
Software reuse issues affecting AdaNET
NASA Technical Reports Server (NTRS)
Mcbride, John G.
1989-01-01
The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues of providing AdaNet services that should be addressed as part of future AdaNet development are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to develop AdaNet. A three-phase strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, and administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based. This would significantly reduce the risk and cost of providing an AdaNet system. Once a design was developed, implementation would proceed in the third phase.
Constructing spherical panoramas of a bladder phantom from endoscopic video using bundle adjustment
NASA Astrophysics Data System (ADS)
Soper, Timothy D.; Chandler, John E.; Porter, Michael P.; Seibel, Eric J.
2011-03-01
The high recurrence rate of bladder cancer requires patients to undergo frequent surveillance screenings over their lifetime following initial diagnosis and resection. Our laboratory is developing panoramic stitching software that would compile several minutes of cystoscopic video into a single panoramic image, covering the entire bladder, for review by a urologist at a later time or remote location. Global alignment of video frames is achieved by using a bundle adjuster that simultaneously recovers both the 3D structure of the bladder as well as the scope motion using only the video frames as input. The result of the algorithm is a complete 360° spherical panorama of the outer surface. The details of the software algorithms are presented here along with results from both a virtual cystoscopy as well as from real endoscopic imaging of a bladder phantom. The software successfully stitched several hundred video frames into a single panorama with subpixel accuracy and with no knowledge of the intrinsic camera properties, such as focal length and radial distortion. In the discussion, we outline future work on the development of the software as well as identifying factors pertinent to clinical translation of this technology.
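Bundle adjustment, as used above, jointly refines the 3D structure and the camera motion by minimizing reprojection error across all frames. The toy sketch below shows that least-squares step with a deliberately simplified camera model (known focal length, translation only, no distortion); it is a hedged illustration, not the laboratory's stitching software.

```python
# Toy bundle-adjustment step: jointly refine 3D points and per-frame camera
# translations by minimizing reprojection error. A deliberately simplified
# pinhole model (known focal length, no rotation or distortion) keeps the
# sketch short; the real cystoscopic problem also recovers rotations and
# intrinsics.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
n_pts, n_cams, f = 20, 3, 500.0                      # points, frames, focal length

true_pts = rng.uniform([-1, -1, 4], [1, 1, 6], (n_pts, 3))
true_trans = rng.uniform(-0.2, 0.2, (n_cams, 3))

def project(pts, trans):
    p = pts[None, :, :] + trans[:, None, :]          # shape (cams, pts, 3)
    return f * p[..., :2] / p[..., 2:3]              # pinhole projection

observations = project(true_pts, true_trans) + rng.normal(0, 0.5, (n_cams, n_pts, 2))

def residuals(x):
    pts = x[: 3 * n_pts].reshape(n_pts, 3)
    trans = x[3 * n_pts:].reshape(n_cams, 3)
    return (project(pts, trans) - observations).ravel()

x0 = np.concatenate([(true_pts + rng.normal(0, 0.1, true_pts.shape)).ravel(),
                     np.zeros(3 * n_cams)])
result = least_squares(residuals, x0)                # the bundle-adjustment solve
print("final RMS reprojection error:",
      np.sqrt(np.mean(result.fun ** 2)), "pixels")
```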
Ship electric propulsion simulator based on networking technology
NASA Astrophysics Data System (ADS)
Zheng, Huayao; Huang, Xuewu; Chen, Jutao; Lu, Binquan
2006-11-01
In response to current shipbuilding trends, a novel electric propulsion simulator (EPS) has been developed at the Marine Simulation Center of SMU. The architecture, software functions, and FCS network technology of the EPS and the integrated power system (IPS) are described. A dedicated physical model was built for the ship's POD propeller. The POD is powered from the simulated 6.6 kV medium-voltage main switchboard, and its control can be exercised in local or remote mode. Through the LAN, simulated state information from the EPS is passed to the physical POD model, which reflects the real working status of the thruster in different sea conditions. The software includes a vessel-propeller mathematical module, the thruster control system, distribution and emergency integrated management, a double closed-loop control system, vessel static-water resistance and dynamics software, and the instructor's main control software. Monitoring and control are realized through a real-time data collection system and CAN bus technology. During construction, most devices, such as monitor panels and intelligent meters, were developed in the laboratory based on embedded microcomputer systems with CAN interfaces to link to the network. They have also been used successfully in practice and are suitable for the future demands of ship digitalization.
Development of Lidar Sensor Systems for Autonomous Safe Landing on Planetary Bodies
NASA Technical Reports Server (NTRS)
Amzajerdian, Farzin; Pierottet, Diego F.; Petway, Larry B.; Vanek, Michael D.
2010-01-01
Lidar has been identified by NASA as a key technology for enabling autonomous safe landing of future robotic and crewed lunar landing vehicles. NASA LaRC has been developing three laser/lidar sensor systems under the ALHAT project. The capabilities of these Lidar sensor systems were evaluated through a series of static tests using a calibrated target and through dynamic tests aboard helicopters and a fixed wing aircraft. The airborne tests were performed over Moon-like terrain in the California and Nevada deserts. These tests provided the necessary data for the development of signal processing software, and algorithms for hazard detection and navigation. The tests helped identify technology areas needing improvement and will also help guide future technology advancement activities.
NASA Technical Reports Server (NTRS)
Shalkhauser, Mary Jo W.; Roche, Rigoberto
2017-01-01
The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low-cost, STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform, the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code, and the STRS implementation on the Axiomtek processor.
NASA Technical Reports Server (NTRS)
Zhang, Zhong
1997-01-01
The development of large-scale, composite software in a geographically distributed environment is an evolutionary process. In such evolving systems, striving for consistency is complicated by many factors: development participants have different locations, skills, responsibilities, roles, opinions, languages, terminology, and degrees of abstraction. This naturally leads to many partial specifications, or viewpoints. These multiple views on the system being developed usually overlap and, as a result, give rise to the potential for inconsistency. Existing CASE tools do not efficiently manage inconsistencies in a distributed development environment for a large-scale project. Based on the ViewPoints framework, the WHERE (Web-Based Hypertext Environment for Requirements Evolution) toolkit aims to tackle inconsistency management issues within geographically distributed software development projects. Consequently, the WHERE project helps produce more robust software and supports the software assurance process. The long-term goal of the WHERE tools is inconsistency analysis and management in requirements specifications. A framework based on graph grammar theory and the TCMJAVA toolkit is proposed to detect inconsistencies among viewpoints. This systematic approach uses three basic operations (UNION, DIFFERENCE, INTERSECTION) to study the static behavior of graphic and tabular notations. From these operations, subgraph Query, Selection, Merge, and Replacement operations can be derived. The approach uses graph PRODUCTIONS (rewriting rules) to study dynamic transformations of graphs, and we discuss the feasibility of implementing these operations. We also present the process of porting the original TCM (Toolkit for Conceptual Modeling) project from C++ to the Java programming language. A scenario based on the NASA International Space Station specification is discussed to show the applicability of our approach. Finally, conclusions and future work on inconsistency management issues in the WHERE project are summarized.
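The three basic operations the approach builds on (UNION, DIFFERENCE, INTERSECTION over viewpoint graphs) can be illustrated by reducing each viewpoint's diagram to a set of labelled edges, after which overlap and candidate inconsistencies fall out of plain set algebra. The sketch below is an illustration of the idea with invented data, not the WHERE/TCMJAVA implementation.

```python
# Hedged illustration of comparing two requirement "viewpoints" modelled as
# sets of labelled edges (source, relation, target). Not the WHERE/TCMJAVA
# code; just plain set algebra over toy data.
viewpoint_a = {
    ("Crew", "commands", "RobotArm"),
    ("RobotArm", "reports_to", "Crew"),
    ("GroundControl", "monitors", "RobotArm"),
}
viewpoint_b = {
    ("Crew", "commands", "RobotArm"),
    ("GroundControl", "commands", "RobotArm"),   # potential conflict with A
}

union = viewpoint_a | viewpoint_b          # everything stated by either view
intersection = viewpoint_a & viewpoint_b   # statements both views agree on
difference = viewpoint_a - viewpoint_b     # statements only view A makes

# A crude inconsistency query: the same (source, target) pair related by
# different relations in the two viewpoints.
conflicts = {
    (s1, r1, r2, t1)
    for (s1, r1, t1) in viewpoint_a
    for (s2, r2, t2) in viewpoint_b
    if s1 == s2 and t1 == t2 and r1 != r2
}
print("shared:", intersection)
print("possible conflicts:", conflicts)
```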
Software Assurance Challenges for the Commercial Crew Program
NASA Technical Reports Server (NTRS)
Cuyno, Patrick; Malnick, Kathy D.; Schaeffer, Chad E.
2015-01-01
This paper will provide a description of some of the challenges NASA is facing in providing software assurance within the new commercial space services paradigm, namely with the Commercial Crew Program (CCP). The CCP will establish safe, reliable, and affordable access to the International Space Station (ISS) by purchasing a ride from commercial companies. The CCP providers have varying experience with software development in safety-critical space systems. NASA's role in providing effective software assurance support to the CCP providers is critical to the success of CCP. These challenges include funding multiple vehicles that execute in parallel and have different rules of engagement, multiple providers with unique proprietary concerns, providing equivalent guidance to all providers, permitting alternates to NASA standards, and a large number of diverse stakeholders. It is expected that these challenges will exist in future programs, especially if the CCP paradigm proves successful. The proposed CCP approach to address these challenges includes a risk-based assessment with varying degrees of engagement and a distributed assurance model. This presentation will describe NASA IV&V Program's software assurance support and responses to these challenges.
Space Shuttle GN and C Development History and Evolution
NASA Technical Reports Server (NTRS)
Zimpfer, Douglas; Hattis, Phil; Ruppert, John; Gavert, Don
2011-01-01
Completion of the final Space Shuttle flight marks the end of a significant era in Human Spaceflight. Developed in the 1970s and first launched in 1981, the Space Shuttle embodies many significant engineering achievements. One of these is the development and operation of the first extensive fly-by-wire human space transportation Guidance, Navigation and Control (GN&C) System. Development of the Space Shuttle GN&C represented the first-time inclusion of modern techniques for electronics, software, algorithms, systems and management in a complex system. Numerous technical design trades and lessons learned continue to drive current vehicle development. For example, the Space Shuttle GN&C system incorporated redundant systems, complex algorithms and flight software rigorously verified through integrated vehicle simulations and avionics integration testing techniques. Over the past thirty years, the Shuttle GN&C continued to go through a series of upgrades to improve safety and performance and to enable the complex flight operations required for assembly of the International Space Station. Upgrades to the GN&C ranged from the addition of nose wheel steering to modifications that extend control capabilities to the large flexible configurations flown while docked to the Space Station. This paper provides a history of the development and evolution of the Space Shuttle GN&C system. Emphasis is placed on key architecture decisions, design trades and the lessons learned for future complex space transportation system developments. Finally, some of the interesting flight operations experience is provided to inform future developers of flight experiences.
Airframe Noise Studies: Review and Future Direction
NASA Technical Reports Server (NTRS)
Rackl, Robert G.; Miller, Gregory; Guo, Yueping; Yamamoto, Kingo
2005-01-01
This report contains the following information: 1) a review of airframe noise research performed under NASA's Advanced Subsonic Transport (AST) program up to the year 2000, 2) a comparison of the year 1992 airframe noise predictions with those using a year 2000 baseline, 3) an assessment of various airframe noise reduction concepts as applied to the year 2000 baseline predictions, and 4) prioritized recommendations for future airframe noise reduction work. NASA's Aircraft Noise Prediction Program was the software used for all noise predictions and assessments. For future work, the recommendations for the immediate future focus on the development of design tools sensitive to airframe noise treatment effects and on improving the basic understanding of noise generation by the landing gear as well as on its reduction.
A new Scheme for ATLAS Trigger Simulation using Legacy Code
NASA Astrophysics Data System (ADS)
Galster, Gorm; Stelzer, Joerg; Wiedenmann, Werner
2014-06-01
Analyses at the LHC which search for rare physics processes or determine with high precision Standard Model parameters require accurate simulations of the detector response and the event selection processes. The accurate determination of the trigger response is crucial for the determination of overall selection efficiencies and signal sensitivities. For the generation and the reconstruction of simulated event data, the most recent software releases are usually used to ensure the best agreement between simulated data and real data. For the simulation of the trigger selection process, however, ideally the same software release that was deployed when the real data were taken should be used. This potentially requires running software dating many years back. Having a strategy for running old software in a modern environment thus becomes essential when data simulated for past years start to represent a sizable fraction of the total. We examined the requirements and possibilities for such a simulation scheme within the ATLAS software framework and successfully implemented a proof-of-concept simulation chain. One of the greatest challenges was the choice of a data format which promises long-term compatibility with old and new software releases. Over the time periods envisaged, data format incompatibilities are also likely to emerge in databases and other external support services. Software availability may also become an issue, for example when support for the underlying operating system stops. In this paper we present the encountered problems and developed solutions, and discuss proposals for future development. Some ideas reach beyond the retrospective trigger simulation scheme in ATLAS, as they also touch on more general aspects of data preservation.
ERIC Educational Resources Information Center
Lin, Yu-Wei; Zini, Enrico
2008-01-01
This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to support the needs of local users better. This also allows users to participate more proactively in the development…
Risk Mitigation for the Development of the New Ariane 5 On-Board Computer
NASA Astrophysics Data System (ADS)
Stransky, Arnaud; Chevalier, Laurent; Dubuc, Francois; Conde-Reis, Alain; Ledoux, Alain; Miramont, Philippe; Johansson, Leif
2010-08-01
In the frame of Ariane 5 production, some equipment will become obsolete and needs to be redesigned and redeveloped. This is the case for the On-Board Computer, which has to be completely redesigned and re-qualified by RUAG Space, as well as all its on-board software and associated development tools by ASTRIUM ST. This paper presents this obsolescence treatment, which started in 2007 under an ESA contract, in the frame of the ACEP and ARTA accompaniment programmes, and is very critical in technical terms but also from a schedule point of view: it gives the context and overall development plan, and details the risk mitigation actions agreed with ESA, especially those related to the development of the input/output ASIC, and also the on-board software porting and revalidation strategy. The efficiency of these risk mitigation actions has been borne out by the schedule achieved; this development constitutes an up-to-date case study in good practices, including experience reports and feedback for other future developments.
Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.
Handels, H; Ehrhardt, J
2009-01-01
Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients and will gain importance in diagnostics and therapy of the future. From a methodical point of view the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
Acoustic Emission Analysis Applet (AEAA) Software
NASA Technical Reports Server (NTRS)
Nichols, Charles T.; Roth, Don J.
2013-01-01
NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.
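To make the "statistics as a function of time and pressure" idea concrete, the sketch below bins a toy set of acoustic-emission hits by pressure and reports a hit count and mean amplitude per bin. The data layout and bin width are invented for illustration; AEAA itself reads Digital Wave and Physical Acoustics data files.

```python
# Hedged sketch: basic AE statistics binned by pressure. The (time, pressure,
# amplitude) tuples and the bin width are invented for illustration; AEAA
# itself reads Digital Wave / Physical Acoustics data files.
from collections import defaultdict

hits = [  # (time [s], pressure [MPa], amplitude [dB])
    (1.2, 10.5, 42.0), (3.8, 12.1, 55.5), (4.0, 12.3, 61.0),
    (7.5, 15.0, 48.2), (9.1, 15.4, 72.3), (9.4, 15.6, 69.9),
]

bin_width = 2.0  # MPa
by_bin = defaultdict(list)
for t, p, amp in hits:
    by_bin[int(p // bin_width) * bin_width].append(amp)

for lo in sorted(by_bin):
    amps = by_bin[lo]
    print(f"{lo:4.1f}-{lo + bin_width:4.1f} MPa: "
          f"{len(amps)} hits, mean amplitude {sum(amps)/len(amps):.1f} dB")
```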
Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence
2007-11-01
The Cancer Bioinformatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures that will be used to capture measurement information and associated data needed to allow a gold standard to be defined for the given database against which change analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research.
ERIC Educational Resources Information Center
Shearn, Joseph
1987-01-01
Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…
Editorial: Challenges and solutions in GW calculations for complex systems
NASA Astrophysics Data System (ADS)
Giustino, F.; Umari, P.; Rubio, A.
2012-09-01
We report key advances in the area of GW calculations, review the available software implementations and define standardization criteria to render the comparison between GW calculations from different codes meaningful, and identify future major challenges in the area of quasiparticle calculations. This Topical Issue should be a reference point for further developments in the field.
Integrated System Health Management Development Toolkit
NASA Technical Reports Server (NTRS)
Figueroa, Jorge; Smith, Harvey; Morris, Jon
2009-01-01
This software toolkit is designed to model complex systems for the implementation of embedded Integrated System Health Management (ISHM) capability, which focuses on determining the condition (health) of every element in a complex system (detect anomalies, diagnose causes, and predict future anomalies), and to provide data, information, and knowledge (DIaK) to control systems for safe and effective operation.
An Analysis of Category Management of Service Contracts
2017-12-01
management teams a way to make informed, data-driven decisions. Data-driven decisions derived from clustering not only align with Category... savings. Furthermore, this methodology provides a data-driven visualization to inform sound business decisions on potential Category Management... Category Management initiatives. The Maptitude software will allow future research to collect data and develop visualizations to inform Category
Perspective on intelligent avionics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, H.L.
1987-01-01
Technical issues which could potentially limit the capability and acceptability of expert systems decision-making for avionics applications are addressed. These issues are: real-time AI, mission-critical software, conventional algorithms, pilot interface, knowledge acquisition, and distributed expert systems. Examples from on-going expert system development programs are presented to illustrate likely architectures and applications of future intelligent avionic systems. 13 references.
ERIC Educational Resources Information Center
Snyder, Robin M.
2017-01-01
The author has attended and presented at most ASCUE meetings since 1994, and has worked professionally in research and development, industry, military, government, business, and private and public academia--moving between computer science, software engineering, and business fields at both the undergraduate and graduate level, and even running…
Looking to the Future: Higher Education in the Metaverse
ERIC Educational Resources Information Center
Collins, Chris
2008-01-01
With the advances in computational power, Internet access and speed, and graphical 3D reproductions possible on an "ordinary" home computer, and with the development of new software products that place the ability to create new digital content in the hands of "ordinary" people, everyone is beginning to see in virtual worlds emergent behaviors that…
Copyright Law and Information Policy Planning: Public Rights of Use in the 1990s and Beyond.
ERIC Educational Resources Information Center
Crews, Kenneth D.
1995-01-01
Summarizes recent developments in copyright law, with a focus on their consequences for users in colleges, universities, and libraries. Highlights include the concept of fair use; library reproduction rights; recent court cases and legislation; and future copyright concerns, including fair use of computer software and electronic text, and license…
NASA Technical Reports Server (NTRS)
Lytle, John
2001-01-01
This report provides an overview presentation of the 2000 NPSS (Numerical Propulsion System Simulation) Review and Planning Meeting. Topics include: 1) a background of the program; 2) 1999 Industry Feedback; 3) FY00 Status, including resource distribution and major accomplishments; 4) FY01 Major Milestones; and 5) Future direction for the program. Specifically, simulation environment/production software and NPSS CORBA Security Development are discussed.
Leveraging Open Source Software in the Education Management and Leadership Training
ERIC Educational Resources Information Center
Nordin, Norazah; Ibrahim, Sham; Mohd. Hamzah, Mohd. Izham; Embi, Mohamed Amin; Din, Rosseni
2012-01-01
The development in information technology has now moved from the first wave that emphasises on computer technical skills to the second wave which focuses on the application and management aspects. This paper aims to investigate the use of learning management system among future school heads in education management and leadership. The study was…
Developing a 300C Analog Tool for EGS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Normann, Randy
2015-03-23
This paper covers the development of a 300°C geothermal well monitoring tool for supporting future EGS (enhanced geothermal systems) power production. This is the first of three tools planned. It is an analog tool designed for monitoring well pressure and temperature. Three different circuit topologies are discussed, along with the development of the supporting surface electronics and software. Information is also provided on testing the electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
AS Koontz; S Choudhury; BD Ermold
The purpose of this report is to provide status of the ingest software used to process instrument data for the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ACRF). The report is divided into four sections: (1) news about ingests currently under development, (2) current production ingests, (3) future ingest development plans, and (4) information on retired ingests. Please note that datastreams beginning in “xxx” indicate cases where ingests run at multiple ACRF sites, which results in a datastream(s) for each location.
Portable Planetariums Teach Science
NASA Technical Reports Server (NTRS)
2015-01-01
With the Internet proving to be the wave of the future, in the 1990s Johnson Space Center awarded grants to Rice University in Houston for developing the world's first Internet-accessible museum kiosk. Further grants were awarded to the school for creating educational software for use in homes and schools, leading to the creation of Museums Teaching Planet Earth Inc. The company has gone on to develop and sell portable planetariums and accompanying educational shows.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan
A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
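One step of the conversion the authors describe (reading a quasi-static OpenDSS model and emitting an equivalent record for a phasor-domain solver) can be sketched as a small parser for OpenDSS "New Line" definitions. The output dictionary layout below is invented for illustration and is not the ePHASORSIM input format.

```python
# Hedged sketch of one conversion step: parse OpenDSS "New Line" definitions
# into a neutral dictionary that a downstream exporter could map onto a
# phasor-domain model. The target schema is invented for illustration and is
# not the ePHASORSIM input format.
import re

opendss_text = """
New Line.L1 Bus1=sourcebus Bus2=node1 Length=0.5 Units=km R1=0.19 X1=0.38
New Line.L2 Bus1=node1 Bus2=node2 Length=1.2 Units=km R1=0.19 X1=0.38
"""

def parse_lines(text):
    branches = []
    for match in re.finditer(r"New\s+Line\.(\S+)\s+(.*)", text):
        name = match.group(1)
        params = dict(kv.split("=") for kv in match.group(2).split())
        branches.append({
            "name": name,
            "from_bus": params["Bus1"],
            "to_bus": params["Bus2"],
            "length_km": float(params["Length"]),
            "r1_ohm_per_km": float(params["R1"]),
            "x1_ohm_per_km": float(params["X1"]),
        })
    return branches

for branch in parse_lines(opendss_text):
    print(branch)
```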
Analyzing clinical phonological data using Phon
McAllister Byun, Tara
2016-01-01
In this paper, we describe how Phon, a software program for the transcription and analysis of phonological data, can be applied to facilitate clinical phonological analyses. We begin with a summary of the types of analyses that are frequently used in the assessment and management of speech sound disorders. We then discuss challenges inherent to the transcription and analysis of clinical phonological data. For each challenge, we discuss solutions currently available within Phon, and offer an outlook on future methodological and technical developments in the area of clinical phonology. This paper includes a step-by-step introduction to Phon suitable for readers who lack previous experience with the software. We conclude with a discussion of data sharing and its vital role in advancing research and intervention practices in the area of speech development and disorders. PMID:27111269
IWSSA 2009 PC Co-chairs' Message
NASA Astrophysics Data System (ADS)
Chung, Lawrence; Noguera, Manuel; Subramanian, Nary; Garrido, José Luis
Important changes in society are being predicted for the very near future. In many countries, governments look ahead by increasing reserve funds and budgets for strategically critical areas in order to identify key issues and find effective solutions. Not surprisingly, many institutions are launching research and development programs focused on health-care, elderly people, quality of life, social inclusion, energy, education, ecology, etc. Innovation is required for systems supporting such a new assisted, interactive and collaborative world. System and software designers have to be able to address how to reflect in the same system/software architecture a great amount of (sometimes conflicting) requirements. In particular, user-oriented nonfunctional requirements and developer-oriented non-functional requirements (or design constraints) gain special relevance due to the new environments in which the systems have to operate.
The CEOS International Directory Network: Progress and Plans, Spring, 1999
NASA Technical Reports Server (NTRS)
Olsen, Lola M.
1999-01-01
The Global Change Master Directory (GCMD) serves as the software development hub for the Committee on Earth observation Satellites' (CEOS) International Directory Network (IDN). The GCMD has upgraded the software for the IDN nodes as Version 7 of the GCMD: MD7-Oracle and MD7-Isite, as well as three other MD7 experimental interfaces. The contribution by DLR representatives (Germany) of the DLR Thesaurus will be demonstrated as an educational tool for use with MD7-Isite. The software will be installed at twelve nodes around the world: Brazil, Argentina, the Netherlands, Canada, France, Germany, Italy, Japan, Australia, New Zealand, Switzerland, and several sites in the United States. Representing NASA for the International Directory Network and the CEOS Data Access Subgroup, NASA's contribution to this international interoperability effort will be updated. Discussion will include interoperability with the CEOS Interoperability Protocol (CIP), features of the latest version of the software, including upgraded capabilities for distributed input by the IDN nodes, installation logistics, "mirroring", population objectives, and future plans.
Gaudi Evolution for Future Challenges
NASA Astrophysics Data System (ADS)
Clemencic, M.; Hegner, B.; Leggett, C.
2017-10-01
The LHCb Software Framework Gaudi was initially designed and developed almost twenty years ago, when computing was very different from today. It has also been used by a variety of other experiments, including ATLAS, Daya Bay, GLAST, HARP, LZ, and MINERVA. Although it has been actively developed throughout these years, stability and backward compatibility have been favoured, reducing the possibilities of adopting new techniques, like multithreaded processing. R&D efforts like GaudiHive have however shown its potential to cope with the new challenges. With the second LHC Long Shutdown approaching, and to prepare for the computing challenges of the upgrade of the collider and the detectors, now is a perfect moment to review the design of Gaudi and plan future developments of the project. To do this, LHCb, ATLAS and the Future Circular Collider community joined efforts to bring Gaudi forward and prepare it for the upcoming needs of the experiments. We present here how Gaudi will evolve in the coming years and the long-term development plans.
A flexible continuous-variable QKD system using off-the-shelf components
NASA Astrophysics Data System (ADS)
Comandar, Lucian C.; Brunner, Hans H.; Bettelli, Stefano; Fung, Fred; Karinou, Fotini; Hillerkuss, David; Mikroulis, Spiros; Wang, Dawei; Kuschnerov, Maxim; Xie, Changsong; Poppe, Andreas; Peev, Momtchil
2017-10-01
We present the development of a robust and versatile CV-QKD architecture based on commercially available optical and electronic components. The system uses a pilot tone for phase synchronization with a local oscillator, as well as local feedback loops to mitigate frequency and polarization drifts. Transmit and receive-side digital signal processing is performed fully in software, allowing for rapid protocol reconfiguration. The quantum link is complemented with a software stack for secure-key processing, key storage and encrypted communication. All these features allow for the system to be at the same time a prototype for a future commercial product and a research platform.
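The pilot-tone phase synchronization mentioned above can be illustrated, in very simplified form, by estimating the phase offset of a known reference tone in the received samples and de-rotating the data by that estimate. The numpy sketch below uses an invented signal model and parameters; the system's actual DSP chain (frequency tracking, polarization recovery, filtering) is far more involved.

```python
# Very simplified illustration of pilot-tone phase recovery: estimate the
# carrier phase offset from a known pilot tone and de-rotate the received
# samples. Parameters and signal model are invented; the real CV-QKD DSP
# chain is far more involved.
import numpy as np

fs = 1.0e6            # sample rate [Hz]
f_pilot = 50.0e3      # pilot tone frequency [Hz]
n = 4096
t = np.arange(n) / fs

true_phase = 0.7      # unknown phase offset introduced by the channel [rad]
rng = np.random.default_rng(0)
received = np.exp(1j * (2 * np.pi * f_pilot * t + true_phase))
received += 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Correlate against the known pilot to estimate the phase offset.
reference = np.exp(1j * 2 * np.pi * f_pilot * t)
phase_estimate = np.angle(np.vdot(reference, received))
corrected = received * np.exp(-1j * phase_estimate)

print(f"true phase  {true_phase:+.3f} rad")
print(f"estimated   {phase_estimate:+.3f} rad")
```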
PHM Enabled Autonomous Propellant Loading Operations
NASA Technical Reports Server (NTRS)
Walker, Mark; Figueroa, Fernando
2017-01-01
The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, is an important enabler for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.
The first SPIE software Hack Day
NASA Astrophysics Data System (ADS)
Kendrew, S.; Deen, C.; Radziwill, N.; Crawford, S.; Gilbert, J.; Gully-Santiago, M.; Kubánek, P.
2014-07-01
We report here on the software Hack Day organised at the 2014 SPIE conference on Astronomical Telescopes and Instrumentation in Montréal. The first ever Hack Day to take place at an SPIE event, the aim of the day was to bring together developers to collaborate on innovative solutions to problems of their choice. Such events have proliferated in the technology community, providing opportunities to showcase, share and learn skills. In academic environments, these events are often also instrumental in building community beyond the limits of national borders, institutions and projects. We show examples of projects the participants worked on, and provide some lessons learned for future events.
Engine structures analysis software: Component Specific Modeling (COSMO)
NASA Astrophysics Data System (ADS)
McKnight, R. L.; Maffeo, R. J.; Schwartz, S.
1994-08-01
A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.
NASA Technical Reports Server (NTRS)
Roth, Don J.; Rapchun, David A.; Jones, Hollis H.
2001-01-01
The Cloud Absorption Radiometer (CAR) instrument has been the most frequently used airborne instrument built in-house at NASA Goddard Space Flight Center, having flown scientific research missions on board various aircraft to many locations in the United States, the Azores, Brazil, and Kuwait since 1983. The CAR instrument is capable of measuring light scattered by clouds in fourteen spectral bands in the UV, visible, and near-infrared regions. This document describes the control, data acquisition, display, and file storage software for the new version of CAR. This software completely replaces the prior CAR Data System and Control Panel with a compact and robust virtual instrument computer interface. Additionally, the instrument is now usable for the first time for taking data in an off-aircraft mode. The new instrument is controlled via a LabVIEW v5.1.1-developed software interface that utilizes (1) serial port writes to send commands to the controller module of the instrument, and (2) serial port reads to acquire data from the controller module of the instrument. Step-by-step operational procedures are provided in this document. A suite of other software programs has been developed to complement the actual CAR virtual instrument. These programs include: (1) a simulator mode that allows pretesting of new features that might be added in the future, as well as demonstrations to CAR customers and development at times when the instrument/hardware is off-location, and (2) a post-experiment data viewer that can be used to view all segments of individual data cycles and to locate positions where 'start' and 'stop' byte sequences were incorrectly formulated by the instrument controller. The CAR software described here is expected to be the basis for CAR operation for many missions and many years to come.
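The command/response path described above (serial port writes to send commands, serial port reads to acquire data) can be sketched in Python with the pyserial package; the actual CAR interface is implemented in LabVIEW, and the port name, baud rate, command string, and frame length below are assumptions made only for illustration.

import serial  # pyserial

PORT = "COM1"   # hypothetical port; the real controller connection may differ
BAUD = 9600     # assumed baud rate

def send_command(ser, command: bytes) -> bytes:
    """Write one command to the controller and read back one data frame."""
    ser.write(command)
    return ser.read(64)   # frame length is an assumption

with serial.Serial(PORT, BAUD, timeout=1.0) as ser:
    reply = send_command(ser, b"STATUS\r\n")   # hypothetical command
    print(reply)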
Unique Challenges Testing SDRs for Space
NASA Technical Reports Server (NTRS)
Chelmins, David; Downey, Joseph A.; Johnson, Sandra K.; Nappier, Jennifer M.
2013-01-01
This paper describes the approach used by the Space Communication and Navigation (SCaN) Testbed team to qualify three Software Defined Radios (SDR) for operation in space and the characterization of the platform to enable upgrades on-orbit. The three SDRs represent a significant portion of the new technologies being studied on board the SCaN Testbed, which is operating on an external truss on the International Space Station (ISS). The SCaN Testbed provides experimenters an opportunity to develop and demonstrate experimental waveforms and applications for communication, networking, and navigation concepts and advance the understanding of developing and operating SDRs in space. Qualifying a Software Defined Radio for the space environment requires additional consideration versus a hardware radio. Tests that incorporate characterization of the platform to provide information necessary for future waveforms, which might exercise extended capabilities of the hardware, are needed. The development life cycle for the radio follows the software development life cycle, where changes can be incorporated at various stages of development and test. It also enables flexibility to be added with minor additional effort. Although this provides tremendous advantages, managing the complexity inherent in a software implementation requires testing beyond the traditional hardware radio test plan. Due to schedule and resource limitations and parallel development activities, the subsystem testing of the SDRs at the vendor sites was primarily limited to typical fixed transceiver testing. NASA's Glenn Research Center (GRC) was responsible for the integration and testing of the SDRs into the SCaN Testbed system and conducting the investigation of the SDR to advance the technology to be accepted by missions. This paper will describe the unique tests that were conducted at both the subsystem and system level, including environmental testing, and present results. For example, test waveforms were developed to measure the gain of the transmit system across the tunable frequency band. These were used during thermal vacuum testing to enable characterization of the integrated system in the wide operational temperature range of space. Receive power indicators were used during Electromagnetic Interference (EMI) tests to understand the platform's susceptibility to external interferers independent of the waveform. Additional approaches and lessons learned during the SCaN Testbed subsystem and system level testing will be discussed that may help future SDR integrators.
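A minimal sketch of the transmit-gain characterization mentioned above, using stand-in driver objects: the SweepableRadio and PowerMeter stubs, the frequency range, and the step size are hypothetical and do not represent the actual SCaN Testbed test setup.

class SweepableRadio:
    """Stand-in for an SDR tuning interface."""
    def set_frequency(self, hz):
        self.frequency = hz

class PowerMeter:
    """Stand-in for a power-meter query; returns a fixed value for illustration."""
    def read_power_dbm(self):
        return -10.0

def sweep_transmit_gain(radio, meter, f_start, f_stop, step, input_dbm):
    """Record gain (output power minus input power) at each tuning step."""
    gains = {}
    f = f_start
    while f <= f_stop:
        radio.set_frequency(f)
        gains[f] = meter.read_power_dbm() - input_dbm
        f += step
    return gains

print(sweep_transmit_gain(SweepableRadio(), PowerMeter(),
                          2.2e9, 2.3e9, 25e6, input_dbm=-30.0))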
Unique Challenges Testing SDRs for Space
NASA Technical Reports Server (NTRS)
Johnson, Sandra; Chelmins, David; Downey, Joseph; Nappier, Jennifer
2013-01-01
This paper describes the approach used by the Space Communication and Navigation (SCaN) Testbed team to qualify three Software Defined Radios (SDR) for operation in space and the characterization of the platform to enable upgrades on-orbit. The three SDRs represent a significant portion of the new technologies being studied on board the SCaN Testbed, which is operating on an external truss on the International Space Station (ISS). The SCaN Testbed provides experimenters an opportunity to develop and demonstrate experimental waveforms and applications for communication, networking, and navigation concepts and advance the understanding of developing and operating SDRs in space. Qualifying a Software Defined Radio for the space environment requires additional consideration versus a hardware radio. Tests that incorporate characterization of the platform to provide information necessary for future waveforms, which might exercise extended capabilities of the hardware, are needed. The development life cycle for the radio follows the software development life cycle, where changes can be incorporated at various stages of development and test. It also enables flexibility to be added with minor additional effort. Although this provides tremendous advantages, managing the complexity inherent in a software implementation requires testing beyond the traditional hardware radio test plan. Due to schedule and resource limitations and parallel development activities, the subsystem testing of the SDRs at the vendor sites was primarily limited to typical fixed transceiver testing. NASA's Glenn Research Center (GRC) was responsible for the integration and testing of the SDRs into the SCaN Testbed system and conducting the investigation of the SDR to advance the technology to be accepted by missions. This paper will describe the unique tests that were conducted at both the subsystem and system level, including environmental testing, and present results. For example, test waveforms were developed to measure the gain of the transmit system across the tunable frequency band. These were used during thermal vacuum testing to enable characterization of the integrated system in the wide operational temperature range of space. Receive power indicators were used during Electromagnetic Interference (EMI) tests to understand the platform's susceptibility to external interferers independent of the waveform. Additional approaches and lessons learned during the SCaN Testbed subsystem and system level testing will be discussed that may help future SDR integrators.
Computational Aspects of Data Assimilation and the ESMF
NASA Technical Reports Server (NTRS)
daSilva, A.
2003-01-01
The scientific challenge of developing advanced data assimilation applications is a daunting task. Independently developed components may have incompatible interfaces or may be written in different computer languages. The high-performance computer (HPC) platforms required by numerically intensive Earth system applications are complex, varied, rapidly evolving, multi-part systems themselves. Since the market for high-end platforms is relatively small, there is little robust middleware available to buffer the modeler from the difficulties of HPC programming. To complicate matters further, the collaborations required to develop large Earth system applications often span initiatives, institutions and agencies, involve the geoscience, software engineering, and computer science communities, and cross national borders. The Earth System Modeling Framework (ESMF) project is a concerted response to these challenges. Its goal is to increase software reuse, interoperability, ease of use and performance in Earth system models through the use of a common software framework, developed in an open manner by leaders in the modeling community. The ESMF addresses the technical and, to some extent, the cultural aspects of Earth system modeling, laying the groundwork for addressing the more difficult scientific aspects, such as the physical compatibility of components, in the future. In this talk we will discuss the general philosophy and architecture of the ESMF, focusing on those capabilities useful for developing advanced data assimilation applications.
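To make the framework idea concrete, here is a generic Python sketch of the initialize/run/finalize component contract that such frameworks standardize; the class and method names are illustrative only and do not reproduce the actual ESMF interfaces.

class ModelComponent:
    """Minimal component contract: initialize, advance one step, finalize."""
    def initialize(self, config): ...
    def run(self, import_state, export_state, dt): ...
    def finalize(self): ...

class Coupler:
    """Drives components through the common interface, so an assimilation
    system can exchange state with a model without knowing its internals."""
    def __init__(self, components):
        self.components = components

    def step(self, shared_state, dt):
        for component in self.components:
            component.run(shared_state, shared_state, dt)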
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC.
The Panel on Guidelines for Statistical Software was organized in 1990 to document, assess, and prioritize problem areas regarding quality and reliability of statistical software; present prototype guidelines in high priority areas; and make recommendations for further research and discussion. This document provides the following papers presented…
SeisComP 3 - Where are we now?
NASA Astrophysics Data System (ADS)
Saul, Joachim; Becker, Jan; Hanka, Winfried; Heinloo, Andres; Weber, Bernd
2010-05-01
The seismological software SeisComP has evolved over approximately the last 10 years from a pure data acquisition module into fully featured real-time earthquake monitoring software. The now very popular SeedLink protocol for seismic data transmission has been the core of SeisComP from the very beginning. Later additions included simple, purely automatic event detection, location, and magnitude determination capabilities. Especially during the development of the third-generation SeisComP, also known as "SeisComP 3", the automatic processing capabilities have been augmented by graphical user interfaces for visualization, rapid event review, and quality control. Communication between the modules is achieved using a TCP/IP infrastructure that allows distributed computing and remote review. For seismological metadata exchange, export to and import from QuakeML is available, which also provides a convenient interface with third-party software. SeisComP is the primary seismological processing software at the GFZ Potsdam. It has also been in use for years in numerous seismic networks in Europe and, more recently, has been adopted as the primary monitoring software by several tsunami warning centers around the Indian Ocean. In our presentation we describe the current status of development as well as future plans. We illustrate its possibilities by discussing different use cases for global and regional real-time earthquake monitoring and tsunami warning.
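As an example of the QuakeML interface to third-party software, the snippet below reads a SeisComP QuakeML export with the ObsPy library (assumed to be installed); the file name is hypothetical and each event is assumed to carry at least one origin and magnitude.

from obspy import read_events

catalog = read_events("seiscomp_export.xml")   # hypothetical QuakeML export
for event in catalog:
    origin = event.preferred_origin() or event.origins[0]
    magnitude = event.preferred_magnitude() or event.magnitudes[0]
    print(origin.time, origin.latitude, origin.longitude, magnitude.mag)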
Positional Awareness Map 3D (PAM3D)
NASA Technical Reports Server (NTRS)
Hoffman, Monica; Allen, Earl L.; Yount, John W.; Norcross, April Louise
2012-01-01
The Western Aeronautical Test Range of the National Aeronautics and Space Administration's Dryden Flight Research Center needed to address the aging software and hardware of its current situational awareness display application, the Global Real-Time Interactive Map (GRIM). GRIM was initially developed in the late 1980s and executes on older PC architectures using a Linux operating system that is no longer supported. Additionally, the software is difficult to maintain due to its complexity and loss of developer knowledge. It was decided that a replacement application must be developed or acquired in the near future. The replacement must provide the functionality of the original system, the ability to monitor test flight vehicles in real-time, and add improvements such as high resolution imagery and true 3-dimensional capability. This paper will discuss the process of determining the best approach to replace GRIM, and the functionality and capabilities of the first release of the Positional Awareness Map 3D.