SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development which automates the software engineering process. Progress under SAGA is described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects thatmore » are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.« less
Certification Processes for Safety-Critical and Mission-Critical Aerospace Software
NASA Technical Reports Server (NTRS)
Nelson, Stacy
2003-01-01
This document is a quick reference guide with an overview of the processes required to certify safety-critical and mission-critical flight software at selected NASA centers and the FAA. Researchers and software developers can use this guide to jumpstart their understanding of how to get new or enhanced software onboard an aircraft or spacecraft. The introduction contains aerospace industry definitions of safety and safety-critical software, as well as, the current rationale for certification of safety-critical software. The Standards for Safety-Critical Aerospace Software section lists and describes current standards including NASA standards and RTCA DO-178B. The Mission-Critical versus Safety-Critical software section explains the difference between two important classes of software: safety-critical software involving the potential for loss of life due to software failure and mission-critical software involving the potential for aborting a mission due to software failure. The DO-178B Safety-critical Certification Requirements section describes special processes and methods required to obtain a safety-critical certification for aerospace software flying on vehicles under auspices of the FAA. The final two sections give an overview of the certification process used at Dryden Flight Research Center and the approval process at the Jet Propulsion Lab (JPL).
A quality-refinement process for medical imaging applications.
Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I
2009-01-01
To introduce and evaluate a process for refinement of software quality that is suitable to research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and therefore more lightweight than traditional quality management processes. Focus on quality criteria that are important at the given stage of the software life cycle. Usage of tools that automate aspects of the process is emphasized. To evaluate the additional effort that comes along with the process, it was exemplarily applied for eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement yielded an average of 13 person days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.
2006-01-01
New processes, methods and tools are constantly appearing in the field of software engineering. Many of these augur great potential in improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will have payoff in this particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques and lack the feedback from practical applications necessary to help them to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team, and the NASA Research Infusion Initiative, established by NASA s Software Engineering Initiative, to overcome these obstacles.
Software Development as Music Education Research
ERIC Educational Resources Information Center
Brown, Andrew R.
2007-01-01
This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…
The NASA SARP Software Research Infusion Initiative
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Pressburger, Tom; Markosian, Lawrence; Feather, Martin
2006-01-01
A viewgraph presentation describing the NASA Software Assurance Research Program (SARP) research infusion projects is shown. The topics include: 1) Background/Motivation; 2) Proposal Solicitation Process; 3) Proposal Evaluation Process; 4) Overview of Some Projects to Date; and 5) Lessons Learned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for manymore » CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.« less
Building an experience factory for maintenance
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.; Briand, Lionel; Kim, Yong-Mi; Basili, Victor R.
1994-01-01
This paper reports the preliminary results of a study of the software maintenance process in the Flight Dynamics Division (FDD) of the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). This study is being conducted by the Software Engineering Laboratory (SEL), a research organization sponsored by the Software Engineering Branch of the FDD, which investigates the effectiveness of software engineering technologies when applied to the development of applications software. This software maintenance study began in October 1993 and is being conducted using the Quality Improvement Paradigm (QIP), a process improvement strategy based on three iterative steps: understanding, assessing, and packaging. The preliminary results represent the outcome of the understanding phase, during which SEL researchers characterized the maintenance environment, product, and process. Findings indicate that a combination of quantitative and qualitative analysis is effective for studying the software maintenance process, that additional measures should be collected for maintenance (as opposed to new development), and that characteristics such as effort, error rate, and productivity are best considered on a 'release' basis rather than on a project basis. The research thus far has documented some basic differences between new development and software maintenance. It lays the foundation for further application of the QIP to investigate means of improving the maintenance process and product in the FDD.
DOCLIB: a software library for document processing
NASA Astrophysics Data System (ADS)
Jaeger, Stefan; Zhu, Guangyu; Doermann, David; Chen, Kevin; Sampat, Summit
2006-01-01
Most researchers would agree that research in the field of document processing can benefit tremendously from a common software library through which institutions are able to develop and share research-related software and applications across academic, business, and government domains. However, despite several attempts in the past, the research community still lacks a widely-accepted standard software library for document processing. This paper describes a new library called DOCLIB, which tries to overcome the drawbacks of earlier approaches. Many of DOCLIB's features are unique either in themselves or in their combination with others, e.g. the factory concept for support of different image types, the juxtaposition of image data and metadata, or the add-on mechanism. We cherish the hope that DOCLIB serves the needs of researchers better than previous approaches and will readily be accepted by a larger group of scientists.
Impact of Growing Business on Software Processes
NASA Astrophysics Data System (ADS)
Nikitina, Natalja; Kajko-Mattsson, Mira
When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. It is only in this way they may confirm that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth on the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth has become a stimulus for starting thinking and improving software processes, the organization lacked guidelines aiding it in and aligning it to business growth. Finally, the paper generates research questions providing a platform for future research.
The use of applied software for the professional training of students studying humanities
NASA Astrophysics Data System (ADS)
Sadchikova, A. S.; Rodin, M. M.
2017-01-01
Research practice is an integral part of humanities students' training process. In this regard the training process is to include modern information techniques of the training process of students studying humanities. This paper examines the most popular applied software products used for data processing in social science. For testing purposes we selected the most commonly preferred professional packages: MS Excel, IBM SPSS Statistics, STATISTICA, STADIA. Moreover the article contains testing results of a specialized software Prikladnoy Sotsiolog that is applicable for the preparation stage of the research. The specialised software were tested during one term in groups of students studying humanities.
2017-04-06
Research Hypothesis ........................................................................................................... 15 Research Design ...user community and of accommodating advancing software applications by the vendors. Research Design My approach to this project was to conduct... design descriptions , requirements specifications, test documentation, interface requirement specifications, product specifications, and software
NASA Astrophysics Data System (ADS)
Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.
2018-02-01
Identification of software maturity level is a technique to determine the quality of the software. By identifying the software maturity level, the weaknesses of the software can be observed. As a result, the recommendations might be a reference for future software maintenance and development. This paper discusses the software Capability Level (CL) with case studies on Quality Management Unit (Unit Manajemen Mutu) University of Sumatera Utara (UMM-USU). This research utilized Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation is done in three process areas, such as Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of software capability level for UMM-USU software, turns out that the capability level for the observed process area is in the range of CL1 and CL2. Planning Project (PP) is the only process area which reaches capability level 2, meanwhile, PMC and REQM are still in CL 1 or in performed level. This research reveals several weaknesses of existing UMM-USU software. Therefore, this study proposes several recommendations for UMM-USU to improve capability level for observed process areas.
A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH
Sadasivam, Rajani S.; Tanik, Murat M.
2013-01-01
Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436
A meta-composite software development approach for translational research.
Sadasivam, Rajani S; Tanik, Murat M
2013-06-01
Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.
Better software, better research: the challenge of preserving your research and your reputation
NASA Astrophysics Data System (ADS)
Chue Hong, N.
2017-12-01
Software is fundamental to research. From short, thrown-together temporary scripts, through an abundance of complex spreadsheets analysing collected data, to the hundreds of software engineers and millions of lines of code behind international efforts such as the Large Hadron Collider and the Square Kilometre Array, software has made an invaluable contribution to advancing our research knowledge. Within the earth and space sciences, data is being generated, collected, processed and analysed in ever greater amounts and detail. However the pace of this improvement leads to challenges around the persistence of research outputs and artefacts. A specific challenge in this field is that often experiments and measurements cannot be repeated, yet the infrastructure used to manage, store and process this data must be continually updated and developed: constant change just to stay still. The UK-based Software Sustainability Institute (SSI) aims to improve research software sustainability, working with researchers, funders, research software engineers, managers, and other stakeholders across the research spectrum. In this talk, I will present lessons learned and good practice based on the work of the Institute and its collaborators. I will summarise some of the work that is being done to improve the integration of infrastructure for managing research outputs, including around software citation and reward, extending data management plans, and improving researcher skills: "better software, better research". Ultimately, being a modern researcher in the geosciences requires you to efficiently balance the pursuit of new knowledge with making your work reusable and reproducible. And as scientists are placed under greater scrutiny about whether others can trust their results, the preservation of your artefacts has a key role in the preservation of your reputation.
SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects
NASA Technical Reports Server (NTRS)
Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M
1998-01-01
SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.
System software for the finite element machine
NASA Technical Reports Server (NTRS)
Crockett, T. W.; Knott, J. D.
1985-01-01
The Finite Element Machine is an experimental parallel computer developed at Langley Research Center to investigate the application of concurrent processing to structural engineering analysis. This report describes system-level software which has been developed to facilitate use of the machine by applications researchers. The overall software design is outlined, and several important parallel processing issues are discussed in detail, including processor management, communication, synchronization, and input/output. Based on experience using the system, the hardware architecture and software design are critiqued, and areas for further work are suggested.
NASA Technical Reports Server (NTRS)
Straeter, T. A.; Foudriat, E. C.; Will, R. W.
1977-01-01
The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.
The Role of Computers in Research and Development at Langley Research Center
NASA Technical Reports Server (NTRS)
Wieseman, Carol D. (Compiler)
1994-01-01
This document is a compilation of presentations given at a workshop on the role cf computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, methods, and CASE tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.
Integrated testing and verification system for research flight software design document
NASA Technical Reports Server (NTRS)
Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.
1979-01-01
The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided with special attention on real time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill formed process synchronization and deadlock also are detected statically.
C-C1-04: Building a Health Services Information Technology Research Environment
Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F
2010-01-01
Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/ data security and regulatory compliance; allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. In parallel, the extant oversight process for approving/managing access to internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterpriselevel approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of a HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes to sustain an environment for rapid product development and testing.
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo
2016-01-01
The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and international quality standard organization membership. It also makes comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically, the use of questionnaire research instruments was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected and this can be the cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations or the required process improvement techniques; as such their claimed standards are not aligned to those of accredited bodies, and are only limited to their local experience and knowledge, which makes it questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using the Cronbach's alpha, and it was proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously because its effect is evident in the final product. Moreover, quality frameworks and tools which require minimum time and cost are highly needed in these countries.
Research into software executives for space operations support
NASA Technical Reports Server (NTRS)
Collier, Mark D.
1990-01-01
Research concepts pertaining to a software (workstation) executive which will support a distributed processing command and control system characterized by high-performance graphics workstations used as computing nodes are presented. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. In order to solve these problems, allow the environment to function as an integrated system, and present a functional development environment to application programmers, it is necessary to develop an additional layer of software. This 'executive' software integrates the system, provides real-time capabilities, and provides the tools necessary to support the application requirements.
Software design as a problem in learning theory (a research overview)
NASA Technical Reports Server (NTRS)
Fass, Leona F.
1992-01-01
Our interest in automating software design has come out of our research in automated reasoning, inductive inference, learnability, and algebraic machine theory. We have investigated these areas extensively, in connection with specific problems of language representation, acquisition, processing, and design. In the case of formal context-free (CF) languages we established existence of finite learnable models ('behavioral realizations') and procedures for constructing them effectively. We also determined techniques for automatic construction of the models, inductively inferring them from finite examples of how they should 'behave'. These results were obtainable due to appropriate representation of domain knowledge, and constraints on the domain that the representation defined. It was when we sought to generalize our results, and adapt or apply them, that we began investigating the possibility of determining similar procedures for constructing correct software. Discussions with other researchers led us to examine testing and verification processes, as they are related to inference, and due to their considerable importance in correct software design. Motivating papers by other researchers, led us to examine these processes in some depth. Here we present our approach to those software design issues raised by other researchers, within our own theoretical context. We describe our results, relative to those of the other researchers, and conclude that they do not compare unfavorably.
MNE Scan: Software for real-time processing of electrophysiological data.
Esch, Lorenz; Sun, Limin; Klüber, Viktor; Lew, Seok; Baumgarten, Daniel; Grant, P Ellen; Okada, Yoshio; Haueisen, Jens; Hämäläinen, Matti S; Dinh, Christoph
2018-06-01
Magnetoencephalography (MEG) and Electroencephalography (EEG) are noninvasive techniques to study the electrophysiological activity of the human brain. Thus, they are well suited for real-time monitoring and analysis of neuronal activity. Real-time MEG/EEG data processing allows adjustment of the stimuli to the subject's responses for optimizing the acquired information especially by providing dynamically changing displays to enable neurofeedback. We introduce MNE Scan, an acquisition and real-time analysis software based on the multipurpose software library MNE-CPP. MNE Scan allows the development and application of acquisition and novel real-time processing methods in both research and clinical studies. The MNE Scan development follows a strict software engineering process to enable approvals required for clinical software. We tested the performance of MNE Scan in several device-independent use cases, including, a clinical epilepsy study, real-time source estimation, and Brain Computer Interface (BCI) application. Compared to existing tools we propose a modular software considering clinical software requirements expected by certification authorities. At the same time the software is extendable and freely accessible. We conclude that MNE Scan is the first step in creating a device-independent open-source software to facilitate the transition from basic neuroscience research to both applied sciences and clinical applications. Copyright © 2018 Elsevier B.V. All rights reserved.
Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs
NASA Astrophysics Data System (ADS)
O'Connor, Rory V.
This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding into the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies. With the focus on SMEs that develop web applications as Management Information Systems and not E-Commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.
Computer Sciences and Data Systems, volume 1
NASA Technical Reports Server (NTRS)
1987-01-01
Topics addressed include: software engineering; university grants; institutes; concurrent processing; sparse distributed memory; distributed operating systems; intelligent data management processes; expert system for image analysis; fault tolerant software; and architecture research.
Developing smartphone apps for behavioural studies: The AlcoRisk app case study.
Smith, Anthony; de Salas, Kristy; Lewis, Ian; Schüz, Benjamin
2017-08-01
Smartphone apps have emerged as valuable research tools to sample human behaviours at their time of occurrence within natural environments. Human behaviour sampling methods, such as Ecological Momentary Assessment (EMA), aim to facilitate research that is situated in ecologically valid real world environments rather than laboratory environments. Researchers have trialled a range of EMA smartphone apps to sample human behaviours such as dieting, physical activity and smoking. Software development processes for EMA smartphones apps, however, are not widely documented with little guidance provided for the integration of complex multidisciplinary behavioural and technical fields. In this paper, the AlcoRisk app for studying alcohol consumption and risk taking tendencies is presented alongside a software development process that integrates these multidisciplinary fields. The software development process consists of three stages including requirements analysis, feature and interface design followed by app implementation. Results from a preliminary feasibility study support the efficacy of the AlcoRisk app's software development process. Copyright © 2017 Elsevier Inc. All rights reserved.
Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. PMID:24688866
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.
Publishing Platform for Scientific Software - Lessons Learned
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim
2015-04-01
Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data but also for modelling and simulation of complex processes. Software has a significant influence on the quality of research results. For strengthening the recognition of the academic performance of scientific software development, for increasing its visibility and for promoting the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews a blueprint for a scientific software publishing platform and a systematic implementation plan has been designed. In addition, the potential of journals, software repositories and persistent identifiers have been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software as well as methods and tools for software engineering are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include policies for the development and publication of scientific software in the institutions but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. in earth sciences, climate research and the life sciences. Based on the developed blueprints a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user oriented features.
Making Visible the Coding Process: Using Qualitative Data Software in a Post-Structural Study
ERIC Educational Resources Information Center
Ryan, Mary
2009-01-01
Qualitative research methods require transparency to ensure the "trustworthiness" of the data analysis. The intricate processes of organising, coding and analysing the data are often rendered invisible in the presentation of the research findings, which requires a "leap of faith" for the reader. Computer assisted data analysis software can be used…
NASA Technical Reports Server (NTRS)
Lu, George C.
2003-01-01
The purpose of the EXPRESS (Expedite the PRocessing of Experiments to Space Station) rack project is to provide a set of predefined interfaces for scientific payloads which allow rapid integration into a payload rack on International Space Station (ISS). VxWorks' was selected as the operating system for the rack and payload resource controller, primarily based on the proliferation of VME (Versa Module Eurocard) products. These products provide needed flexibility for future hardware upgrades to meet everchanging science research rack configuration requirements. On the International Space Station, there are multiple science research rack configurations, including: 1) Human Research Facility (HRF); 2) EXPRESS ARIS (Active Rack Isolation System); 3) WORF (Window Observational Research Facility); and 4) HHR (Habitat Holding Rack). The RIC (Rack Interface Controller) connects payloads to the ISS bus architecture for data transfer between the payload and ground control. The RIC is a general purpose embedded computer which supports multiple communication protocols, including fiber optic communication buses, Ethernet buses, EIA-422, Mil-Std-1553 buses, SMPTE (Society Motion Picture Television Engineers)-170M video, and audio interfaces to payloads and the ISS. As a cost saving and software reliability strategy, the Boeing Payload Software Organization developed reusable common software where appropriate. These reusable modules included a set of low-level driver software interfaces to 1553B. RS232, RS422, Ethernet buses, HRDL (High Rate Data Link), video switch functionality, telemetry processing, and executive software hosted on the FUC computer. These drivers formed the basis for software development of the HRF, EXPRESS, EXPRESS ARIS, WORF, and HHR RIC executable modules. The reusable RIC common software has provided extensive benefits, including: 1) Significant reduction in development flow time; 2) Minimal rework and maintenance; 3) Improved reliability; and 4) Overall reduction in software life cycle cost. Due to the limited number of crew hours available on ISS for science research, operational efficiency is a critical customer concern. The current method of upgrading RIC software is a time consuming process; thus, an improved methodology for uploading RIC software is currently under evaluation.
Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2016-01-01
To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASAs critical software (SW) systems with the software industry rapidly transitioning from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA) established the Agile Benchmarking Team (ABT). The Team's tasks were: 1. Research background literature on current Agile processes, 2. Perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices, 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems, 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.
Waveform Generator Signal Processing Software
DOT National Transportation Integrated Search
1988-09-01
This report describes the software that was developed to process test waveforms that were recorded by crash test data acquisition systems. The test waveforms are generated by an electronic waveform generator developed by MGA Research Corporation unde...
ERIC Educational Resources Information Center
Hurt, Andrew C.
2007-01-01
With technology advances, computer software becomes increasingly difficult to learn. Adults often rely on software training to keep abreast of these changes. Instructor-led software training is frequently used to teach adults new software skills; however there is limited research regarding the best practices in adult computer software training.…
BioSig: The Free and Open Source Software Library for Biomedical Signal Processing
Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois
2011-01-01
BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals. PMID:21437227
BioSig: the free and open source software library for biomedical signal processing.
Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois
2011-01-01
BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.
Loudon, David; Macdonald, Alastair S.; Carse, Bruce; Thikey, Heather; Jones, Lucy; Rowe, Philip J.; Uzor, Stephen; Ayoade, Mobolaji; Baillie, Lynne
2012-01-01
This paper describes the ongoing process of the development and evaluation of prototype visualisation software, designed to assist in the understanding and the improvement of appropriate movements during rehabilitation. The process of engaging users throughout the research project is detailed in the paper, including how the design of the visualisation software is being adapted to meet the emerging understanding of the needs of patients and professionals, and of the rehabilitation process. The value of the process for the design of the visualisation software is illustrated with a discussion of the findings of pre-pilot focus groups with stroke survivors and therapists. PMID:23011812
Harnessing ISO/IEC 12207 to Examine the Extent of SPI Activity in an Organisation
NASA Astrophysics Data System (ADS)
Clarke, Paul; O'Connor, Rory
The quality of the software development process directly affects the quality of the software product. To be successful, software development organisations must respond to changes in technology and business circumstances, and therefore software process improvement (SPI) is required. SPI activity relates to any modification that is performed to the software process in order to improve an aspect of the process. Although multiple process assessments could be employed to examine SPI activity, they present an inefficient tool for such an examination. This paper presents an overview of a new survey-based resource that utilises the process reference model in ISO/IEC 12207 in order to expressly and directly determine the level of SPI activity in a software development organisation. This survey instrument can be used by practitioners, auditors and researchers who are interested in determining the extent of SPI activity in an organisation.
Description of research interests and current work related to automating software design
NASA Technical Reports Server (NTRS)
Kaindl, Hermann
1992-01-01
Enclosed is a list of selected and recent publications. Most of these publications concern applied research in the areas of software engineering and human-computer interaction. It is felt that domain-specific knowledge plays a major role in software development. Additionally, it is believed that improvements in the general software development process (e.g., object-oriented approaches) will have to be combined with the use of large domain-specific knowledge bases.
AVE-SESAME program for the REEDA System
NASA Technical Reports Server (NTRS)
Hickey, J. S.
1981-01-01
The REEDA system software was modified and improved to process the AVE-SESAME severe storm data. A random access file system for the AVE storm data was designed, tested, and implemented. The AVE/SESAME software was modified to incorporate the random access file input and to interface with new graphics hardware/software now available on the REEDA system. Software was developed to graphically display the AVE/SESAME data in the convention normally used by severe storm researchers. Software was converted to AVE/SESAME software systems and interfaced with existing graphics hardware/software available on the REEDA System. Software documentation was provided for existing AVE/SESAME programs underlining functional flow charts and interacting questions. All AVE/SESAME data sets in random access format was processed to allow developed software to access the entire AVE/SESAME data base. The existing software was modified to allow for processing of different AVE/SESAME data set types including satellite surface and radar data.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Driver education program status report : software system.
DOT National Transportation Integrated Search
1981-01-01
In April of 1980, a joint decision between Research Council personnel and representatives of the Department of Education was reached, and a project was undertaken by the Research Council to provide a software system to process the annual Driver Educa...
Proceedings of the Twenty-Fourth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
2000-01-01
On December 1 and 2, the Software Engineering Laboratory (SEL), a consortium composed of NASA/Goddard, the University of Maryland, and CSC, held the 24th Software Engineering Workshop (SEW), the last of the millennium. Approximately 240 people attended the 2-day workshop. Day 1 was composed of four sessions: International Influence of the Software Engineering Laboratory; Object Oriented Testing and Reading; Software Process Improvement; and Space Software. For the first session, three internationally known software process experts discussed the influence of the SEL with respect to software engineering research. In the Space Software session, prominent representatives from three different NASA sites- GSFC's Marti Szczur, the Jet Propulsion Laboratory's Rick Doyle, and the Ames Research Center IV&V Facility's Lou Blazy- discussed the future of space software in their respective centers. At the end of the first day, the SEW sponsored a reception at the GSFC Visitors' Center. Day 2 also provided four sessions: Using the Experience Factory; A panel discussion entitled "Software Past, Present, and Future: Views from Government, Industry, and Academia"; Inspections; and COTS. The day started with an excellent talk by CSC's Frank McGarry on "Attaining Level 5 in CMM Process Maturity." Session 2, the panel discussion on software, featured NASA Chief Information Officer Lee Holcomb (Government), our own Jerry Page (Industry), and Mike Evangelist of the National Science Foundation (Academia). Each presented his perspective on the most important developments in software in the past 10 years, in the present, and in the future.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).
ERIC Educational Resources Information Center
Margerum-Leys, Jon; Kupperman, Jeff; Boyle-Heimann, Kristen
This paper presents perspectives on the use of data analysis software in the process of qualitative research. These perspectives were gained in the conduct of three qualitative research studies that differed in theoretical frames, areas of interests, and scope. Their common use of a particular data analysis software package allows the exploration…
Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi- modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine- grained differentiation of multiple molecular targets.
NASA Astrophysics Data System (ADS)
Wasser, Avi; Lincoln, Maya
In recent years, both practitioners and applied researchers have become increasingly interested in methods for integrating business process models and enterprise software systems through the deployment of enabling middleware. Integrative BPM research has been mainly focusing on the conversion of workflow notations into enacted application procedures, and less effort has been invested in enhancing the connectivity between design level, non-workflow business process models and related enactment systems such as: ERP, SCM and CRM. This type of integration is useful at several stages of an IT system lifecycle, from design and implementation through change management, upgrades and rollout. The paper presents an integration method that utilizes SOA for connecting business process models with corresponding enterprise software systems. The method is then demonstrated through an Oracle E-Business Suite procurement process and its ERP transactions.
RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings
NASA Technical Reports Server (NTRS)
1990-01-01
Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.
Measurement fidelity of heart rate variability signal processing: The devil is in the details
Jarrin, Denise C.; McGrath, Jennifer J.; Giovanniello, Sabrina; Poirier, Paul; Lambert, Marie
2017-01-01
Heart rate variability (HRV) is a particularly valuable quantitative marker of the flexibility and balance of the autonomic nervous system. Significant advances in software programs to automatically derive HRV have led to its extensive use in psychophysiological research. However, there is a lack of systematic comparisons across software programs used to derive HRV indices. Further, researchers report meager details on important signal processing decisions making synthesis across studies challenging. The aim of the present study was to evaluate the measurement fidelity of time- and frequency-domain HRV indices derived from three predominant signal processing software programs commonly used in clinical and research settings. Triplicate ECG recordings were derived from 20 participants using identical data acquisition hardware. Among the time-domain indices, there was strong to excellent correspondence (ICCavg =0.93) for SDNN, SDANN, SDNNi, rMSSD, and pNN50. The frequency-domain indices yielded excellent correspondence (ICCavg =0.91) for LF, HF, and LF/HF ratio, except for VLF which exhibited poor correspondence (ICCavg =0.19). Stringent user-decisions and technical specifications for nuanced HRV processing details are essential to ensure measurement fidelity across signal processing software programs. PMID:22820268
Software Acquisition Improvement in the Aeronautical Systems Center
2008-09-01
software fielded, a variety of different methods were suggested by the interviewees. These included blocks, suites and other tailored processes developed...12 Selection of Research Method ...DoD look to the commercial market to buy tools, methods , environments, and application software, instead of custom-built software (DSB: 1987). These
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change in the way they view the software design process from a view toward the solution of a problem to one of the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational- information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Building a Snow Data Management System using Open Source Software (and IDL)
NASA Astrophysics Data System (ADS)
Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.
2012-12-01
At NASA's Jet Propulsion Laboratory free and open source software is used everyday to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main Points: - The Design of the Snow Data System (illustrate how the collection of sub-systems are combined to create a complete data processing pipeline) - Discuss the Challenges of moving from a single algorithm on a laptop, to running 100's of parallel algorithms on a cluster of servers (lesson's learned) - Code changes - Software license related challenges - Storage Requirements - System Evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps) - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission) Software in Use and their Software Licenses: IDL - Used for pre and post processing of data. Licensed under a proprietary software license held by Excelis. Apache OODT - Used for data management and workflow processing. Licensed under the Apache License Version 2. GDAL - Geospatial Data processing library used for data re-projection currently. Licensed under the X/MIT license. GeoServer - WMS Server. Licensed under the General Public License Version 2.0 Leaflet.js - Javascript web mapping library. Licensed under the Berkeley Software Distribution License. Python - Glue code and miscellaneous data processing support. Licensed under the Python Software Foundation License. Perl - Script wrapper for running the SCAG algorithm. Licensed under the General Public License Version 3. PHP - Front-end web application programming. Licensed under the PHP License Version 3.01
Tang, Qi-Yi; Zhang, Chuan-Xi
2013-04-01
A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.
NASA Technical Reports Server (NTRS)
Lee, Pen-Nan
1991-01-01
Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.
A taxonomy and discussion of software attack technologies
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
2005-03-01
Software is a complex thing. It is not an engineering artifact that springs forth from a design by simply following software coding rules; creativity and the human element are at the heart of the process. Software development is part science, part art, and part craft. Design, architecture, and coding are equally important activities and in each of these activities, errors may be introduced that lead to security vulnerabilities. Therefore, inevitably, errors enter into the code. Some of these errors are discovered during testing; however, some are not. The best way to find security errors, whether they are introduced as part of the architecture development effort or coding effort, is to automate the security testing process to the maximum extent possible and add this class of tools to the tools available, which aids in the compilation process, testing, test analysis, and software distribution. Recent technological advances, improvements in computer-generated forces (CGFs), and results in research in information assurance and software protection indicate that we can build a semi-intelligent software security testing tool. However, before we can undertake the security testing automation effort, we must understand the scope of the required testing, the security failures that need to be uncovered during testing, and the characteristics of the failures. Therefore, we undertook the research reported in the paper, which is the development of a taxonomy and a discussion of software attacks generated from the point of view of the security tester with the goal of using the taxonomy to guide the development of the knowledge base for the automated security testing tool. The representation for attacks and threat cases yielded by this research captures the strategies, tactics, and other considerations that come into play during the planning and execution of attacks upon application software. The paper is organized as follows. Section one contains an introduction to our research and a discussion of the motivation for our work. Section two contains a presents our taxonomy of software attacks and a discussion of the strategies employed and general weaknesses exploited for each attack. Section three contains a summary and suggestions for further research.
Developing Software for Pharmacodynamics and Bioassay Studies
The objective of the project is to develop a software system to process general pharmacologic, toxicological, or other biomedical research data that...exhibit a non-monotonic dose-response relationship - for which the current parametric models fail. The software will analyze dose-response
Software control and system configuration management - A process that works
NASA Technical Reports Server (NTRS)
Petersen, K. L.; Flores, C., Jr.
1983-01-01
A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.
NASA Astrophysics Data System (ADS)
Karmazikov, Y. V.; Fainberg, E. M.
2005-06-01
Work with DICOM compatible equipment integrated into hardware and software systems for medical purposes has been considered. Structures of process of reception and translormation of the data are resulted by the example of digital rentgenography and angiography systems, included in hardware-software complex DIMOL-IK. Algorithms of reception and the analysis of the data are offered. Questions of the further processing and storage of the received data are considered.
Evolution of Secondary Software Businesses: Understanding Industry Dynamics
NASA Astrophysics Data System (ADS)
Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko
Primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing from an enterprise internal software development activity is a common means to start a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period of 2000-2003. On the basis of the empirical data, we present a model for evolution of a secondary software business, which makes explicit the industry dynamics. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.
Living Design Memory: Framework, Implementation, Lessons Learned.
ERIC Educational Resources Information Center
Terveen, Loren G.; And Others
1995-01-01
Discusses large-scale software development and describes the development of the Designer Assistant to improve software development effectiveness. Highlights include the knowledge management problem; related work, including artificial intelligence and expert systems, software process modeling research, and other approaches to organizational memory;…
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1988-01-01
Reported here is beginning research into a noisy communication channel analogy of software development process productivity, in order to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound to productivity that shows that software reuse is the only means than can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1988-01-01
Beginning research into a noisy communication channel analogy of software development process productivity, in order to establish quantifiable behavior and theoretical bounds is discussed. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size) the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. An upper bound to productivity is derived that shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.
Unified Geophysical Cloud Platform (UGCP) for Seismic Monitoring and other Geophysical Applications.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Starovoit, Y. O.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.
2016-12-01
We present Unified Geophysical Cloud Platform (UGCP) or UniGeoCloud as an innovative approach for geophysical data processing in the Cloud environment with the ability to run any type of data processing software in isolated environment within the single Cloud platform. We've developed a simple and quick method of several open-source widely known software seismic packages (SeisComp3, Earthworm, Geotool, MSNoise) installation which does not require knowledge of system administration, configuration, OS compatibility issues etc. and other often annoying details preventing time wasting for system configuration work. Installation process is simplified as "mouse click" on selected software package from the Cloud market place. The main objective of the developed capability was the software tools conception with which users are able to design and install quickly their own highly reliable and highly available virtual IT-infrastructure for the organization of seismic (and in future other geophysical) data processing for either research or monitoring purposes. These tools provide access to any seismic station data available in open IP configuration from the different networks affiliated with different Institutions and Organizations. It allows also setting up your own network as you desire by selecting either regionally deployed stations or the worldwide global network based on stations selection form the global map. The processing software and products and research results could be easily monitored from everywhere using variety of user's devices form desk top computers to IT gadgets. Currents efforts of the development team are directed to achieve Scalability, Reliability and Sustainability (SRS) of proposed solutions allowing any user to run their applications with the confidence of no data loss and no failure of the monitoring or research software components. The system is suitable for quick rollout of NDC-in-Box software package developed for State Signatories and aimed for promotion of data processing collected by the IMS Network.
Information Systems and Software Engineering Research and Education in Oulu until the 1990s
NASA Astrophysics Data System (ADS)
Oinas-Kukkonen, Henry; Kerola, Pentti; Oinas-Kukkonen, Harri; Similä, Jouni; Pulli, Petri
This paper discusses the internationalization of software business in the Oulu region. Despite its small size, the region grew rapidly and very successfully into a global information and communication technology business center. The University of Oulu, which was the northern most university in the world at the time of its establishment (1958) had a strong emphasis on engineering since its very beginning. Research on electronics was carried out since the early 1960s. Later, when the Department of Information Processing Science was founded in 1969, research on information systems and later also on software engineering was carried out. This paper discusses the role of the information systems and software engineering research for the business growth of the region. Special emphasis is put on understanding the role of system-theoretical and software development expertise for transferring research knowledge into practice.
Infusing Software Assurance Research Techniques into Use
NASA Technical Reports Server (NTRS)
Pressburger, Thomas; DiVito, Ben; Feather, Martin S.; Hinchey, Michael; Markosian, Lawrence; Trevino, Luis C.
2006-01-01
Research in the software engineering community continues to lead to new development techniques that encompass processes, methods and tools. However, a number of obstacles impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may benefit them, and cannot afford to risk time and effort evaluating and trying one out while there remains uncertainty about whether it will work for them. Researchers cannot readily identify the practitioners whose problems would be amenable to their techniques, and, lacking feedback from practical applications, are hard-pressed to gauge the where and in what ways to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team established by NASA s Software Engineering Initiative to overcome these obstacles. .
Andrew C. Oishi; David Hawthorne; Ram Oren
2016-01-01
Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap...
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
Software tools for data modelling and processing of human body temperature circadian dynamics.
Petrova, Elena S; Afanasova, Anastasia I
2015-01-01
This paper is presenting a software development for simulating and processing thermometry data. The motivation of this research is the miniaturization of actuators attached to human body which allow frequent temperature measurements and improve the medical diagnosis procedures related to circadian dynamics.
Colombet, B; Woodman, M; Badier, J M; Bénar, C G
2015-03-15
The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
The Evolution of Software Publication in Astronomy
NASA Astrophysics Data System (ADS)
Cantiello, Matteo
2018-01-01
Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.
Web-based interactive 2D/3D medical image processing and visualization software.
Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid
2010-05-01
There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To compete with extendibility of the current local medical image processing software, each layer is highly independent of other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources in the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. Copyright (c) 2009 Elsevier Ireland Ltd. All rights reserved.
The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
A Unique Software System For Simulation-to-Flight Research
NASA Technical Reports Server (NTRS)
Chung, Victoria I.; Hutchinson, Brian K.
2001-01-01
"Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.
NASA Astrophysics Data System (ADS)
Isnur Haryudo, Subuh; Imam Agung, Achmad; Firmansyah, Rifqi
2018-04-01
The purpose of this research is to develop learning media of control technique using Matrix Laboratory software with industry requirement approach. Learning media serves as a tool for creating a better and effective teaching and learning situation because it can accelerate the learning process in order to enhance the quality of learning. Control Techniques using Matrix Laboratory software can enlarge the interest and attention of students, with real experience and can grow independent attitude. This research design refers to the use of research and development (R & D) methods that have been modified by multi-disciplinary team-based researchers. This research used Computer based learning method consisting of computer and Matrix Laboratory software which was integrated with props. Matrix Laboratory has the ability to visualize the theory and analysis of the Control System which is an integration of computing, visualization and programming which is easy to use. The result of this instructional media development is to use mathematical equations using Matrix Laboratory software on control system application with DC motor plant and PID (Proportional-Integral-Derivative). Considering that manufacturing in the field of Distributed Control systems (DCSs), Programmable Controllers (PLCs), and Microcontrollers (MCUs) use PID systems in production processes are widely used in industry.
An approach to integrating and creating flexible software environments
NASA Technical Reports Server (NTRS)
Bellman, Kirstie L.
1992-01-01
Engineers and scientists are attempting to represent, analyze, and reason about increasingly complex systems. Many researchers have been developing new ways of creating increasingly open environments. In this research on VEHICLES, a conceptual design environment for space systems, an approach was developed, called 'wrapping', to flexibility and integration based on the collection and then processing of explicit qualitative descriptions of all the software resources in the environment. Currently, a simulation is available, VSIM, used to study both the types of wrapping descriptions and the processes necessary to use the metaknowledge to combine, select, adapt, and explain some of the software resources used in VEHICLES. What was learned about the types of knowledge necessary for the wrapping approach is described along with the implications of wrapping for several key software engineering issues.
NASA Technical Reports Server (NTRS)
Wilson, Larry W.
1989-01-01
The longterm goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validations Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they have correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data and replication of data was recommended.
Software development: A paradigm for the future
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1989-01-01
A new paradigm for software development that treats software development as an experimental activity is presented. It provides built-in mechanisms for learning how to develop software better and reusing previous experience in the forms of knowledge, processes, and products. It uses models and measures to aid in the tasks of characterization, evaluation and motivation. An organization scheme is proposed for separating the project-specific focus from the organization's learning and reuse focuses of software development. The implications of this approach for corporations, research and education are discussed and some research activities currently underway at the University of Maryland that support this approach are presented.
Research of aerohydrodynamic and aeroelastic processes on PNRPU HPC system
NASA Astrophysics Data System (ADS)
Modorskii, V. Ya.; Shevelev, N. A.
2016-10-01
Research of aerohydrodynamic and aeroelastic processes with the High Performance Computing Complex in PNIPU is actively conducted within the university priority development direction "Aviation engine and gas turbine technology". Work is carried out in two areas: development and use of domestic software and use of well-known foreign licensed applied software packets. In addition, the third direction associated with the verification of computational experiments - physical modeling, with unique proprietary experimental installations is being developed.
NASA Technical Reports Server (NTRS)
Feather, M. S.
2001-01-01
Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.
Multi-crop area estimation and mapping on a microprocessor/mainframe network
NASA Technical Reports Server (NTRS)
Sheffner, E.
1985-01-01
The data processing system is outlined for a 1985 test aimed at determining the performance characteristics of area estimation and mapping procedures connected with the California Cooperative Remote Sensing Project. The project is a joint effort of the USDA Statistical Reporting Service-Remote Sensing Branch, the California Department of Water Resources, NASA-Ames Research Center, and the University of California Remote Sensing Research Program. One objective of the program was to study performance when data processing is done on a microprocessor/mainframe network under operational conditions. The 1985 test covered the hardware, software, and network specifications and the integration of these three components. Plans for the year - including planned completion of PEDITOR software, testing of software on MIDAS, and accomplishment of data processing on the MIDAS-VAX-CRAY network - are discussed briefly.
IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009
2011-03-01
capabilities to our GDM. We also introduced software as a service ( SaaS ) as part our technology solutions and have further enhanced our ability to...model PROSPER Infosys production support methodology Q&P quality and productivity R&D research and development SaaS software as a service ... Software Development Life Cycle (SDLC) 23 Table 10: Scientific Estimation Coverage by Service Line 27 CMU/SEI-2011-TR-008 | vi CMU/SEI-2011
The instrumental genesis process in future primary teachers using Dynamic Geometry Software
NASA Astrophysics Data System (ADS)
Ruiz-López, Natalia
2018-05-01
This paper, which describes a study undertaken with pairs of future primary teachers using GeoGebra software to solve geometry problems, includes a brief literature review, the theoretical framework and methodology used. An analysis of the instrumental genesis process for a pair participating in the case study is also provided. This analysis addresses the techniques and types of dragging used, the obstacles to learning encountered, a description of the interaction between the pair and their interaction with the teacher, and the type of language used. Based on this analysis, possibilities and limitations of the instrumental genesis process are identified for the development of geometric competencies such as conjecture creation, property checking and problem researching. It is also suggested that the methodology used in the analysis of the problem solving process may be useful for those teachers and researchers who want to integrate Dynamic Geometry Software (DGS) in their classrooms.
The Kepler Science Data Processing Pipeline Source Code Road Map
NASA Technical Reports Server (NTRS)
Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima;
2016-01-01
We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including, the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.
The Effects of Beacons, Comments, and Tasks on Program Comprehension Process in Software Maintenance
ERIC Educational Resources Information Center
Fan, Quyin
2010-01-01
Program comprehension is the most important and frequent process in software maintenance. Extensive research has found that individual characteristics of programmers, differences of computer programs, and differences of task-driven motivations are the major factors that affect the program comprehension results. There is no study specifically…
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
Development of a Real-Time General-Purpose Digital Signal Processing Laboratory System.
1983-12-01
should serve several important purposes: to familiarize students with the use of common DSP tools in an instructional environment, to serve as a research ...of Dayton Research Institute researchers for DSP software and DSP system design insight. 3. Formulation of statement of requirements for development...Neither the University of Dayton nor its Research Institute have a DSP computer system. While UD offered no software or DSP system design information
Requirements Engineering in Building Climate Science Software
NASA Astrophysics Data System (ADS)
Batcheller, Archer L.
Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the software team or users have control and responsibility for making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users, but gives them less control of it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.
Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research
Degenhart, Alan D.; Kelly, John W.; Ashmore, Robin C.; Collinger, Jennifer L.; Tyler-Kabara, Elizabeth C.; Weber, Douglas J.; Wang, Wei
2011-01-01
This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development. PMID:21687575
Craniux: a LabVIEW-based modular software framework for brain-machine interface research.
Degenhart, Alan D; Kelly, John W; Ashmore, Robin C; Collinger, Jennifer L; Tyler-Kabara, Elizabeth C; Weber, Douglas J; Wang, Wei
2011-01-01
This paper presents "Craniux," an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development.
PSGMiner: A modular software for polysomnographic analysis.
Umut, İlhan
2016-06-01
Sleep disorders affect a great percentage of the population. The diagnosis of these disorders is usually made by polysomnography. This paper details the development of new software to carry out feature extraction in order to perform robust analysis and classification of sleep events using polysomnographic data. The software, called PSGMiner, is a tool, which visualizes, processes and classifies bioelectrical data. The purpose of this program is to provide researchers with a platform with which to test new hypotheses by creating tests to check for correlations that are not available in commercially available software. The software is freely available under the GPL3 License. PSGMiner is composed of a number of diverse modules such as feature extraction, annotation, and machine learning modules, all of which are accessible from the main module. Using the software, it is possible to extract features of polysomnography using digital signal processing and statistical methods and to perform different analyses. The features can be classified through the use of five classification algorithms. PSGMiner offers an architecture designed for integrating new methods. Automatic scoring, which is available in almost all commercial PSG software, is not inherently available in this program, though it can be implemented by two different methodologies (machine learning and algorithms). While similar software focuses on a certain signal or event composed of a small number of modules with no expansion possibility, the software introduced here can handle all polysomnographic signals and events. The software simplifies the processing of polysomnographic signals for researchers and physicians that are not experts in computer programming. It can find correlations between different events which could help predict an oncoming event such as sleep apnea. The software could also be used for educational purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.
How to choose the right statistical software?-a method increasing the post-purchase satisfaction.
Cavaliere, Roberto
2015-12-01
Nowadays, we live in the "data era" where the use of statistical or data analysis software is inevitable, in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay the right attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice still depends on few factors like, for instance, researcher's personal inclination, e.g., which software have been used at the university or is already known. This is not wrong in principle, but in some cases it's not enough at all and might lead to a "dead end" situation, typically after months or years of investments already done on the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and introduce an extended range of factors which can help to undertake the right choice, at least in potential. There is not enough literature about that topic, most of the time underestimated, both in the traditional literature and even in the so called "gray literature", even if some documents or short pages can be found online. Anyhow, it seems there is not a common and known standpoint about the process of software evaluation from the final user perspective. We suggests a multi-factor analysis leading to an evaluation matrix tool, to be intended as a flexible and customizable tool, aimed to provide a clearer picture of the software alternatives available, not in abstract but related to the researcher's own context and needs. This method is a result of about twenty years of experience of the author in the field of evaluating and using technical-computing software and partially arises from a research made about such topics as part of a project funded by European Commission under the Lifelong Learning Programme 2011.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Software Quality Assurance Metrics
NASA Technical Reports Server (NTRS)
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures conformance of software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, it can save time, money, and allow the organization to identify the caused of defects which have the greatest effect on software development. The summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research and collect, compile, and analyze SQA Metrics that have been used in other projects that are not currently being used by the SA team and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
Software Process Assurance for Complex Electronics
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bare a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques were used to detect missing requirements and bugs earlier in the development cycle creating a development process for CE that was more easily maintained, consistent and configurable based on the device used.
The validation by measurement theory of proposed object-oriented software metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1994-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics (Li and Henry, 1993; Chidamber and Kemerrer, 1994; Lorenz and Kidd, 1994).
Read, Sue; Nte, Sol; Corcoran, Patsy; Stephens, Richard
2013-05-01
Loss is a universal experience and death is perceived as the ultimate loss. The overarching aim of this research is to produce a qualitative, flexible, interactive, computerised tool to support the facilitation of emotional expressions around loss for people with intellectual disabilities. This paper explores the process of using Participatory Action Research (PAR) to develop this tool. Participator Action Research provided the indicative framework for the process of developing a software tool that is likely to be used in practice. People with intellectual disability worked alongside researchers to produce an accessible, flexible piece of software that can facilitate storytelling around loss and bereavement and promote spontaneous expression that can be shared with others. This tool has the capacity to enable individuals to capture experiences in a storyboard format; that can be stored; is easily retrievable; can be printed out; and could feasibly be personalised by the insertion of photographs. © 2012 Blackwell Publishing Ltd.
Canary: An NLP Platform for Clinicians and Researchers.
Malmasi, Shervin; Sandor, Nicolae L; Hosomura, Naoshi; Goldberg, Matt; Skentzos, Stephen; Turchin, Alexander
2017-05-03
Information Extraction methods can help discover critical knowledge buried in the vast repositories of unstructured clinical data. However, these methods are underutilized in clinical research, potentially due to the absence of free software geared towards clinicians with little technical expertise. The skills required for developing/using such software constitute a major barrier for medical researchers wishing to employ these methods. To address this, we have developed Canary, a free and open-source solution designed for users without natural language processing (NLP) or software engineering experience. It was designed to be fast and work out of the box via a user-friendly graphical interface.
Knowledge-based processing for aircraft flight control
NASA Technical Reports Server (NTRS)
Painter, John H.; Glass, Emily; Economides, Gregory; Russell, Paul
1994-01-01
This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative, and Hybid. Software architectures were sought, employing the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common over a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered that particular processing methods from the stochastic and knowledge-based worlds are duals, that is, similar in a broad context. They provide architectural design concepts which serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. This research was applied to the control of a subsonic transport aircraft in the airport terminal area.
Evaluation of interaction dynamics of concurrent processes
NASA Astrophysics Data System (ADS)
Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas
2017-03-01
The purpose of this paper is to present the wavelet tools that enable the detection of temporal interactions of concurrent processes. In particular, the determination of interaction coherence of time-varying signals is achieved using a complex continuous wavelet transform. This paper has used electrocardiogram (ECG) and seismocardiogram (SCG) data set to show multiple continuous wavelet analysis techniques based on Morlet wavelet transform. MATLAB Graphical User Interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can discover the interaction dynamics of time-varying signals, hence they can reveal their correlation in phase and amplitude, as well as their non-linear interconnections. The user-friendly MATLAB GUI enables effective use of the developed software what enables to load two processes under investigation, make choice of the required processing parameters, and then perform the analysis. The software developed is a useful tool for researchers who have a need for investigation of interaction dynamics of concurrent processes.
Towards a mature measurement environment: Creating a software engineering research environment
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1990-01-01
Software engineering researchers are building tools, defining methods, and models; however, there are problems with the nature and style of the research. The research is typically bottom-up, done in isolation so the pieces cannot be easily logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other prices of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation or refined and tailored to the application environment. Thus, it cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole and the relationships between various models of the process and product. What is needed is a top down experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view, and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.
Doing Qualitative Research Using Your Computer: A Practical Guide
ERIC Educational Resources Information Center
Hahn, Chris
2008-01-01
This book is a practical, hands-on guide to using commonly available everyday technology, including Microsoft software, to manage and streamline research projects. It uses straight-forward, everyday language to walk readers through this process, drawing on a wide range of examples to demonstrate how easy it is to use such software. This guide is…
Image Understanding Architecture
1991-09-01
architecture to support real-time, knowledge -based image understanding , and develop the software support environment that will be needed to utilize...NUMBER OF PAGES Image Understanding Architecture, Knowledge -Based Vision, AI Real-Time Computer Vision, Software Simulator, Parallel Processor IL PRICE... information . In addition to sensory and knowledge -based processing it is useful to introduce a level of symbolic processing. Thus, vision researchers
Automation software for a materials testing laboratory
NASA Technical Reports Server (NTRS)
Mcgaw, Michael A.; Bonacuse, Peter J.
1990-01-01
The software environment in use at the NASA-Lewis Research Center's High Temperature Fatigue and Structures Laboratory is reviewed. This software environment is aimed at supporting the tasks involved in performing materials behavior research. The features and capabilities of the approach to specifying a materials test include static and dynamic control mode switching, enabling multimode test control; dynamic alteration of the control waveform based upon events occurring in the response variables; precise control over the nature of both command waveform generation and data acquisition; and the nesting of waveform/data acquisition strategies so that material history dependencies may be explored. To eliminate repetitive tasks in the coventional research process, a communications network software system is established which provides file interchange and remote console capabilities.
A study of software standards used in the avionics industry
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.
1994-01-01
Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulating agencies, no one methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics is being investigated. This study involves the generation of a controlled environment to conduct scientific experiments on software processes.
Software control and system configuration management: A systems-wide approach
NASA Technical Reports Server (NTRS)
Petersen, K. L.; Flores, C., Jr.
1984-01-01
A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2017-12-01
Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and the need to reduce duplication of effort and to sustain community codes. CIG curates software across 6 domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many are also accessible through the CIG community repository on github. CIG has now developed abc - attribution builder for citation to enable software users to give credit to software developers. abc uses zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then requested the primary developers to verify. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered is based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual as developers are forward-looking, rarely willing to go back and archive prior releases in zenodo. Going forward all actively developed packages will utilize the zenodo and github integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and assigning roles to software remain open issues.
Platform-independent software for medical image processing on the Internet
NASA Astrophysics Data System (ADS)
Mancuso, Michael E.; Pathak, Sayan D.; Kim, Yongmin
1997-05-01
We have developed a software tool for image processing over the Internet. The tool is a general purpose, easy to use, flexible, platform independent image processing software package with functions most commonly used in medical image processing.It provides for processing of medical images located wither remotely on the Internet or locally. The software was written in Java - the new programming language developed by Sun Microsystems. It was compiled and tested using Microsoft's Visual Java 1.0 and Microsoft's Just in Time Compiler 1.00.6211. The software is simple and easy to use. In order to use the tool, the user needs to download the software from our site before he/she runs it using any Java interpreter, such as those supplied by Sun, Symantec, Borland or Microsoft. Future versions of the operating systems supplied by Sun, Microsoft, Apple, IBM, and others will include Java interpreters. The software is then able to access and process any image on the iNternet or on the local computer. Using a 512 X 512 X 8-bit image, a 3 X 3 convolution took 0.88 seconds on an Intel Pentium Pro PC running at 200 MHz with 64 Mbytes of memory. A window/level operation took 0.38 seconds while a 3 X 3 median filter took 0.71 seconds. These performance numbers demonstrate the feasibility of using this software interactively on desktop computes. Our software tool supports various image processing techniques commonly used in medical image processing and can run without the need of any specialized hardware. It can become an easily accessible resource over the Internet to promote the learning and of understanding image processing algorithms. Also, it could facilitate sharing of medical image databases and collaboration amongst researchers and clinicians, regardless of location.
Development of a case tool to support decision based software development
NASA Technical Reports Server (NTRS)
Wild, Christian J.
1993-01-01
A summary of the accomplishments of the research over the past year are presented. Achievements include: made demonstrations with DHC, a prototype supporting decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery and the porting process model; completed and submitted a paper describing DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, facility necessary for a better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in the process of solving the reengineering problem, and developed a primary chart with the problems involved by the reengineering process.
The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.
Zamawe, F C
2015-03-01
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) are increasingly being developed. Although the CAQDAS has been there for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with CAQDAS. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS (NVivo) in order to provide evidence-based implications of using the software. The key message is that unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS are basically data management packages, which support the researcher during analysis.
The Consumer Juggernaut: Web-Based and Mobile Applications as Innovation Pioneer
NASA Astrophysics Data System (ADS)
Messerschmitt, David G.
As happened previously in electronics, software targeted at consumers is increasingly the focus of investment and innovation. Some of the areas where it is leading is animated interfaces, treating users as a community, audio and video information, software as a service, agile software development, and the integration of business models with software design. As a risk-taking and experimental market, and as a source of ideas, consumer software can benefit other areas of applications software. The influence of consumer software can be magnified by research into the internal organizations and processes of the innovative firms at its foundation.
A design methodology for portable software on parallel computers
NASA Technical Reports Server (NTRS)
Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.
1993-01-01
This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. Trying to keep one's software on the current high performance hardware, a software developer almost continually faces yet another expensive software transportation. The problem of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The list of research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured the performance of a portion of this subsystem on the Intel iPSC/2 parallel computer. These results are provided in section four. Our future work is summarized in section five, our acknowledgements are stated in section six, and references for published papers associated with NAG-1-995 are provided in section seven.
Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.
Clinical records anonymisation and text extraction (CRATE): an open-source software system.
Cardinal, Rudolf N
2017-04-26
Electronic medical records contain information of value for research, but contain identifiable and often highly sensitive confidential information. Patient-identifiable information cannot in general be shared outside clinical care teams without explicit consent, but anonymisation/de-identification allows research uses of clinical data without explicit consent. This article presents CRATE (Clinical Records Anonymisation and Text Extraction), an open-source software system with separable functions: (1) it anonymises or de-identifies arbitrary relational databases, with sensitivity and precision similar to previous comparable systems; (2) it uses public secure cryptographic methods to map patient identifiers to research identifiers (pseudonyms); (3) it connects relational databases to external tools for natural language processing; (4) it provides a web front end for research and administrative functions; and (5) it supports a specific model through which patients may consent to be contacted about research. Creation and management of a research database from sensitive clinical records with secure pseudonym generation, full-text indexing, and a consent-to-contact process is possible and practical using entirely free and open-source software.
Providing structural modules with self-integrity monitoring software user's manual
NASA Technical Reports Server (NTRS)
1990-01-01
National Aeronautics and Space Administration (NASA) Contract NAS7-961 (A Small Business Innovation and Research (SBIR) contract from NASA) involved research dealing with remote structural damage detection using the concept of substructures. Several approaches were developed. The main two were: (1) the module (substructure) transfer function matrix (MTFM) approach; and (2) modal strain energy distribution method (MSEDM). Either method can be used with a global structure; however, the focus was on substructures. As part of the research contract, computer software was to be developed which would implement the developed methods. This was done and it was used to process all the finite element generated numerical data for the research. The software was written for the IBM AT personal computer. Copies of it were placed on floppy disks. This report serves as a user's manual for the two sets of damage detection software. Sections 2.0 and 3.0 discuss the use of the MTFM and MSEDM software, respectively.
ERIC Educational Resources Information Center
Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla
2014-01-01
Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…
S-Cube: Enabling the Next Generation of Software Services
NASA Astrophysics Data System (ADS)
Metzger, Andreas; Pohl, Klaus
The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and servicebased systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories as well as methods and tools for future service-based systems.
Reproducible Research in the Geosciences at Scale: Achievable Goal or Elusive Dream?
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.
2016-12-01
Reproducibility is a fundamental tenant of the scientific method: it implies that any researcher, or a third party working independently, can duplicate any experiment or investigation and produce the same results. Historically computationally based research involved an individual using their own data and processing it in their own private area, often using software they wrote or inherited from close collaborators. Today, a researcher is likely to be part of a large team that will use a subset of data from an external repository and then process the data on a public or private cloud or on a large centralised supercomputer, using a mixture of their own code, third party software and libraries, or global community codes. In 'Big Geoscience' research it is common for data inputs to be extracts from externally managed dynamic data collections, where new data is being regularly appended, or existing data is revised when errors are detected and/or as processing methods are improved. New workflows increasingly use services to access data dynamically to create subsets on-the-fly from distributed sources, each of which can have a complex history. At major computational facilities, underlying systems, libraries, software and services are being constantly tuned and optimised, or as new or replacement infrastructure being installed. Likewise code used from a community repository is continually being refined, re-packaged and ported to the target platform. To achieve reproducibility, today's researcher increasingly needs to track their workflow, including querying information on the current or historical state of facilities used. Versioning methods are standard practice for software repositories or packages, but it is not common for either data repositories or data services to provide information about their state, or for systems to provide query-able access to changes in the underlying software. While a researcher can achieve transparency and describe steps in their workflow so that others can repeat them and replicate processes undertaken, they cannot achieve exact reproducibility or even transparency of results generated. In Big Geoscience, full reproducibiliy will be an elusive dream until data repositories and compute facilities can provide provenance information in a standards compliant, machine query-able way.
GCS plan for software aspects of certification
NASA Technical Reports Server (NTRS)
Shagnea, Anita M.; Lowman, Douglas S.; Withers, B. Edward
1990-01-01
As part of the Guidance and Control Software (GCS) research project being sponsored by NASA to evaluate the failure processes of software, standard industry software development procedures are being employed. To ensure that these procedures are authentic, the guidelines outlined in the Radio Technical Commission for Aeronautics (RTCA/DO-178A document entitled, software considerations in airborne systems and equipment certification, were adopted. A major aspect of these guidelines is proper documentation. As such, this report, the plan for software aspects of certification, was produced in accordance with DO-178A. An overview is given of the GCS research project, including the goals of the project, project organization, and project schedules. It also specifies the plans for all aspects of the project which relate to the certification of the GCS implementations developed under a NASA contract. These plans include decisions made regarding the software specification, accuracy requirements, configuration management, implementation development and verification, and the development of the GCS simulator.
Idea Paper: The Lifecycle of Software for Scientific Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dubey, Anshu; McInnes, Lois C.
The software lifecycle is a well researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of development approaches employed by the teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific applicationmore » software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development processes and provides needed lifecycle guidance to the scientific software community.« less
Developing software to use parallel processing effectively. Final report, June-December 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Center, J.
1988-10-01
This report describes the difficulties involved in writing efficient parallel programs and describes the hardware and software support currently available for generating software that utilizes processing effectively. Historically, the processing rate of single-processor computers has increased by one order of magnitude every five years. However, this pace is slowing since electronic circuitry is coming up against physical barriers. Unfortunately, the complexity of engineering and research problems continues to require ever more processing power (far in excess of the maximum estimated 3 Gflops achievable by single-processor computers). For this reason, parallel-processing architectures are receiving considerable interest, since they offer high performancemore » more cheaply than a single-processor supercomputer, such as the Cray.« less
Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center
NASA Astrophysics Data System (ADS)
Berger, Thomas
2016-07-01
The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations and with the validation activities required. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve over time the operational capabilities. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.
ERIC Educational Resources Information Center
Greenberg, Richard
1998-01-01
Describes the Image Processing for Teaching (IPT) project which provides digital image processing to excite students about science and mathematics as they use research-quality software on microcomputers. Provides information on IPT whose components of this dissemination project have been widespread teacher education, curriculum-based materials…
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring it's manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division including the software, hardware, network and mode of operation. The proposed direction of research included a network design, computers to be used, software to be used, machine to computer connections, estimate a timeline for implementation, and a cost estimate. Recommendation for the division's improvement include action to be taken, software to utilize, and computer configurations.
NASA Technical Reports Server (NTRS)
1997-01-01
Real-Time Innovations, Inc. (RTI) collaborated with Ames Research Center, the Jet Propulsion Laboratory and Stanford University to leverage NASA research to produce ControlShell software. RTI is the first "graduate" of Ames Research Center's Technology Commercialization Center. The ControlShell system was used extensively on a cooperative project to enhance the capabilities of a Russian-built Marsokhod rover being evaluated for eventual flight to Mars. RTI's ControlShell is complex, real-time command and control software, capable of processing information and controlling mechanical devices. One ControlShell tool is StethoScope. As a real-time data collection and display tool, StethoScope allows a user to see how a program is running without changing its execution. RTI has successfully applied its software savvy in other arenas, such as telecommunications, networking, video editing, semiconductor manufacturing, automobile systems, and medical imaging.
Adapting Rational Unified Process (RUP) approach in designing a secure e-Tendering model
NASA Astrophysics Data System (ADS)
Mohd, Haslina; Robie, Muhammad Afdhal Muhammad; Baharom, Fauziah; Darus, Norida Muhd; Saip, Mohamed Ali; Yasin, Azman
2016-08-01
e-Tendering is an electronic processing of the tender document via internet and allow tenderer to publish, communicate, access, receive and submit all tender related information and documentation via internet. This study aims to design the e-Tendering system using Rational Unified Process approach. RUP provides a disciplined approach on how to assign tasks and responsibilities within the software development process. RUP has four phases that can assist researchers to adjust the requirements of various projects with different scope, problem and the size of projects. RUP is characterized as a use case driven, architecture centered, iterative and incremental process model. However the scope of this study only focusing on Inception and Elaboration phases as step to develop the model and perform only three of nine workflows (business modeling, requirements, analysis and design). RUP has a strong focus on documents and the activities in the inception and elaboration phases mainly concern the creation of diagrams and writing of textual descriptions. The UML notation and the software program, Star UML are used to support the design of e-Tendering. The e-Tendering design based on the RUP approach can contribute to e-Tendering developers and researchers in e-Tendering domain. In addition, this study also shows that the RUP is one of the best system development methodology that can be used as one of the research methodology in Software Engineering domain related to secured design of any observed application. This methodology has been tested in various studies in certain domains, such as in Simulation-based Decision Support, Security Requirement Engineering, Business Modeling and Secure System Requirement, and so forth. As a conclusion, these studies showed that the RUP one of a good research methodology that can be adapted in any Software Engineering (SE) research domain that required a few artifacts to be generated such as use case modeling, misuse case modeling, activity diagram, and initial class diagram from a list of requirements as identified earlier by the SE researchers
Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software
NASA Astrophysics Data System (ADS)
Gowda, P. H.; Moorhead, J.; Brauer, D. K.
2017-12-01
Evapotranspiration (ET) is a major component of the hydrologic cycle. ET data are used for a variety of water management and research purposes such as irrigation scheduling, water and crop modeling, streamflow, water availability, and many more. Remote sensing products have been widely used to create spatially representative ET data sets which provide important information from field to regional scales. As UAV capabilities increase, remote sensing use is likely to also increase. For that purpose, scientists at the USDA-ARS research laboratory in Bushland, TX developed the Bushland Evapotranspiration and Agricultural Remote Sensing System (BEARS) software. The BEARS software is a Java based software that allows users to process remote sensing data to generate ET outputs using predefined models, or enter custom equations and models. The capability to define new equations and build new models expands the applicability of the BEARS software beyond ET mapping to any remote sensing application. The software also includes an image viewing tool that allows users to visualize outputs, as well as draw an area of interest using various shapes. This software is freely available from the USDA-ARS Conservation and Production Research Laboratory website.
Development of a Software Safety Process and a Case Study of Its Use
NASA Technical Reports Server (NTRS)
Knight, J. C.
1996-01-01
Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
Measuring the complexity of design in real-time imaging software
NASA Astrophysics Data System (ADS)
Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.
2007-02-01
Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than non-image processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and nonimage processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provides a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
NASA Astrophysics Data System (ADS)
Chu, X.
2011-12-01
This study, funded by the NSF CAREER program, focuses on developing new methods to quantify microtopography-controlled overland flow processes and integrating the cutting-edge hydrologic research with all-level education and outreach activities. To achieve the educational goal, an interactive teaching-learning software package has been developed. This software, with enhanced visualization capabilities, integrates the new modeling techniques, computer-guided learning processes, and education-oriented tools in a user-friendly interface. Both Windows-based and web-based versions have been developed. The software is specially designed for three major user levels: elementary level (Level 1: K-12 and outreach education), medium level (Level 2: undergraduate education), and advanced level (Level 3: graduate education). Depending on the levels, users are guided to different educational systems. Each system consists of a series of mini "libraries" featured with movies, pictures, and documentation that cover fundamental theories, varying scale experiments, and computer modeling of overland flow generation, surface runoff, and infiltration processes. Testing and practical use of this educational software in undergraduate and graduate teaching demonstrate its effectiveness to promote students' learning and interest in hydrologic sciences. This educational software also has been used as a hydrologic demonstration tool for K-12 students and Native American students through the Nurturing American Tribal Undergraduate Research Education (NATURE) program and Science, Technology, Engineering and Mathematics (STEM) outreach activities.
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Syed, Hazari I.
1995-01-01
This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user friendly graphical interface application for the Macintosh 2 and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.
Connecting Research and Practice: An Experience Report on Research Infusion with SAVE
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Stratton, William C.; Sibol, Deane E.; Ackermann, Christopher; Reid, W. Mark; Ganesan, Dharmalingam; McComas, David; Bartholomew, Maureen; Godfrey, Sally
2009-01-01
NASA systems need to be highly dependable to avoid catastrophic mission failures. This calls for rigorous engineering processes including meticulous validation and verification. However, NASA systems are often highly distributed and overwhelmingly complex, making the software portion of these systems challenging to understand, maintain, change, reuse, and test. NASA's systems are long-lived and the software maintenance process typically constitutes 60-80% of the total cost of the entire lifecycle. Thus, in addition to the technical challenges of ensuring high life-time quality of NASA's systems, the post-development phase also presents a significant financial burden. Some of NASA's software-related challenges could potentially be addressed by some of the many powerful technologies that are being developed in software research laboratories. Many of these research technologies seek to facilitate maintenance and evolution by for example architecting, designing and modeling for quality, flexibility, and reuse. Other technologies attempt to detect and remove defects and other quality issues by various forms of automated defect detection, architecture analysis, and various forms of sophisticated simulation and testing. However promising, most such research technologies nevertheless do not make the transition from the research lab to the software lab. One reason the transition from research to practice seldom occurs is that research infusion and technology transfer is difficult. For example, factors related to the technology are sometimes overshadowed by other types of factors such as reluctance to change and therefore prohibits the technology from sticking. Successful infusion might also take very long time. One famous study showed that the discrepancy between the conception of the idea and its practical use was 18 years plus or minus three. Nevertheless, infusing new technology is possible. We have found that it takes special circumstances for such research infusion to succeed: 1) there must be evidence that the technology works in the practitioner's particular domain, 2) there must be a potential for great improvements and enhanced competitive edge for the practitioner, 3) the practitioner has to have strong individual curiosity and continuous interest in trying out new technologies, 4) the practitioner has to have support on multiple levels (i.e. from the researchers, from management, from sponsors etc), and 5) to remain infused, the new technology has to be integrated into the practitioner's processes so that it becomes a natural part of the daily work. NASA IV&V's Research Infusion initiative sponsored by NASA's Office of Safety & Mission Assurance (OSMA) through the Software Assurance Research Program (SARP), strives to overcome some of the problems related to research infusion.
Software Process Assurance for Complex Electronics (SPACE)
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bare a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques can be used to detect missing requirements and bugs earlier in the development cycle creating a development process for CE that will be more easily maintained, consistent and configurable based on the device used.
Gaining Control and Predictability of Software-Intensive Systems Development and Sustainment
2015-02-04
implementation of the baselines, audits , and technical reviews within an overarching systems engineering process (SEP; Defense Acquisition University...warfighters’ needs. This management and metrics effort supplements and supports the system’s technical development through the baselines, audits and...other areas that could be researched and added into the nine-tier model. Areas including software metrics, quality assurance , software-oriented
The Birth, Death, and Resurrection of an SPI Project
NASA Astrophysics Data System (ADS)
Carlsson, Sven; Schönström, Mikael
Commentators on contemporary themes of strategic management and firm competitiveness stress that a firm's competitive advantage flows from its unique knowledge and how it manages knowledge, and for many firms their ability to create, share, exchange, and use knowledge have a major impact on their competitiveness (Nonaka & Teece 2001). In software development, knowledge management (KM) plays an increasingly important role. It has been argued that the KM-field is an important source for creating new perspectives on the software development process (Iivari 2000). Several Software Process Improvement (SPI) approaches stress the importance of managing knowledge and experiences as a way for improving software processes (Ahem et al. 2001). Another SPI-trend is the use of ideas from process management like in the Capability Maturity Model (CMM). Unfortunately, little research on the effects of the use of process management ideas in SPI exists. Given the influx of process management ideas to SPI, the impact of these ideas should be addressed.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses process improvement, quality management and analytical techniques taught to students in U.S. colleges and universities undergraduate and graduate systems engineering and the computing science discipline (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle processes needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process-performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI RTM) Models as process improvement frameworks to improve business processes performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process-performance; reduce cost and predict future outcomes. The research study identifies and provides a detail discussion of the gap analysis findings of process improvement and quantitative analysis techniques taught in U.S. universities systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients " of a process performance model and courses taught in U.S. universities degree program. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
The Production Data Approach for Full Lifecycle Management
NASA Astrophysics Data System (ADS)
Schopf, J.
2012-04-01
The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of those resources that are electronically archived. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spent at least 1/5th of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we're seeing this even to a greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to building production quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically-produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, either as a collection source, as someone tagging data, or someone re-using it, many of the lessons learned in building production software are applicable. Table 1 shows a comparison of production software elements to production data elements. Table 1: Comparison of production software and production data. Production Software Production Data End-user considerations End-user considerations Multiple Coders: Repository with check-in procedures Coding standards Multiple producers/collectors Local archive with check-in procedure Metadata Standards Formal testing Formal testing Bug tracking and fixes Bug tracking and fixes, QA/QC Documentation Documentation Formal Release Process Formal release process to external archive License Citation/usage statement The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.
Alternatives for Developing User Documentation for Applications Software
1991-09-01
style that is designed to match adult reading behaviors, using reader-based writing techniques, developing effective graphics , creating reference aids...involves research, analysis, design , and testing. The writer must have a solid understanding of the technical aspects of the document being prepared, good...ABSTRACT The preparation of software documentation is an iterative process that involves research, analysis, design , and testing. The writer must have
From Exotic to Mainstream: A 10-year Odyssey from Internet Speed to Boundary Spanning with Scrum
NASA Astrophysics Data System (ADS)
Baskerville, Richard; Pries-Heje, Jan; Madsen, Sabine
Based on four empirical studies conducted over a 10-year time period from 1999 to 2008 we investigate how local software processes interact with global changes in the software development context. In 1999 companies were developing software at high speed in a desperate rush to be first-to-market. In 2001 a new high speed/quick results development process had become established practice. In 2003 changes in the market created the need for a more balanced view on speed and quality, and in 2008 companies were successfully combining agile and plan driven approaches to achieve the benefits of both. The studies reveal a twostage pattern in which dramatic changes in the market causes disruption of established practices, experimentation, and process adaptations followed by consolidation of lessons learnt into a new (and once again mature) software development process. Limitations, implications, and areas for future research are discussed.
NASA Astrophysics Data System (ADS)
Sadchikova, G. M.
2017-01-01
This article discusses the results of the introduction of computer-aided design NX by Siemens Plm Software to the classes of a higher education institution. The necessity of application of modern information technologies in teaching students of engineering profile and selection of a software product is substantiated. The author describes stages of the software module study in relation to some specific courses, considers the features of NX software, which require the creation of standard and unified product databases. The article also gives examples of research carried out by the students with the various software modules.
Color graphics, interactive processing, and the supercomputer
NASA Technical Reports Server (NTRS)
Smith-Taylor, Rudeen
1987-01-01
The development of a common graphics environment for the NASA Langley Research Center user community and the integration of a supercomputer into this environment is examined. The initial computer hardware, the software graphics packages, and their configurations are described. The addition of improved computer graphics capability to the supercomputer, and the utilization of the graphic software and hardware are discussed. Consideration is given to the interactive processing system which supports the computer in an interactive debugging, processing, and graphics environment.
NASA Technical Reports Server (NTRS)
1998-01-01
Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.
The connectome mapper: an open-source processing pipeline to map connectomes with MRI.
Daducci, Alessandro; Gerhard, Stephan; Griffa, Alessandra; Lemkaddem, Alia; Cammoun, Leila; Gigandet, Xavier; Meuli, Reto; Hagmann, Patric; Thiran, Jean-Philippe
2012-01-01
Researchers working in the field of global connectivity analysis using diffusion magnetic resonance imaging (MRI) can count on a wide selection of software packages for processing their data, with methods ranging from the reconstruction of the local intra-voxel axonal structure to the estimation of the trajectories of the underlying fibre tracts. However, each package is generally task-specific and uses its own conventions and file formats. In this article we present the Connectome Mapper, a software pipeline aimed at helping researchers through the tedious process of organising, processing and analysing diffusion MRI data to perform global brain connectivity analyses. Our pipeline is written in Python and is freely available as open-source at www.cmtk.org.
Automation of experimental research of waveguide paths induction soldering
NASA Astrophysics Data System (ADS)
Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.
2018-05-01
The article presents an automated system of experimental studies of the waveguide paths induction soldering process. The system is a part of additional software for a complex of automated control of the technological process of induction soldering of thin-walled waveguide paths from aluminum alloys, expanding its capabilities. The structure of the software product, the general appearance of the controls and the potential application possibilities are presented. The utility of the developed application by approbation in a series of field experiments was considered and justified. The application of the experimental research system makes it possible to improve the process under consideration, providing the possibility of fine-tuning the control regulators, as well as keeping the statistics of the soldering process in a convenient form for analysis.
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
The (mis)use of subjective process measures in software engineering
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.
1993-01-01
A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing of subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
Text Information Extraction System (TIES) | Informatics Technology for Cancer Research (ITCR)
TIES is a service based software system for acquiring, deidentifying, and processing clinical text reports using natural language processing, and also for querying, sharing and using this data to foster tissue and image based research, within and between institutions.
GSC configuration management plan
NASA Technical Reports Server (NTRS)
Withers, B. Edward
1990-01-01
The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.
Selecting information technology for physicians' practices: a cross-sectional study.
Eden, Karen Beekman
2002-04-05
Many physicians are transitioning from paper to electronic formats for billing, scheduling, medical charts, communications, etc. The primary objective of this research was to identify the relationship (if any) between the software selection process and the office staff's perceptions of the software's impact on practice activities. A telephone survey was conducted with office representatives of 407 physician practices in Oregon who had purchased information technology. The respondents, usually office managers, answered scripted questions about their selection process and their perceptions of the software after implementation. Multiple logistic regression revealed that software type, selection steps, and certain factors influencing the purchase were related to whether the respondents felt the software improved the scheduling and financial analysis practice activities. Specifically, practices that selected electronic medical record or practice management software, that made software comparisons, or that considered prior user testimony as important were more likely to have perceived improvements in the scheduling process than were other practices. Practices that considered value important, that did not consider compatibility important, that selected managed care software, that spent less than 10,000 dollars, or that provided learning time (most dramatic increase in odds ratio, 8.2) during implementation were more likely to perceive that the software had improved the financial analysis process than were other practices. Perhaps one of the most important predictors of improvement was providing learning time during implementation, particularly when the software involves several practice activities. Despite this importance, less than half of the practices reported performing this step.
NASA Astrophysics Data System (ADS)
Basri, Shuib; O'Connor, Rory V.
This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies were undertaken with a collection of VSEs. A twin track approach of a qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) were undertaken. Data analysis was being completed separately and the final results were merged, using the coding mechanisms of grounded theory. This paper serves as a roadmap for both researchers wishing to understand the issues of process standards adoption by very small companies and also for the software process standards community.
The use of Virtual Analogy Simulation (VAS) in physics learning
NASA Astrophysics Data System (ADS)
Faizin, M. Noor; Samsudin, A.
2018-05-01
The purpose of this research is to explore the use of VAS software in electrical dynamic learning in junior high student, so as to obtain an overview of this software consistency in help students build a scientific conception. This research was administered via research and Development (R & D) with the design of embedded experimental models. The respondents which were involved in this research were 60 students of ninth grade in one of junior high schools in Kudus central java. The improving process of students’ concept is examined based on normalized gain analysis from pretest and posttest scores. The result of this research shows that there was difference between learning using conventional learning (power point software) with VAS software. VAS is more effective to assist students in understanding the electrical dynamic concept shown with N-gain of 0.36, or 36 % were included in the medium category, whereas the conventional learning with N-gain of 0.28, or 28%.
Neugebauer, Tomasz; Bordeleau, Eric; Burrus, Vincent; Brzezinski, Ryszard
2015-01-01
Data visualization methods are necessary during the exploration and analysis activities of an increasingly data-intensive scientific process. There are few existing visualization methods for raw nucleotide sequences of a whole genome or chromosome. Software for data visualization should allow the researchers to create accessible data visualization interfaces that can be exported and shared with others on the web. Herein, novel software developed for generating DNA data visualization interfaces is described. The software converts DNA data sets into images that are further processed as multi-scale images to be accessed through a web-based interface that supports zooming, panning and sequence fragment selection. Nucleotide composition frequencies and GC skew of a selected sequence segment can be obtained through the interface. The software was used to generate DNA data visualization of human and bacterial chromosomes. Examples of visually detectable features such as short and long direct repeats, long terminal repeats, mobile genetic elements, heterochromatic segments in microbial and human chromosomes, are presented. The software and its source code are available for download and further development. The visualization interfaces generated with the software allow for the immediate identification and observation of several types of sequence patterns in genomes of various sizes and origins. The visualization interfaces generated with the software are readily accessible through a web browser. This software is a useful research and teaching tool for genetics and structural genomics.
Methods for cost estimation in software project management
NASA Astrophysics Data System (ADS)
Briciu, C. V.; Filip, I.; Indries, I. I.
2016-02-01
The speed in which the processes used in software development field have changed makes it very difficult the task of forecasting the overall costs for a software project. By many researchers, this task has been considered unachievable, but there is a group of scientist for which this task can be solved using the already known mathematical methods (e.g. multiple linear regressions) and the new techniques as genetic programming and neural networks. The paper presents a solution for building a model for the cost estimation models in the software project management using genetic algorithms starting from the PROMISE datasets related COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating the overall project costs is presented together with the description of the existing software development process models. In the last part, a basic proposal of a mathematical model of a genetic programming is proposed including here the description of the chosen fitness function and chromosome representation. The perspective of model described it linked with the current reality of the software development considering as basis the software product life cycle and the current challenges and innovations in the software development area. Based on the author's experiences and the analysis of the existing models and product lifecycle it was concluded that estimation models should be adapted with the new technologies and emerging systems and they depend largely by the chosen software development method.
Software Accelerates Computing Time for Complex Math
NASA Technical Reports Server (NTRS)
2014-01-01
Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology- traditionally used for computer video games-to develop high-computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.
Technology for Manufacturing Efficiency
NASA Technical Reports Server (NTRS)
1995-01-01
The Ground Processing Scheduling System (GPSS) was developed by Ames Research Center, Kennedy Space Center and divisions of the Lockheed Company to maintain the scheduling for preparing a Space Shuttle Orbiter for a mission. Red Pepper Software Company, now part of PeopleSoft, Inc., commercialized the software as their ResponseAgent product line. The software enables users to monitor manufacturing variables, report issues and develop solutions to existing problems.
Information models of software productivity - Limits on productivity growth
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1992-01-01
Research into generalized information-metric models of software process productivity establishes quantifiable behavior and theoretical bounds. The models establish a fundamental mathematical relationship between software productivity and the human capacity for information traffic, the software product yield (system size), information efficiency, and tool and process efficiencies. An upper bound is derived that quantifies average software productivity and the maximum rate at which it may grow. This bound reveals that ultimately, when tools, methodologies, and automated assistants have reached their maximum effective state, further improvement in productivity can only be achieved through increasing software reuse. The reuse advantage is shown not to increase faster than logarithmically in the number of reusable features available. The reuse bound is further shown to be somewhat dependent on the reuse policy: a general 'reuse everything' policy can lead to a somewhat slower productivity growth than a specialized reuse policy.
An Aerodynamic Simulation Process for Iced Lifting Surfaces and Associated Issues
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Vickerman, Mary B.; Hackenberg, Anthony W.; Rigby, David L.
2003-01-01
This paper discusses technologies and software tools that are being implemented in a software toolkit currently under development at NASA Glenn Research Center. Its purpose is to help study the effects of icing on airfoil performance and assist with the aerodynamic simulation process which consists of characterization and modeling of ice geometry, application of block topology and grid generation, and flow simulation. Tools and technologies for each task have been carefully chosen based on their contribution to the overall process. For the geometry characterization and modeling, we have chosen an interactive rather than automatic process in order to handle numerous ice shapes. An Appendix presents features of a software toolkit developed to support the interactive process. Approaches taken for the generation of block topology and grids, and flow simulation, though not yet implemented in the software, are discussed with reasons for why particular methods are chosen. Some of the issues that need to be addressed and discussed by the icing community are also included.
Software Writing Skills for Your Research - Lessons Learned from Workshops in the Geosciences
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin
2016-04-01
Findings presented in scientific papers are based on data and software. Once in a while they come along with data - but not commonly with software. However, the software used to gain findings plays a crucial role in the scientific work. Nevertheless, software is rarely seen publishable. Thus researchers may not reproduce the findings without the software which is in conflict with the principle of reproducibility in sciences. For both, the writing of publishable software and the reproducibility issue, the quality of software is of utmost importance. For many programming scientists the treatment of source code, e.g. with code design, version control, documentation, and testing is associated with additional work that is not covered in the primary research task. This includes the adoption of processes following the software development life cycle. However, the adoption of software engineering rules and best practices has to be recognized and accepted as part of the scientific performance. Most scientists have little incentive to improve code and do not publish code because software engineering habits are rarely practised by researchers or students. Software engineering skills are not passed on to followers as for paper writing skill. Thus it is often felt that the software or code produced is not publishable. The quality of software and its source code has a decisive influence on the quality of research results obtained and their traceability. So establishing best practices from software engineering to serve scientific needs is crucial for the success of scientific software. Even though scientists use existing software and code, i.e., from open source software repositories, only few contribute their code back into the repositories. So writing and opening code for Open Science means that subsequent users are able to run the code, e.g. by the provision of sufficient documentation, sample data sets, tests and comments which in turn can be proven by adequate and qualified reviews. This assumes that scientist learn to write and release code and software as they learn to write and publish papers. Having this in mind, software could be valued and assessed as a contribution to science. But this requires the relevant skills that can be passed to colleagues and followers. Therefore, the GFZ German Research Centre for Geosciences performed three workshops in 2015 to address the passing of software writing skills to young scientists, the next generation of researchers in the Earth, planetary and space sciences. Experiences in running these workshops and the lessons learned will be summarized in this presentation. The workshops have received support and funding by Software Carpentry, a volunteer organization whose goal is to make scientists more productive, and their work more reliable, by teaching them basic computing skills, and by FOSTER (Facilitate Open Science Training for European Research), a two-year, EU-Funded (FP7) project, whose goal to produce a European-wide training programme that will help to incorporate Open Access approaches into existing research methodologies and to integrate Open Science principles and practice in the current research workflow by targeting the young researchers and other stakeholders.
Collected software engineering papers, volume 9
NASA Technical Reports Server (NTRS)
1991-01-01
This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.
Research in software allocation for advanced manned mission communications and tracking systems
NASA Technical Reports Server (NTRS)
Warnagiris, Tom; Wolff, Bill; Kusmanoff, Antone
1990-01-01
An assessment of the planned processing hardware and software/firmware for the Communications and Tracking System of the Space Station Freedom (SSF) was performed. The intent of the assessment was to determine the optimum distribution of software/firmware in the processing hardware for maximum throughput with minimum required memory. As a product of the assessment process an assessment methodology was to be developed that could be used for similar assessments of future manned spacecraft system designs. The assessment process was hampered by changing requirements for the Space Station. As a result, the initial objective of determining the optimum software/firmware allocation was not fulfilled, but several useful conclusions and recommendations resulted from the assessment. It was concluded that the assessment process would not be completely successful for a system with changing requirements. It was also concluded that memory requirements and hardware requirements were being modified to fit as a consequence of the change process, and although throughput could not be quantitized, potential problem areas could be identified. Finally, inherent flexibility of the system design was essential for the success of a system design with changing requirements. Recommendations resulting from the assessment included development of common software for some embedded controller functions, reduction of embedded processor requirements by hardwiring some Orbital Replacement Units (ORUs) to make better use of processor capabilities, and improvement in communications between software development personnel to enhance the integration process. Lastly, a critical observation was made regarding the software integration tasks did not appear to be addressed in the design process to the degree necessary for successful satisfaction of the system requirements.
Ground control station software design for micro aerial vehicles
NASA Astrophysics Data System (ADS)
Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej
2017-08-01
This article describes the process of designing the equipment part and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the works were conducted on a quadrocopter model being a commonly accessible commercial construction. This article contains a characteristics of the research object, the basics of operating the micro aerial vehicles (MAV) and presents components of the ground control station model. It also describes the communication standards for the purpose of building a model of the station. Further part of the work concerns the software of the product - the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions and communication and control processes from the UAV. The process of creating the software and the field tests of a station model are also presented in the article.
Parallel Processing with Digital Signal Processing Hardware and Software
NASA Technical Reports Server (NTRS)
Swenson, Cory V.
1995-01-01
The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.
Bridging the Qualitative/Quantitative Software Divide
Annechino, Rachelle; Antin, Tamar M. J.; Lee, Juliet P.
2011-01-01
To compare and combine qualitative and quantitative data collected from respondents in a mixed methods study, the research team developed a relational database to merge survey responses stored and analyzed in SPSS and semistructured interview responses stored and analyzed in the qualitative software package ATLAS.ti. The process of developing the database, as well as practical considerations for researchers who may wish to use similar methods, are explored. PMID:22003318
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azevedo, S.G.; Fitch, J.P.
1987-10-21
Conventional software interfaces that use imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal-processing software (SIG). As an alternative, ''functional language'' interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal LISP Environment (ISLE) is an example of an interpreted functional language interface based on common LISP. Advantages of ISLE include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligence (AI)more » software. 10 refs.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azevedo, S.G.; Fitch, J.P.
1987-05-01
Conventional software interfaces which utilize imperative computer commands or menu interactions are often restrictive environments when used for researching new algorithms or analyzing processed experimental data. We found this to be true with current signal processing software (SIG). Existing ''functional language'' interfaces provide features such as command nesting for a more natural interaction with the data. The Image and Signal Lisp Environment (ISLE) will be discussed as an example of an interpreted functional language interface based on Common LISP. Additional benefits include multidimensional and multiple data-type independence through dispatching functions, dynamic loading of new functions, and connections to artificial intelligencemore » software.« less
Research a Novel Integrated and Dynamic Multi-object Trade-Off Mechanism in Software Project
NASA Astrophysics Data System (ADS)
Jiang, Weijin; Xu, Yuhui
Aiming at practical requirements of present software project management and control, the paper presented to construct integrated multi-object trade-off model based on software project process management, so as to actualize integrated and dynamic trade-oil of the multi-object system of project. Based on analyzing basic principle of dynamic controlling and integrated multi-object trade-off system process, the paper integrated method of cybernetics and network technology, through monitoring on some critical reference points according to the control objects, emphatically discussed the integrated and dynamic multi- object trade-off model and corresponding rules and mechanism in order to realize integration of process management and trade-off of multi-object system.
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
Generalized implementation of software safety policies
NASA Technical Reports Server (NTRS)
Knight, John C.; Wika, Kevin G.
1994-01-01
As part of a research program in the engineering of software for safety-critical systems, we are performing two case studies. The first case study, which is well underway, is a safety-critical medical application. The second, which is just starting, is a digital control system for a nuclear research reactor. Our goal is to use these case studies to permit us to obtain a better understanding of the issues facing developers of safety-critical systems, and to provide a vehicle for the assessment of research ideas. The case studies are not based on the analysis of existing software development by others. Instead, we are attempting to create software for new and novel systems in a process that ultimately will involve all phases of the software lifecycle. In this abstract, we summarize our results to date in a small part of this project, namely the determination and classification of policies related to software safety that must be enforced to ensure safe operation. We hypothesize that this classification will permit a general approach to the implementation of a policy enforcement mechanism.
Experimental research control software system
NASA Astrophysics Data System (ADS)
Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.
2014-05-01
A software system, intended for automation of a small scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of that development is to increase experiment automation easiness, thus significantly reducing experimental setup automation efforts. In particular, minimal programming skills are required and supervisors have no reviewing troubles. Interactions between scripts and equipment are managed automatically, thus allowing to run multiple scripts simultaneously. Unlike well-known data acquisition commercial software systems, the control is performed by an imperative scripting language. This approach eases complex control and data acquisition algorithms implementation. A modular interface library performs interaction with external interfaces. While most widely used interfaces are already implemented, a simple framework is developed for fast implementations of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of a helium-3 cryostat control and data acquisition. The software is open source and distributed under Gnu Public License.
The advanced software development workstation project
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Pitman, Charles L.
1991-01-01
The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.
Framework Support For Knowledge-Based Software Development
NASA Astrophysics Data System (ADS)
Huseth, Steve
1988-03-01
The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.
For operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
Carmon, J. L.
1983-01-01
During the month of June, the Survey Research Center (SRC) at the University of Georgia designed new benefits questionnaires for computer software management and information center (COSMIC). As a test of their utility, these questionnaires are now used in the benefits identification process.
Agile Methods for Open Source Safety-Critical Software
Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-01-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago, we present our experiences as a case study for renewing the discussion. PMID:21799545
Agile Methods for Open Source Safety-Critical Software.
Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John
2011-08-01
The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the rightamount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago, we present our experiences as a case study for renewing the discussion.
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors: First, many existing ultrasound pipelines are protected by intellectual property, rendering exchange of code difficult. Second, most pipelines are implemented in special hardware, resulting in limited flexibility of implemented processing steps on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and regarding its run time. The pipeline shows image quality comparable to a clinical system and backed by point spread function measurements a comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up the research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radiofrequency data), it simplifies the development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.
Smartphones in ecology and evolution: a guide for the app-rehensive.
Teacher, Amber G F; Griffiths, David J; Hodgson, David J; Inger, Richard
2013-12-01
Smartphones and their apps (application software) are now used by millions of people worldwide and represent a powerful combination of sensors, information transfer, and computing power that deserves better exploitation by ecological and evolutionary researchers. We outline the development process for research apps, provide contrasting case studies for two new research apps, and scan the research horizon to suggest how apps can contribute to the rapid collection, interpretation, and dissemination of data in ecology and evolutionary biology. We emphasize that the usefulness of an app relies heavily on the development process, recommend that app developers are engaged with the process at the earliest possible stage, and commend efforts to create open-source software scaffolds on which customized apps can be built by nonexperts. We conclude that smartphones and their apps could replace many traditional handheld sensors, calculators, and data storage devices in ecological and evolutionary research. We identify their potential use in the high-throughput collection, analysis, and storage of complex ecological information.
NASA/WVU Software Research Laboratory, 1995
NASA Technical Reports Server (NTRS)
Sabolish, George J.; Callahan, John R.
1995-01-01
In our second year, the NASA/WVU Software Research Lab has made significant strides toward analysis and solution of major software problems related to V&V activities. We have established working relationships with many ongoing efforts within NASA and continue to provide valuable input into policy and decision-making processes. Through our publications, technical reports, lecture series, newsletters, and resources on the World-Wide-Web, we provide information to many NASA and external parties daily. This report is a summary and overview of some of our activities for the past year. This report is divided into 6 chapters: Introduction, People, Support Activities, Process, Metrics, and Testing. The Introduction chapter (this chapter) gives an overview of our project beginnings and targets. The People chapter focuses on new people who have joined the Lab this year. The Support chapter briefly lists activities like our WWW pages, Technical Report Series, Technical Lecture Series, and Research Quarterly newsletter. Finally, the remaining four chapters discuss the major research areas that we have made significant progress towards producing meaningful task reports. These chapters can be regarded as portions of drafts of our task reports.
LUMA: A many-core, Fluid-Structure Interaction solver based on the Lattice-Boltzmann Method
NASA Astrophysics Data System (ADS)
Harwood, Adrian R. G.; O'Connor, Joseph; Sanchez Muñoz, Jonathan; Camps Santasmasas, Marta; Revell, Alistair J.
2018-01-01
The Lattice-Boltzmann Method at the University of Manchester (LUMA) project was commissioned to build a collaborative research environment in which researchers of all abilities can study fluid-structure interaction (FSI) problems in engineering applications from aerodynamics to medicine. It is built on the principles of accessibility, simplicity and flexibility. The LUMA software at the core of the project is a capable FSI solver with turbulence modelling and many-core scalability as well as a wealth of input/output and pre- and post-processing facilities. The software has been validated and several major releases benchmarked on supercomputing facilities internationally. The software architecture is modular and arranged logically using a minimal amount of object-orientation to maintain a simple and accessible software.
An evaluation of software tools for the design and development of cockpit displays
NASA Technical Reports Server (NTRS)
Ellis, Thomas D., Jr.
1993-01-01
The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.
Using a web-based survey tool to undertake a Delphi study: application for nurse education research.
Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M
2013-11-01
The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, the SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches including high quality data collection, ease and speed of survey administration, direct communication with the panel and rapid collation of feedback allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Allard, R.; Mack, B.; Bayoumi, M. M.
1989-01-01
Most robot systems lack a suitable hardware and software environment for the efficient research of new control and sensing schemes. Typically, engineers and researchers need to be experts in control, sensing, programming, communication and robotics in order to implement, integrate and test new ideas in a robot system. In order to reduce this time, the Robot Controller Test Station (RCTS) has been developed. It uses a modular hardware and software architecture allowing easy physical and functional reconfiguration of a robot. This is accomplished by emphasizing four major design goals: flexibility, portability, ease of use, and ease of modification. An enhanced distributed processing version of RCTS is described. It features an expanded and more flexible communication system design. Distributed processing results in the availability of more local computing power and retains the low cost of microprocessors. A large number of possible communication, control and sensing schemes can therefore be easily introduced and tested, using the same basic software structure.
Reuse-Driven Software Processes Guidebook. Version 02.00.03
1993-11-01
a required sys - tem without unduly constraining the details of the solution. The Naval Research Laboratory Software Cost Reduction project developed...conventional manner. The emphasis is still on the development of "one-of-a-kind" sys - tems and the phased completion and review of corresponding...Application Engineering to improve the life-cycle productivity of Sy - 21 OVM ftrdauntals of Syatbes the total software development enterprise. The
Survivability as a Tool for Evaluating Open Source Software
2015-06-01
the thesis limited the program development, so it is only able to process project issues (bugs or feature requests), which is an important metric for...Ideally, these insights may provide an analytic framework to generate guidance for decision makers that may support the inclusion of OSS to more...refine their efforts to build quality software and to strengthen their software development communities. 1.4 Research Questions This thesis addresses
Evolving software reengineering technology for the emerging innovative-competitive era
NASA Technical Reports Server (NTRS)
Hwang, Phillip Q.; Lock, Evan; Prywes, Noah
1994-01-01
This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state of the art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.
An image-processing software package: UU and Fig for optical metrology applications
NASA Astrophysics Data System (ADS)
Chen, Lujie
2013-06-01
Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier Transform [2], digital image correlation [3], camera calibration [4], etc, in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet; few are catered for the relatively special area of optical metrology. This paper introduces an image-processing software package: UU (data processing) and Fig (data rendering) that incorporates many useful functions to process optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, it has been tested on Windows, Linux and Mac OS. The userinterface is designed to offer precise control of the underline processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate the interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparably stable and accurate to most of the commercial software of similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, fitting of data, 3D image processing, vector image processing, precision device control (rotary stage, PZT stage, etc), point cloud to surface reconstruction, volume rendering, batch processing, etc. The software package is currently used in a number of universities for teaching and research.
OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization
NASA Astrophysics Data System (ADS)
Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian
2018-03-01
OIPAV (Ophthalmic Images Processing, Analysis and Visualization) is a cross-platform software which is specially oriented to ophthalmic images. It provides a wide range of functionalities including data I/O, image processing, interaction, ophthalmic diseases detection, data analysis and visualization to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color photo of fundus, etc. It enables users to easily access to different ophthalmic image data manufactured from different imaging devices, facilitate workflows of processing ophthalmic images and improve quantitative evaluations. In this paper, we will present the system design and functional modules of the platform and demonstrate various applications. With a satisfying function scalability and expandability, we believe that the software can be widely applied in ophthalmology field.
Proceedings of the Eighteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1993-01-01
The workshop provided a forum for software practitioners from around the world to exchange information on the measurement, use, and evaluation of software methods, models, and tools. This year, approximately 450 people attended the workshop, which consisted of six sessions on the following topics: the Software Engineering Laboratory, measurement, technology assessment, advanced concepts, process, and software engineering issues in NASA. Three presentations were given in each of the topic areas. The content of those presentations and the research papers detailing the work reported are included in these proceedings. The workshop concluded with a tutorial session on how to start an Experience Factory.
Reuse of Software Assets for the NASA Earth Science Decadal Survey Missions
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Downs, Robert R.; Marshall, James J.; Most, Neal F.; Samadi, Shahin
2010-01-01
Software assets from existing Earth science missions can be reused for the new decadal survey missions that are being planned by NASA in response to the 2007 Earth Science National Research Council (NRC) Study. The new missions will require the development of software to curate, process, and disseminate the data to science users of interest and to the broader NASA mission community. In this paper, we discuss new tools and a blossoming community that are being developed by the Earth Science Data System (ESDS) Software Reuse Working Group (SRWG) to improve capabilities for reusing NASA software assets.
Development of software for geodynamic processes monitoring system
NASA Astrophysics Data System (ADS)
Kabanov, M. M.; Kapustin, S. N.; Gordeev, V. F.; Botygin, I. A.; Tartakovsky, V. A.
2017-11-01
This article justifies the usage of natural pulsed electromagnetic Earth's noises logging method for mapping anomalies of strain-stress state of Earth's crust. The methods and technologies for gathering, processing and systematization of data gathered by ground multi-channel geophysical loggers for monitoring geomagnetic situation have been experimentally tested, and software had been developed. The data was consolidated in a network storage and can be accessed without using any specialized client software. The article proposes ways to distinguish global and regional small-scale time-space variations of Earth's natural electromagnetic field. For research purposes, the software provides a way to export data for any given period of time for any loggers and displays measurement data charts for selected set of stations.
User Driven Development of Software Tools for Open Data Discovery and Exploration
NASA Astrophysics Data System (ADS)
Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa
2016-04-01
The use of open data in research faces challenges not restricted to inherent properties such as data quality, resolution of open data sets. Often Open data is catalogued insufficiently or fragmented. Software tools that support the effective discovery including the assessment of the data's appropriateness for research have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end users requirements in the development process of aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration we have used state of the art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.
Cross Sectional Study of Agile Software Development Methods and Project Performance
ERIC Educational Resources Information Center
Lambert, Tracy
2011-01-01
Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…
Measuring the Impact of Agile Coaching on Students' Performance
ERIC Educational Resources Information Center
Rodríguez, Guillermo; Soria, Álvaro; Campo, Marcelo
2016-01-01
Nowadays, considerable attention is paid to agile methods as a means to improve management of software development processes. The widespread use of such methods in professional contexts has encouraged their integration into software engineering training and undergraduate courses. Although several research efforts have focused on teaching Scrum…
Data collection and evaluation for experimental computer science research
NASA Technical Reports Server (NTRS)
Zelkowitz, Marvin V.
1983-01-01
The Software Engineering Laboratory was monitoring software development at NASA Goddard Space Flight Center since 1976. The data collection activities of the Laboratory and some of the difficulties of obtaining reliable data are described. In addition, the application of this data collection process to a current prototyping experiment is reviewed.
Development of a software safety process and a case study of its use
NASA Technical Reports Server (NTRS)
Knight, John C.
1993-01-01
The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer soft ware for generation of statistic analysis has transformed health information and data to simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey therefore was carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons of above 5 years experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; knowledge and application of computer generated software for data processing and stastistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously, which were used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fall within those with 5-10 years of clinical experience out of which none has completed the specialist training programme. Practitioners with above 10 years clinical experiences were 33 (33/62; 53.2%) out of which 15 (15/33; 45.5%) are specialists representing 24.2% (15/62) of the total number of respondents. All the 15 specialists are actively involved in research activities and only five (5/15; 33.3%) can utilize software statistical analysis unaided. This study has i dentified poor utilization of computer software for statistic analysis among dental researchers in eastern Nigeria. This is strongly associated with lack of exposure on the use of these software early enough especially during the undergraduate training. This call for introduction of computer training programme in dental curriculum to enable practitioners develops the attitude of using computer software for their research.
NASA Astrophysics Data System (ADS)
Zelt, C. A.
2017-12-01
Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site, ideally maintained by someone in a funded position. Perhaps the biggest challenge is the reality that researches who use software, as opposed to develop software, are more attractive university hires because they are more likely to be "big picture" scientists that publish in the highest profile journals, although sometimes the two go together.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P-Mart was designed specifically to allow cancer researchers to perform robust statistical processing of publicly available cancer proteomic datasets. To date an online statistical processing suite for proteomics does not exist. The P-Mart software is designed to allow statistical programmers to utilize these algorithms through packages in the R programming language as well as offering a web-based interface using the Azure cloud technology. The Azure cloud technology also allows the release of the software via Docker containers.
Implementation of an open adoption research data management system for clinical studies.
Müller, Jan; Heiss, Kirsten Ingmar; Oberhoffer, Renate
2017-07-06
Research institutions need to manage multiple studies with individual data sets, processing rules and different permissions. So far, there is no standard technology that provides an easy to use environment to create databases and user interfaces for clinical trials or research studies. Therefore various software solutions are being used-from custom software, explicitly designed for a specific study, to cost intensive commercial Clinical Trial Management Systems (CTMS) up to very basic approaches with self-designed Microsoft ® databases. The technology applied to conduct those studies varies tremendously from study to study, making it difficult to evaluate data across various studies (meta-analysis) and keeping a defined level of quality in database design, data processing, displaying and exporting. Furthermore, the systems being used to collect study data are often operated redundantly to systems used in patient care. As a consequence the data collection in studies is inefficient and data quality may suffer from unsynchronized datasets, non-normalized database scenarios and manually executed data transfers. With OpenCampus Research we implemented an open adoption software (OAS) solution on an open source basis, which provides a standard environment for state-of-the-art research database management at low cost.
Technology in nursing scholarship: use of citation reference managers.
Smith, Cheryl M; Baker, Bradford
2007-06-01
Nurses, especially those in academia, feel the pressure to publish but have a limited time to write. One of the more time-consuming and frustrating tasks of research, and subsequent publications, is the collection and organization of accurate citations of sources of information. The purpose of this article is to discuss three types of citation reference managers (personal bibliographic software) and how their use can provide consistency and accuracy in recording all the information needed for the research and writing process. The advantages and disadvantages of three software programs, EndNote, Reference Manager, and ProCite, are discussed. These three software products have a variety of options that can be used in personal data management to assist researchers in becoming published authors.
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian- American project.
Statistical modeling of software reliability
NASA Technical Reports Server (NTRS)
Miller, Douglas R.
1992-01-01
This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
Adopting software quality measures for healthcare processes.
Yildiz, Ozkan; Demirörs, Onur
2009-01-01
In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. Case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access audibility, fault removal, completeness of documentation, and machine utilization are weak aspects and these aspects are the candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.
Testing Scientific Software: A Systematic Literature Review.
Kanewala, Upulee; Bieman, James M
2014-10-01
Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques.
Current trends in hardware and software for brain-computer interfaces (BCIs)
NASA Astrophysics Data System (ADS)
Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.
2011-04-01
A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.
How Digital Image Processing Became Really Easy
NASA Astrophysics Data System (ADS)
Cannon, Michael
1988-02-01
In the early and mid-1970s, digital image processing was the subject of intense university and corporate research. The research lay along two lines: (1) developing mathematical techniques for improving the appearance of or analyzing the contents of images represented in digital form, and (2) creating cost-effective hardware to carry out these techniques. The research has been very effective, as evidenced by the continued decline of image processing as a research topic, and the rapid increase of commercial companies to market digital image processing software and hardware.
Johnston, Sharon; Wong, Sabrina T; Blackman, Stephanie; Chau, Leena W; Grool, Anne M; Hogg, William
2017-11-16
Recruiting family physicians into primary care research studies requires researchers to continually manage information coming in, going out, and coming in again. In many research groups, Microsoft Excel and Access are the usual data management tools, but they are very basic and do not support any automation, linking, or reminder systems to manage and integrate recruitment information and processes. We explored whether a commercial customer relationship management (CRM) software program - designed for sales people in businesses to improve customer relations and communications - could be used to make the research recruitment system faster, more effective, and more efficient. We found that while there was potential for long-term studies, it simply did not adapt effectively enough for our shorter study and recruitment budget. The amount of training required to master the software and our need for ongoing flexible and timely support were greater than the benefit of using CRM software for our study.
1981-04-30
However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design . Supoorting programs were also conducted to perform basic research into such areas as software reliability, static and dynamic validation techniques...development. 0 Maintain requirements development independent of the target machine and the eventual software design . 0. Allow for easy response to
Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa
2018-01-01
Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user is presented. Provenance captured from analytics software can be large and complex and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into a health data analytics software, using the example of Atmolytics visual reporting tool. PMID:29888084
A general software reliability process simulation technique
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1991-01-01
The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
Agile Software Development in the Department of Defense Environment
2017-03-31
Research Methodology .............................................................................................. 17 Research Hypothesis...acquisition framework to enable greater adoption of Agile methodologies . Overview of the Research Methodology The strategy for this study was to...guidance. 17 Chapter 3 – Research Methodology This chapter defines the research methodology and processes used in the study, in an effort to
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1987-01-01
An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.
NASA Technical Reports Server (NTRS)
Allen, Bradley P.; Holtzman, Peter L.
1988-01-01
An overview is presented of the Automated Software Development Workstation Project, an effort to explore knowledge-based approaches to increasing software productivity. The project focuses on applying the concept of domain specific automatic programming systems (D-SAPSs) to application domains at NASA's Johnson Space Flight Center. A version of a D-SAPS developed in Phase 1 of the project for the domain of space station momentum management is described. How problems encountered during its implementation led researchers to concentrate on simplifying the process of building and extending such systems is discussed. Researchers propose to do this by attacking three observed bottlenecks in the D-SAPS development process through the increased automation of the acquisition of programming knowledge and the use of an object oriented development methodology at all stages of the program design. How these ideas are being implemented in the Bauhaus, a prototype workstation for D-SAPS development is discussed.
NASA Astrophysics Data System (ADS)
Zacharek, M.; Delis, P.; Kedzierski, M.; Fryskowska, A.
2017-05-01
These studies have been conductedusing non-metric digital camera and dense image matching algorithms, as non-contact methods of creating monuments documentation.In order toprocess the imagery, few open-source software and algorithms of generating adense point cloud from images have been executed. In the research, the OSM Bundler, VisualSFM software, and web application ARC3D were used. Images obtained for each of the investigated objects were processed using those applications, and then dense point clouds and textured 3D models were created. As a result of post-processing, obtained models were filtered and scaled.The research showedthat even using the open-source software it is possible toobtain accurate 3D models of structures (with an accuracy of a few centimeters), but for the purpose of documentation and conservation of cultural and historical heritage, such accuracy can be insufficient.
Theory and Practice Meets in Industrial Process Design -Educational Perspective-
NASA Astrophysics Data System (ADS)
Aramo-Immonen, Heli; Toikka, Tarja
Software engineer should see himself as a business process designer in enterprise resource planning system (ERP) re-engineering project. Software engineers and managers should have design dialogue. The objective of this paper is to discuss the motives to study the design research in connection of management education in order to envision and understand the soft human issues in the management context. Second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in industrial management domain. In the conceptual part of this paper are discussed concepts of network and project economy, creativity, communication, use of metaphors, and design thinking. Finally is introduced empirical research plan and first empirical results from design method experiments among the multi-disciplined groups of the master-level students of industrial engineering and management and software engineering.
2006-01-01
at TrialStat Corporation, a software company that develops applications for clinical research and drug development, and a senior consultant with...Research.” Proceedings of the conference on Designing interactive systems: processes, practices, methods, and techniques. New York, NY: ACM Press, 2000...Rogers 97] Rogers, Yvonne & Victoria. Bellotti. "Grounding Blue-Sky Research: How can Ethnography Help." ACM Interactions Magazine (May-June 1997
Optimizing IV and V for Mature Organizations
NASA Technical Reports Server (NTRS)
Fuhman, Christopher
2003-01-01
NASA is intending for its future software development agencies to have at least a Level 3 rating in the Carnegie Mellon University Capability Maturity Model (CMM). The CMM has built-in Verification and Validation (V&V) processes that support higher software quality. Independent Verification and Validation (IV&V) of software developed by mature agencies can be therefore more effective than for software developed by less mature organizations. How is Independent V&V different with respect to the maturity of an organization? Knowing a priori the maturity of an organization's processes, how can IV&V planners better identify areas of need choose IV&V activities, etc? The objective of this research is to provide a complementary set of guidelines and criteria to assist the planning of IV&V activities on a project using a priori knowledge of the measurable levels of maturity of the organization developing the software.
Characterizing and Assessing a Large-Scale Software Maintenance Organization
NASA Technical Reports Server (NTRS)
Briand, Lionel; Melo, Walcelio; Seaman, Carolyn; Basili, Victor
1995-01-01
One important component of a software process is the organizational context in which the process is enacted. This component is often missing or incomplete in current process modeling approaches. One technique for modeling this perspective is the Actor-Dependency (AD) Model. This paper reports on a case study which used this approach to analyze and assess a large software maintenance organization. Our goal was to identify the approach's strengths and weaknesses while providing practical recommendations for improvement and research directions. The AD model was found to be very useful in capturing the important properties of the organizational context of the maintenance process, and aided in the understanding of the flaws found in this process. However, a number of opportunities for extending and improving the AD model were identified. Among others, there is a need to incorporate quantitative information to complement the qualitative model.
A Software Platform for Post-Processing Waveform-Based NDE
NASA Technical Reports Server (NTRS)
Roth, Donald J.; Martin, Richard E.; Seebo, Jeff P.; Trinh, Long B.; Walker, James L.; Winfree, William P.
2007-01-01
Ultrasonic, microwave, and terahertz nondestructive evaluation imaging systems generally require the acquisition of waveforms at each scan point to form an image. For such systems, signal and image processing methods are commonly needed to extract information from the waves and improve resolution of, and highlight, defects in the image. Since some similarity exists for all waveform-based NDE methods, it would seem a common software platform containing multiple signal and image processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. This presentation describes NASA Glenn Research Center's approach in developing a common software platform for processing waveform-based NDE signals and images. This platform is currently in use at NASA Glenn and at Lockheed Martin Michoud Assembly Facility for processing of pulsed terahertz and ultrasonic data. Highlights of the software operation will be given. A case study will be shown for use with terahertz data. The authors also request scientists and engineers who are interested in sharing customized signal and image processing algorithms to contribute to this effort by letting the authors code up and include these algorithms in future releases.
Parallel-Processing Test Bed For Simulation Software
NASA Technical Reports Server (NTRS)
Blech, Richard; Cole, Gary; Townsend, Scott
1996-01-01
Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).
The effect of observing session duration on OPUS-RS results
NASA Astrophysics Data System (ADS)
Dincer Dogru, A.; Ugur Sanli, D.; Hayal, Adem G.; Berber, Mustafa
2016-04-01
Online GPS positioning software has now a widespread interest among practitioners and researchers. Researchers rescently use online software to monitor natural hazards such as landslides. The fact that this software usually employs continuously operating GPS stations of the International GNSS Service (IGS) as reference stations in the processing, the community of world-wide users is growing day by day. In the monitoring of landslides, rapid static mode of a GPS surveying is usually preferred because it is possible to have wider field coverage with only a few minutes of data and low cost ground markers. Results comparable to static positioning can be obtained with careful network design and processing strategies. Some online software such as OPUS-RS developed by the National Geodetic Survey (NGS) of the USA provides rapid static positioning engine that processes GPS data from sessions of only a few minutes. 15-minute is the recommended/standard observing session duration for OPUS-RS processing. In this study, using the CORS data operating in the US, we carried out some tests in which the observing session duration is changed from 8 through 118 minutes, and observed the accuracy change on the OPUS-RS solutions. Then we compared the results with the accuracy levels given for 15-min solutions by the NGS. We determined that there is the effect of changing observing session duration on the obtained results, and we report them in this study.
Writing in the Ether: A Collaborative Approach to Academic Research.
ERIC Educational Resources Information Center
Winograd, David; Milton, Katherine
The purpose of this paper is to shed light on the developmental stages of academic publication collaborations through both research on the collaborative process itself, as well as through analysis of the discovery process. Using the qualitative software package, NUD*IST, the teleconferencing system, FirstClass, and standard e-mail, the study…
Software tool for portal dosimetry research.
Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C
2008-09-01
This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include; i) read the MLC file and the PDIP from the TPS, ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves, iii) interpolate correction factors from look-up tables, iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file, v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
NASA Technical Reports Server (NTRS)
Mattox, J. R.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kanbach, G.; Kniffen, D. A.; Kwok, P. W.; Lin, Y. C.; Mayer-Hasselwander, H. A.
1992-01-01
We describe the Energetic Gamma Ray Experiment Telescope (EGRET) data products which we anticipate will suffice for virtually all guest and archival investigations. The production process, content, availability, format, and the associated software of each product is described. Supplied here is sufficient detail for each researcher to do analysis which is not supported by extant software.
Painting a picture across the landscape with ModelMap
Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino
2017-01-01
Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...
Integrated tools and techniques applied to the TES ground data system
NASA Technical Reports Server (NTRS)
Morrison, B. A.
2000-01-01
The author of this paper will dicuss the selection of CASE tools, a decision making process, requirements tracking and a review mechanism that leads to a highly integrated approach to software development that must deal with the constant pressure to change software requirements and design that is associated with research and development.
User Interactive Software for Analysis of Human Physiological Data
NASA Technical Reports Server (NTRS)
Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta
2006-01-01
Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analyses. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency- domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file format are easily imported to the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing) requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration
DOE Office of Scientific and Technical Information (OSTI.GOV)
AlperEker; Mark Giammattia; Paul Houpt
''Intelligent Extruder'' described in this report is a software system and associated support services for monitoring and control of compounding extruders to improve material quality, reduce waste and energy use, with minimal addition of new sensors or changes to the factory floor system components. Emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the :finishing'' stage of high value added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers materials. Themore » project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pleiderer, USA, a major manufacturer of compounding equipment. Scope of the program included development of a algorithms for monitoring process material viscosity without rheological sensors or generating waste streams, a novel detection scheme for rapid detection of process upsets and an adaptive feedback control system to compensate for process upsets where at line adjustments are feasible. Software algorithms were implemented and tested on a laboratory scale extruder (50 lb/hr) at GE Global Research and data from a production scale system (2000 lb/hr) at GE Plastics was used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.« less
Flexible control techniques for a lunar base
NASA Technical Reports Server (NTRS)
Kraus, Thomas W.
1992-01-01
The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.
Exploiting Software Tool Towards Easier Use And Higher Efficiency
NASA Astrophysics Data System (ADS)
Lin, G. H.; Su, J. T.; Deng, Y. Y.
2006-08-01
In developing countries, using data based on instrument made by themselves in maximum extent is very important. It is not only related to maximizing science returns upon prophase investment -- deep accumulations in every aspects but also science output. Based on the idea, we are exploiting a software (called THDP: Tool of Huairou Data Processing). It is used for processing a series of issues, which is met necessary in processing data. This paper discusses its designed purpose, functions, method and specialities. The primary vehicle for general data interpretation is through various techniques of data visualization, techniques of interactive. In the software, we employed Object Oriented approach. It is appropriate to the vehicle. it is imperative that the approach provide not only function, but do so in as convenient a fashion as possible. As result of the software exploiting, it is not only easier to learn data processing for beginner and more convenienter to need further improvement for senior but also increase greatly efficiency in every phrases include analyse, parameter adjusting, result display. Under frame of virtual observatory, for developing countries, we should study more and newer related technologies, which can advance ability and efficiency in science research, like the software we are developing
Past missions - the best way to train future planetary researchers
NASA Astrophysics Data System (ADS)
Kozlova, Natalia; Solodovnikova, Anastasiya; Zubarev, Anatoly; Garov, Andrey; Patraty, Vyacheslav; Kokhanov, Alexander; Karachevtseva, Irina; Nadezhdina, Irina; Konopikhin, Anatoly; Oberst, Juergen
2015-04-01
Practice shows that it is much more interesting and useful to learn from real examples than on imaginary tasks from exercise books. The more technologies and software improves and develops, the more information and new products can be obtained from new processing of archive information collected by past planetary missions. So at MIIGAiK we carry out modern processing of lunar panoramic images obtained by Soviet Lunokhod missions (1970-1973). During two years of the study, which is a part of PRoViDE project (http://www.provide-space.eu/), many students, PhD students, young scientists, as well as professors have taken part in this research. Processing of the data obtained so long ago requires development of specific methods, techniques, special software and extraordinary approach. All these points help to interest young people in planetary science and develop their skills as researchers. Another advantage of data from previous missions is that you can compare your results with the ones obtained during the mission. This also helps to test the developed techniques and software on real data and adjust them for implementation in future missions. The work on Lunokhod data processing became the basis of master and PhD theses of MIIGAiK students and scientists at MExLab. Acknowledgments: The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement No 312377 PRoViDE.
NASA Astrophysics Data System (ADS)
Moon, Hongsik
What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited by the increased computing power the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared with the performance using benchmark software and the metric was FLoting-point Operations Per Seconds (FLOPS) which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore system? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPs and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of the changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
A Data Accounting System for Clinical Investigators
Kashner, T. Michael; Hinson, Robert; Holland, Gloria J.; Mickey, Don D.; Hoffman, Keith; Lind, Lisa; Johnson, Linda D.; Chang, Barbara K.; Golden, Richard M.; Henley, Steven S.
2007-01-01
Clinical investigators often preprocess, process, and analyze their data without benefit of formally organized research centers to oversee data management. This article outlines a practical three-file structure to help guide these investigators track and document their data through processing and analyses. The proposed process can be implemented without additional training or specialized software. Thus, it is particularly well suited for research projects with small budgets or limited access to viable research/data coordinating centers. PMID:17460138
Digital Methodology to implement the ECOUTER engagement process.
Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J
2016-01-01
ECOUTER ( E mploying CO ncept u al schema for policy and T ranslation E in R esearch - French for 'to listen' - is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware software. The implementation of an ECOUTER process tailored to applications within health studies are outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement of a purpose built software for ECOUTER research purposes.
The eSMAF: a software for the assessment and follow-up of functional autonomy in geriatrics
Boissy, Patrick; Brière, Simon; Tousignant, Michel; Rousseau, Eric
2007-01-01
Background Functional status or disability forms the core of most assessment instruments used to identify mix and level of resources and services needed by older adults who possess common characteristics. The Functional Autonomy Measurement System (SMAF) is a 29-item scale measuring functional ability in five different areas. It has been recommended for use for home care, for allocation of chronic beds, for developing care plans in institutional settings and for epidemiological and evaluative studies. The SMAF can also be used with a case-mix classification system (Iso-SMAF) to allocate resources based on patients' functional autonomy characteristics. The objective of this project was to develop a software version of the SMAF to facilitate the evaluation of the functional status of older adults in health services research and to optimize the clinical decision-making process. Results The eSMAF was developed over an 24-month period using a modified waterfall software engineering process. Requirements and functional specifications were determined using focus groups of stakeholders. Different versions of the software were iteratively field-tested in clinical and research environments and software adaptations made accordingly. User documentation and online help were created to assist the deployment of the software. The software is available in French or English versions under a 30-day unregistered demonstration license or a free restricted registered academic license. It can be used locally on a Windows-based PC or over a network to input SMAF data into a database, search and aggregate client data according to clinical and/or administrative criteria, and generate summary or detailed reports of selected data sets for print or export to another database. Conclusion In the last year, the software has been successfully deployed in the clinical workflow of different institutions in research and clinical applications. The software performed relatively well in terms of stability and performance. Barriers to implementation included antiquated computer hardware, low computer literacy and access to IT support. Key factors for the deployment of the software included standardization of the workflow, user training and support. PMID:17298673
Emerging Uses of Computer Technology in Qualitative Research.
ERIC Educational Resources Information Center
Parker, D. Randall
The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…
Lin, Steve; Turgulov, Anuar; Taher, Ahmed; Buick, Jason E; Byers, Adam; Drennan, Ian R; Hu, Samantha; J Morrison, Laurie
2016-10-01
Cardiopulmonary resuscitation (CPR) process measures research and quality assurance has traditionally been limited to the first 5 minutes of resuscitation due to significant costs in time, resources, and personnel from manual data abstraction. CPR performance may change over time during prolonged resuscitations, which represents a significant knowledge gap. Moreover, currently available commercial software output of CPR process measures are difficult to analyze. The objective was to develop and validate a software program to help automate the abstraction and transfer of CPR process measures data from electronic defibrillators for complete episodes of cardiac arrest resuscitation. We developed a software program to facilitate and help automate CPR data abstraction and transfer from electronic defibrillators for entire resuscitation episodes. Using an intermediary Extensible Markup Language export file, the automated software transfers CPR process measures data (electrocardiogram [ECG] number, CPR start time, number of ventilations, number of chest compressions, compression rate per minute, compression depth per minute, compression fraction, and end-tidal CO 2 per minute). We performed an internal validation of the software program on 50 randomly selected cardiac arrest cases with resuscitation durations between 15 and 60 minutes. CPR process measures were manually abstracted and transferred independently by two trained data abstractors and by the automated software program, followed by manual interpretation of raw ECG tracings, treatment interventions, and patient events. Error rates and the time needed for data abstraction, transfer, and interpretation were measured for both manual and automated methods, compared to an additional independent reviewer. A total of 9,826 data points were each abstracted by the two abstractors and by the software program. Manual data abstraction resulted in a total of six errors (0.06%) compared to zero errors by the software program. The mean ± SD time measured per case for manual data abstraction was 20.3 ± 2.7 minutes compared to 5.3 ± 1.4 minutes using the software program (p = 0.003). We developed and validated an automated software program that efficiently abstracts and transfers CPR process measures data from electronic defibrillators for complete cardiac arrest episodes. This software will enable future cardiac arrest studies and quality assurance programs to evaluate the impact of CPR process measures during prolonged resuscitations. © 2016 by the Society for Academic Emergency Medicine.
The aerospace energy systems laboratory: Hardware and software implementation
NASA Technical Reports Server (NTRS)
Glover, Richard D.; Oneil-Rood, Nora
1989-01-01
For many years NASA Ames Research Center, Dryden Flight Research Facility has employed automation in the servicing of flight critical aircraft batteries. Recently a major upgrade to Dryden's computerized Battery Systems Laboratory was initiated to incorporate distributed processing and a centralized database. The new facility, called the Aerospace Energy Systems Laboratory (AESL), is being mechanized with iAPX86 and iAPX286 hardware running iRMX86. The hardware configuration and software structure for the AESL are described.
MRIVIEW: An interactive computational tool for investigation of brain structure and function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ranken, D.; George, J.
MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities.
A Digital Repository and Execution Platform for Interactive Scholarly Publications in Neuroscience.
Hodge, Victoria; Jessop, Mark; Fletcher, Martyn; Weeks, Michael; Turner, Aaron; Jackson, Tom; Ingram, Colin; Smith, Leslie; Austin, Jim
2016-01-01
The CARMEN Virtual Laboratory (VL) is a cloud-based platform which allows neuroscientists to store, share, develop, execute, reproduce and publicise their work. This paper describes new functionality in the CARMEN VL: an interactive publications repository. This new facility allows users to link data and software to publications. This enables other users to examine data and software associated with the publication and execute the associated software within the VL using the same data as the authors used in the publication. The cloud-based architecture and SaaS (Software as a Service) framework allows vast data sets to be uploaded and analysed using software services. Thus, this new interactive publications facility allows others to build on research results through reuse. This aligns with recent developments by funding agencies, institutions, and publishers with a move to open access research. Open access provides reproducibility and verification of research resources and results. Publications and their associated data and software will be assured of long-term preservation and curation in the repository. Further, analysing research data and the evaluations described in publications frequently requires a number of execution stages many of which are iterative. The VL provides a scientific workflow environment to combine software services into a processing tree. These workflows can also be associated with publications and executed by users. The VL also provides a secure environment where users can decide the access rights for each resource to ensure copyright and privacy restrictions are met.
Testing Scientific Software: A Systematic Literature Review
Kanewala, Upulee; Bieman, James M.
2014-01-01
Context Scientific software plays an important role in critical decision making, for example making weather predictions based on climate models, and computation of evidence for research publications. Recently, scientists have had to retract publications due to errors caused by software faults. Systematic testing can identify such faults in code. Objective This study aims to identify specific challenges, proposed solutions, and unsolved problems faced when testing scientific software. Method We conducted a systematic literature survey to identify and analyze relevant literature. We identified 62 studies that provided relevant information about testing scientific software. Results We found that challenges faced when testing scientific software fall into two main categories: (1) testing challenges that occur due to characteristics of scientific software such as oracle problems and (2) testing challenges that occur due to cultural differences between scientists and the software engineering community such as viewing the code and the model that it implements as inseparable entities. In addition, we identified methods to potentially overcome these challenges and their limitations. Finally we describe unsolved challenges and how software engineering researchers and practitioners can help to overcome them. Conclusions Scientific software presents special challenges for testing. Specifically, cultural differences between scientist developers and software engineers, along with the characteristics of the scientific software make testing more difficult. Existing techniques such as code clone detection can help to improve the testing process. Software engineers should consider special challenges posed by scientific software such as oracle problems when developing testing techniques. PMID:25125798
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
Software for rapid prototyping in the pharmaceutical and biotechnology industries.
Kappler, Michael A
2008-05-01
The automation of drug discovery methods continues to develop, especially techniques that process information, represent workflow and facilitate decision-making. The magnitude of data and the plethora of questions in pharmaceutical and biotechnology research give rise to the need for rapid prototyping software. This review describes the advantages and disadvantages of three solutions: Competitive Workflow, Taverna and Pipeline Pilot. Each of these systems processes large amounts of data, integrates diverse systems and assists novice programmers and human experts in critical decision-making steps.
Maintaining Research Documents with Database Management Software.
ERIC Educational Resources Information Center
Harrington, Stuart A.
1999-01-01
Discusses taking notes for research projects and organizing them into card files; reviews the literature on personal filing systems; introduces the basic process of database management; and offers a plan for managing research notes. Describes field groups and field definitions, data entry, and creating reports. (LRW)
Research and Development in Very Long Baseline Interferometry (VLBI)
NASA Technical Reports Server (NTRS)
Himwich, William E.
2004-01-01
Contents include the following: 1.Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
Computational knowledge integration in biopharmaceutical research.
Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim
2003-09-01
An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.
Computer programming for generating visual stimuli.
Bukhari, Farhan; Kurylo, Daniel D
2008-02-01
Critical to vision research is the generation of visual displays with precise control over stimulus metrics. Generating stimuli often requires adapting commercial software or developing specialized software for specific research applications. In order to facilitate this process, we give here an overview that allows nonexpert users to generate and customize stimuli for vision research. We first give a review of relevant hardware and software considerations, to allow the selection of display hardware, operating system, programming language, and graphics packages most appropriate for specific research applications. We then describe the framework of a generic computer program that can be adapted for use with a broad range of experimental applications. Stimuli are generated in the context of trial events, allowing the display of text messages, the monitoring of subject responses and reaction times, and the inclusion of contingency algorithms. This approach allows direct control and management of computer-generated visual stimuli while utilizing the full capabilities of modern hardware and software systems. The flowchart and source code for the stimulus-generating program may be downloaded from www.psychonomic.org/archive.
An intelligent CNC machine control system architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.J.; Loucks, C.S.
1996-10-01
Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications usingmore » platform-independent software.« less
NASA Technical Reports Server (NTRS)
Fournelle, John; Carpenter, Paul
2006-01-01
Modem electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
Maine Facility Research Summary : Dynamic Sign Systems for Narrow Bridges
DOT National Transportation Integrated Search
1997-09-01
This report describes the development of operational surveillance data processing algorithms and software for application to urban freeway systems, conforming to a framework in which data processing is performed in stages: sensor malfunction detectio...
ERIC Educational Resources Information Center
Yildiz, Avni; Baltaci, Serdal; Demir, Betül Küçük
2017-01-01
Creativity has a significant role in individuals' lives. This research aims to examine the reflection of the learning process of analytic geometry concepts through GeoGebra software and its effect upon the development of preservice mathematics teachers' creative thinking skills. This effect is expected to make a significant contribution to the…
Scout: high-performance heterogeneous computing made simple
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice
2011-01-26
Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To acbieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focusmore » on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing higb-performance by encapsulating software and hardware optimization within a compiler framework.« less
libdrdc: software standards library
NASA Astrophysics Data System (ADS)
Erickson, David; Peng, Tie
2008-04-01
This paper presents the libdrdc software standards library including internal nomenclature, definitions, units of measure, coordinate reference frames, and representations for use in autonomous systems research. This library is a configurable, portable C-function wrapped C++ / Object Oriented C library developed to be independent of software middleware, system architecture, processor, or operating system. It is designed to use the automatically-tuned linear algebra suite (ATLAS) and Basic Linear Algebra Suite (BLAS) and port to firmware and software. The library goal is to unify data collection and representation for various microcontrollers and Central Processing Unit (CPU) cores and to provide a common Application Binary Interface (ABI) for research projects at all scales. The library supports multi-platform development and currently works on Windows, Unix, GNU/Linux, and Real-Time Executive for Multiprocessor Systems (RTEMS). This library is made available under LGPL version 2.1 license.
Development of N-version software samples for an experiment in software fault tolerance
NASA Technical Reports Server (NTRS)
Lauterbach, L.
1987-01-01
The report documents the task planning and software development phases of an effort to obtain twenty versions of code independently designed and developed from a common specification. These versions were created for use in future experiments in software fault tolerance, in continuation of the experimental series underway at the Systems Validation Methods Branch (SVMB) at NASA Langley Research Center. The 20 versions were developed under controlled conditions at four U.S. universities, by 20 teams of two researchers each. The versions process raw data from a modified Redundant Strapped Down Inertial Measurement Unit (RSDIMU). The specifications, and over 200 questions submitted by the developers concerning the specifications, are included as appendices to this report. Design documents, and design and code walkthrough reports for each version, were also obtained in this task for use in future studies.
HEP Community White Paper on Software Trigger and Event Reconstruction: Executive Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albrecht, Johannes; et al.
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for softwaremore » and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.« less
HEP Community White Paper on Software Trigger and Event Reconstruction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Albrecht, Johannes; et al.
Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for softwaremore » and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.« less
Open Source Clinical NLP - More than Any Single System.
Masanz, James; Pakhomov, Serguei V; Xu, Hua; Wu, Stephen T; Chute, Christopher G; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP's mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice.
Towards Portable Large-Scale Image Processing with High-Performance Computing.
Huo, Yuankai; Blaber, Justin; Damon, Stephen M; Boyd, Brian D; Bao, Shunxing; Parvathaneni, Prasanna; Noguera, Camilo Bermudez; Chaganti, Shikha; Nath, Vishwesh; Greer, Jasmine M; Lyu, Ilwoo; French, William R; Newton, Allen T; Rogers, Baxter P; Landman, Bennett A
2018-05-03
High-throughput, large-scale medical image computing demands tight integration of high-performance computing (HPC) infrastructure for data storage, job distribution, and image processing. The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has constructed a large-scale image storage and processing infrastructure that is composed of (1) a large-scale image database using the eXtensible Neuroimaging Archive Toolkit (XNAT), (2) a content-aware job scheduling platform using the Distributed Automation for XNAT pipeline automation tool (DAX), and (3) a wide variety of encapsulated image processing pipelines called "spiders." The VUIIS CCI medical image data storage and processing infrastructure have housed and processed nearly half-million medical image volumes with Vanderbilt Advanced Computing Center for Research and Education (ACCRE), which is the HPC facility at the Vanderbilt University. The initial deployment was natively deployed (i.e., direct installations on a bare-metal server) within the ACCRE hardware and software environments, which lead to issues of portability and sustainability. First, it could be laborious to deploy the entire VUIIS CCI medical image data storage and processing infrastructure to another HPC center with varying hardware infrastructure, library availability, and software permission policies. Second, the spiders were not developed in an isolated manner, which has led to software dependency issues during system upgrades or remote software installation. To address such issues, herein, we describe recent innovations using containerization techniques with XNAT/DAX which are used to isolate the VUIIS CCI medical image data storage and processing infrastructure from the underlying hardware and software environments. The newly presented XNAT/DAX solution has the following new features: (1) multi-level portability from system level to the application level, (2) flexible and dynamic software development and expansion, and (3) scalable spider deployment compatible with HPC clusters and local workstations.
Fiji: an open-source platform for biological-image analysis.
Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert
2012-06-28
Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.
IUS/TUG orbital operations and mission support study. Volume 4: Project planning data
NASA Technical Reports Server (NTRS)
1975-01-01
Planning data are presented for the development phases of interim upper stage (IUS) and tug systems. Major project planning requirements, major event schedules, milestones, system development and operations process networks, and relevant support research and technology requirements are included. Topics discussed include: IUS flight software; tug flight software; IUS/tug ground control center facilities, personnel, data systems, software, and equipment; IUS mission events; tug mission events; tug/spacecraft rendezvous and docking; tug/orbiter operations interface, and IUS/orbiter operations interface.
A Survey On Management Of Software Engineering In Japan
NASA Astrophysics Data System (ADS)
Kadono, Yasuo; Tsubaki, Hiroe; Tsuruho, Seishiro
2008-05-01
The purpose of this study is to clarity the mechanism of how software engineering capabilities relate to the business performance of IT vendors in Japan. To do this, we developed a structural model using factors related to software engineering, business performance and competitive environment. By analyzing the data collected from 78 major IT vendors in Japan, we found that superior deliverables and business performance were correlated with the effort expended particularly on human resource development, quality assurance, research and development and process improvement.
Low-cost digital image processing at the University of Oklahoma
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.
1981-01-01
Computer assisted instruction in remote sensing at the University of Oklahoma involves two separate approaches and is dependent upon initial preprocessing of a LANDSAT computer compatible tape using software developed for an IBM 370/158 computer. In-house generated preprocessing algorithms permits students or researchers to select a subset of a LANDSAT scene for subsequent analysis using either general purpose statistical packages or color graphic image processing software developed for Apple II microcomputers. Procedures for preprocessing the data and image analysis using either of the two approaches for low-cost LANDSAT data processing are described.
Kinetic synergistic transitions in the Ostwald ripening processes
NASA Astrophysics Data System (ADS)
Sachkov, I. N.; Turygina, V. F.; Dolganov, A. N.
2018-01-01
There is proposed approach to mathematical description of the kinetic transitions in Ostwald ripening processes of volatile substance in nonuniformly heated porous materials. It is based upon the finite element method. There are implemented computer software. The main feature of the software is to calculate evaporation and condensation fluxes on the walls of a nonuniformly heated cylindrical capillary. Kinetic transitions are detected for three modes of volatile substances migration which are different by condensation zones location. There are controlling dimensionless parameters of the kinetic transition which are revealed during research. There is phase diagram of the Ostwald ripening process modes realization.
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, and software developed, and hardware/software recommendations made during 1992 in development of the PC-based data acquisition system for support of Welding Process Modeling and Control is reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center, for a mobile data aquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data aquisition and analysis. Although the WMS supports many of the functions associated with the process control, it is not the intention for this system to be used for welding process control.
Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model
NASA Astrophysics Data System (ADS)
Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna
2017-06-01
Evaluation of software quality is an important aspect for controlling and managing the software. By such evaluation, improvements in software process can be made. The software quality is significantly dependent on software usability. Many researchers have proposed numbers of usability models. Each model considers a set of usability factors but do not cover all the usability aspects. Practical implementation of these models is still missing, as there is a lack of precise definition of usability. Also, it is very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, which brings together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named as fuzzy hierarchical usability model that can be easily integrated into the current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on the detailed comparison of proposed model with the existing usability models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id
Data processing software packages such as VSOP and MCNPX are softwares that has been scientifically proven and complete. The result of VSOP and MCNPX are huge and complex text files. In the analyze process, user need additional processing like Microsoft Excel to show informative result. This research develop an user interface software for output of VSOP and MCNPX. VSOP program output is used to support neutronic analysis and MCNPX program output is used to support burn-up analysis. Software development using iterative development methods which allow for revision and addition of features according to user needs. Processing time with this softwaremore » 500 times faster than with conventional methods using Microsoft Excel. PYTHON is used as a programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, Amiga, among others. Values that support neutronic analysis are k-eff, burn-up and mass Pu{sup 239} and Pu{sup 241}. Burn-up analysis used the mass inventory values of actinide (Thorium, Plutonium, Neptunium and Uranium). Values are visualized in graphical shape to support analysis.« less
SignalPlant: an open signal processing software platform.
Plesinger, F; Jurco, J; Halamek, J; Jurak, P
2016-07-01
The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant-a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proves significantly faster when displaying an image from a large number of samples (e.g. 163-times faster for 75 × 10(6) samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties ensuring its adaptability to future research tasks and new data formats.
Design of penicillin fermentation process simulation system
NASA Astrophysics Data System (ADS)
Qi, Xiaoyu; Yuan, Zhonghu; Qi, Xiaoxuan; Zhang, Wenqi
2011-10-01
Real-time monitoring for batch process attracts increasing attention. It can ensure safety and provide products with consistent quality. The design of simulation system of batch process fault diagnosis is of great significance. In this paper, penicillin fermentation, a typical non-linear, dynamic, multi-stage batch production process, is taken as the research object. A visual human-machine interactive simulation software system based on Windows operation system is developed. The simulation system can provide an effective platform for the research of batch process fault diagnosis.
Security Risks: Management and Mitigation in the Software Life Cycle
NASA Technical Reports Server (NTRS)
Gilliam, David P.
2004-01-01
A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Sotfware Security Assessment Instrument (SSAI) and integrating it with a Defect Detection and Prevention (DDP) risk management tool.
Study and Analysis of The Robot-Operated Material Processing Systems (ROMPS)
NASA Technical Reports Server (NTRS)
Nguyen, Charles C.
1996-01-01
This is a report presenting the progress of a research grant funded by NASA for work performed during 1 Oct. 1994 - 31 Sep. 1995. The report deals with the development and investigation of potential use of software for data processing for the Robot Operated Material Processing System (ROMPS). It reports on the progress of data processing of calibration samples processed by ROMPS in space and on earth. First data were retrieved using the I/O software and manually processed using MicroSoft Excel. Then the data retrieval and processing process was automated using a program written in C which is able to read the telemetry data and produce plots of time responses of sample temperatures and other desired variables. LabView was also employed to automatically retrieve and process the telemetry data.
NASA Astrophysics Data System (ADS)
Ye, Jinzuo; Chi, Chongwei; Zhang, Shuang; Ma, Xibo; Tian, Jie
2014-02-01
Sentinel lymph node (SLN) in vivo detection is vital in breast cancer surgery. A new near-infrared fluorescence-based surgical navigation system (SNS) imaging software, which has been developed by our research group, is presented for SLN detection surgery in this paper. The software is based on the fluorescence-based surgical navigation hardware system (SNHS) which has been developed in our lab, and is designed specifically for intraoperative imaging and postoperative data analysis. The surgical navigation imaging software consists of the following software modules, which mainly include the control module, the image grabbing module, the real-time display module, the data saving module and the image processing module. And some algorithms have been designed to achieve the performance of the software, for example, the image registration algorithm based on correlation matching. Some of the key features of the software include: setting the control parameters of the SNS; acquiring, display and storing the intraoperative imaging data in real-time automatically; analysis and processing of the saved image data. The developed software has been used to successfully detect the SLNs in 21 cases of breast cancer patients. In the near future, we plan to improve the software performance and it will be extensively used for clinical purpose.
Building Safer Systems With SpecTRM
NASA Technical Reports Server (NTRS)
2003-01-01
System safety, an integral component in software development, often poses a challenge to engineers designing computer-based systems. While the relaxed constraints on software design allow for increased power and flexibility, this flexibility introduces more possibilities for error. As a result, system engineers must identify the design constraints necessary to maintain safety and ensure that the system and software design enforces them. Safeware Engineering Corporation, of Seattle, Washington, provides the information, tools, and techniques to accomplish this task with its Specification Tools and Requirements Methodology (SpecTRM). NASA assisted in developing this engineering toolset by awarding the company several Small Business Innovation Research (SBIR) contracts with Ames Research Center and Langley Research Center. The technology benefits NASA through its applications for Space Station rendezvous and docking. SpecTRM aids system and software engineers in developing specifications for large, complex safety critical systems. The product enables engineers to find errors early in development so that they can be fixed with the lowest cost and impact on the system design. SpecTRM traces both the requirements and design rationale (including safety constraints) throughout the system design and documentation, allowing engineers to build required system properties into the design from the beginning, rather than emphasizing assessment at the end of the development process when changes are limited and costly.System safety, an integral component in software development, often poses a challenge to engineers designing computer-based systems. While the relaxed constraints on software design allow for increased power and flexibility, this flexibility introduces more possibilities for error. As a result, system engineers must identify the design constraints necessary to maintain safety and ensure that the system and software design enforces them. Safeware Engineering Corporation, of Seattle, Washington, provides the information, tools, and techniques to accomplish this task with its Specification Tools and Requirements Methodology (SpecTRM). NASA assisted in developing this engineering toolset by awarding the company several Small Business Innovation Research (SBIR) contracts with Ames Research Center and Langley Research Center. The technology benefits NASA through its applications for Space Station rendezvous and docking. SpecTRM aids system and software engineers in developing specifications for large, complex safety critical systems. The product enables engineers to find errors early in development so that they can be fixed with the lowest cost and impact on the system design. SpecTRM traces both the requirements and design rationale (including safety constraints) throughout the system design and documentation, allowing engineers to build required system properties into the design from the beginning, rather than emphasizing assessment at the end of the development process when changes are limited and costly.
NASA Astrophysics Data System (ADS)
Warren, M. A.; Goult, S.; Clewley, D.
2018-06-01
Advances in technology allow remotely sensed data to be acquired with increasingly higher spatial and spectral resolutions. These data may then be used to influence government decision making and solve a number of research and application driven questions. However, such large volumes of data can be difficult to handle on a single personal computer or on older machines with slower components. Often the software required to process data is varied and can be highly technical and too advanced for the novice user to fully understand. This paper describes an open-source tool, the Simple Concurrent Online Processing System (SCOPS), which forms part of an airborne hyperspectral data processing chain that allows users accessing the tool over a web interface to submit jobs and process data remotely. It is demonstrated using Natural Environment Research Council Airborne Research Facility (NERC-ARF) instruments together with other free- and open-source tools to take radiometrically corrected data from sensor geometry into geocorrected form and to generate simple or complex band ratio products. The final processed data products are acquired via an HTTP download. SCOPS can cut data processing times and introduce complex processing software to novice users by distributing jobs across a network using a simple to use web interface.
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
A Research Program on Artificial Intelligence in Process Engineering.
ERIC Educational Resources Information Center
Stephanopoulos, George
1986-01-01
Discusses the use of artificial intelligence systems in process engineering. Describes a new program at the Massachusetts Institute of Technology which attempts to advance process engineering through technological advances in the areas of artificial intelligence and computers. Identifies the program's hardware facilities, software support,…
A software pipeline for prediction of allele-specific alternative RNA processing events using single RNA-seq data. The current version focuses on prediction of alternative splicing and alternative polyadenylation modulated by genetic variants.
Cedar Project---Original goals and progress to date
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cybenko, G.; Kuck, D.; Padua, D.
1990-11-28
This work encompasses a broad attack on high speed parallel processing. Hardware, software, applications development, and performance evaluation and visualization as well as research topics are proposed. Our goal is to develop practical parallel processing for the 1990's.
How to choose the right statistical software?—a method increasing the post-purchase satisfaction
2015-01-01
Nowadays, we live in the “data era” where the use of statistical or data analysis software is inevitable, in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay the right attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice still depends on few factors like, for instance, researcher’s personal inclination, e.g., which software have been used at the university or is already known. This is not wrong in principle, but in some cases it’s not enough at all and might lead to a “dead end” situation, typically after months or years of investments already done on the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and introduce an extended range of factors which can help to undertake the right choice, at least in potential. There is not enough literature about that topic, most of the time underestimated, both in the traditional literature and even in the so called “gray literature”, even if some documents or short pages can be found online. Anyhow, it seems there is not a common and known standpoint about the process of software evaluation from the final user perspective. We suggests a multi-factor analysis leading to an evaluation matrix tool, to be intended as a flexible and customizable tool, aimed to provide a clearer picture of the software alternatives available, not in abstract but related to the researcher’s own context and needs. This method is a result of about twenty years of experience of the author in the field of evaluating and using technical-computing software and partially arises from a research made about such topics as part of a project funded by European Commission under the Lifelong Learning Programme 2011. PMID:26793368
Rattanatamrong, Prapaporn; Matsunaga, Andrea; Raiturkar, Pooja; Mesa, Diego; Zhao, Ming; Mahmoudi, Babak; Digiovanna, Jack; Principe, Jose; Figueiredo, Renato; Sanchez, Justin; Fortes, Jose
2010-01-01
The CyberWorkstation (CW) is an advanced cyber-infrastructure for Brain-Machine Interface (BMI) research. It allows the development, configuration and execution of BMI computational models using high-performance computing resources. The CW's concept is implemented using a software structure in which an "experiment engine" is used to coordinate all software modules needed to capture, communicate and process brain signals and motor-control commands. A generic BMI-model template, which specifies a common interface to the CW's experiment engine, and a common communication protocol enable easy addition, removal or replacement of models without disrupting system operation. This paper reviews the essential components of the CW and shows how templates can facilitate the processes of BMI model development, testing and incorporation into the CW. It also discusses the ongoing work towards making this process infrastructure independent.
Towards understanding software: 15 years in the SEL
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose
1990-01-01
For 15 years, the Software Engineering Laboratory (SEL) at GSFC has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software, and software processes within a production software environment. The SEL comprises three major organizations: (1) the GSFC Flight Dynamics Division; (2) the University of Maryland Computer Science Department; and (3) the Computer Sciences Corporation Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents: all describing some aspect of the software engineering technology that has undergone analysis in the flight dynamics environment. The studies range from small controlled experiments (such as analyzing the effectiveness of code reading versus functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The key findings that NASA feels have laid the foundation for ongoing and future software development and research activities are summarized.
The software and algorithms for hyperspectral data processing
NASA Astrophysics Data System (ADS)
Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid
2017-04-01
Hyperspectral remote sensing technique is widely used for collecting and processing -information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in Windows XP/7/8/8.1/10 environment on any personal computer. This complex has been has been written in C++ language using QT framework and OpenGL for graphical data visualization. The software has flexible structure that consists of a set of independent plugins. Each plugin was compiled as Qt Plugin and represents Windows Dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing The software has various in-built functions for statistical and mathematical analysis, signal processing functions like direct smoothing function for moving average, Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two author's engineering techniques for the solution of atmospheric correction problem: iteration method of refinement of spectral albedo's parameters using Libradtran and analytical least square method. The main advantages of these methods are high rate of processing (several minutes for 1 GB data) and low relative error in albedo retrieval (less than 15%). Also, the software supports work with spectral libraries, region of interest (ROI) selection, spectral analysis such as cluster-type image classification and automatic hypercube spectrum comparison by similarity criterion with similar ones from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages should be noted: fast and low memory hypercube manipulation features, user-friendly interface, modularity, and expandability.
ERIC Educational Resources Information Center
Hung, Wei-Chen; Smith, Thomas J.; Harris, Marian S.; Lockard, James
2010-01-01
This study adopted design and development research methodology (Richey & Klein, "Design and development research: Methods, strategies, and issues," 2007) to systematically investigate the process of applying instructional design principles, human-computer interaction, and software engineering to a performance support system (PSS) for behavior…
Research on Speech Perception. Progress Report No. 4, January 1977-September 1978.
ERIC Educational Resources Information Center
Pisoni, David B.; And Others
Summarizing research activities from January 1977 to September 1978, this is the fourth annual report of research on speech processing conducted in the Department of Psychology at Indiana University. The report includes extended manuscripts, short reports, progress reports, and information on instrumentation developments and software support. The…
Automating the parallel processing of fluid and structural dynamics calculations
NASA Technical Reports Server (NTRS)
Arpasi, Dale J.; Cole, Gary L.
1987-01-01
The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations, that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.
ERIC Educational Resources Information Center
Johnson, Maggie; Senges, Max
2010-01-01
Purpose: This paper seeks to analyse the effectiveness and impact of how Google currently trains its new software engineers ("Nooglers") to become productive in the software engineering community. The research focuses on the institutions and support for practice-based learning and cognitive apprenticeship in the Google environment.…
ERIC Educational Resources Information Center
Gagne, Phill; Furlow, Carolyn; Ross, Terris
2009-01-01
In item response theory (IRT) simulation research, it is often necessary to use one software package for data generation and a second software package to conduct the IRT analysis. Because this can substantially slow down the simulation process, it is sometimes offered as a justification for using very few replications. This article provides…
McDonald, Sandra A; Velasco, Elizabeth; Ilasi, Nicholas T
2010-12-01
Pfizer, Inc.'s Tissue Bank, in conjunction with Pfizer's BioBank (biofluid repository), endeavored to create an overarching internal software package to cover all general functions of both research facilities, including sample receipt, reconciliation, processing, storage, and ordering. Business process flow diagrams were developed by the Tissue Bank and Informatics teams as a way of characterizing best practices both within the Bank and in its interactions with key internal and external stakeholders. Besides serving as a first step for the software development, such formalized process maps greatly assisted the identification and communication of best practices and the optimization of current procedures. The diagrams shared here could assist other biospecimen research repositories (both pharmaceutical and other settings) for comparative purposes or as a guide to successful informatics design. Therefore, it is recommended that biorepositories consider establishing formalized business process flow diagrams for their laboratories, to address these objectives of communication and strategy.
GP Workbench Manual: Technical Manual, User's Guide, and Software Guide
Oden, Charles P.; Moulton, Craig W.
2006-01-01
GP Workbench is an open-source general-purpose geophysical data processing software package written primarily for ground penetrating radar (GPR) data. It also includes support for several USGS prototype electromagnetic instruments such as the VETEM and ALLTEM. The two main programs in the package are GP Workbench and GP Wave Utilities. GP Workbench has routines for filtering, gridding, and migrating GPR data; as well as an inversion routine for characterizing UXO (unexploded ordinance) using ALLTEM data. GP Workbench provides two-dimensional (section view) and three-dimensional (plan view or time slice view) processing for GPR data. GP Workbench can produce high-quality graphics for reports when Surfer 8 or higher (Golden Software) is installed. GP Wave Utilities provides a wide range of processing algorithms for single waveforms, such as filtering, correlation, deconvolution, and calculating GPR waveforms. GP Wave Utilities is used primarily for calibrating radar systems and processing individual traces. Both programs also contain research features related to the calibration of GPR systems and calculating subsurface waveforms. The software is written to run on the Windows operating systems. GP Workbench can import GPR data file formats used by major commercial instrument manufacturers including Sensors and Software, GSSI, and Mala. The GP Workbench native file format is SU (Seismic Unix), and subsequently, files generated by GP Workbench can be read by Seismic Unix as well as many other data processing packages.
Information Retrieval Research and ESPRIT.
ERIC Educational Resources Information Center
Smeaton, Alan F.
1987-01-01
Describes the European Strategic Programme of Research and Development in Information Technology (ESPRIT), and its five programs: advanced microelectronics, software technology, advanced information processing, office systems, and computer integrated manufacturing. The emphasis on logic programming and ESPRIT as the European response to the…
Software forecasting as it is really done: A study of JPL software engineers
NASA Technical Reports Server (NTRS)
Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.
1993-01-01
This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineer's forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed clustering of activities, that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple-decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.
ERIC Educational Resources Information Center
Gatlin, Rebecca; And Others
Research indicates that people tend to use only five percent of the capabilities available in word processing software. The major objective of this study was to determine to what extent word processing was used by businesses, what competencies were required by those businesses, and how those competencies were being learned in Mid-South states. A…
New Decision Tool To Evaluate Award Selection Process.
ERIC Educational Resources Information Center
Thornley, Richard; Spence, Matthew W.; Taylor, Mark; Magnan, Jacques
2002-01-01
Describes an Alberta Heritage Foundation for Medical Research initiative to enhance the review process for its training awards using a new tool based on the ProGrid decision-assist software. Implementation resulted in several modifications to the review process in the areas of definition, rationality, fairness, timeliness, and responsiveness; the…
Runtime Verification in Context : Can Optimizing Error Detection Improve Fault Diagnosis
NASA Technical Reports Server (NTRS)
Dwyer, Matthew B.; Purandare, Rahul; Person, Suzette
2010-01-01
Runtime verification has primarily been developed and evaluated as a means of enriching the software testing process. While many researchers have pointed to its potential applicability in online approaches to software fault tolerance, there has been a dearth of work exploring the details of how that might be accomplished. In this paper, we describe how a component-oriented approach to software health management exposes the connections between program execution, error detection, fault diagnosis, and recovery. We identify both research challenges and opportunities in exploiting those connections. Specifically, we describe how recent approaches to reducing the overhead of runtime monitoring aimed at error detection might be adapted to reduce the overhead and improve the effectiveness of fault diagnosis.
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
This final report on computational methods and software systems for dynamics and control of large space structures covers progress to date, projected developments in the final months of the grant, and conclusions. Pertinent reports and papers that have not appeared in scientific journals (or have not yet appeared in final form) are enclosed. The grant has supported research in two key areas of crucial importance to the computer-based simulation of large space structure. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area, as reported here, involves massively parallel computers.
Shared-resource computing for small research labs.
Ackerman, M J
1982-04-01
A real time laboratory computer network is described. This network is composed of four real-time laboratory minicomputers located in each of four division laboratories and a larger minicomputer in a centrally located computer room. Off the shelf hardware and software were used with no customization. The network is configured for resource sharing using DECnet communications software and the RSX-11-M multi-user real-time operating system. The cost effectiveness of the shared resource network and multiple real-time processing using priority scheduling is discussed. Examples of utilization within a medical research department are given.
Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument
NASA Astrophysics Data System (ADS)
DuPlain, Ron; Ransom, Scott; Demorest, Paul; Brandt, Patrick; Ford, John; Shelton, Amy L.
2008-08-01
The National Radio Astronomy Observatory (NRAO) is launching the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), a prototype flexible digital signal processor designed for pulsar observations with the Robert C. Byrd Green Bank Telescope (GBT). GUPPI uses field programmable gate array (FPGA) hardware and design tools developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. The NRAO has been concurrently developing GUPPI software and hardware using minimal software resources. The software handles instrument monitor and control, data acquisition, and hardware interfacing. GUPPI is currently an expert-only spectrometer, but supports future integration with the full GBT production system. The NRAO was able to take advantage of the unique flexibility of the CASPER FPGA hardware platform, develop hardware and software in parallel, and build a suite of software tools for monitoring, controlling, and acquiring data with a new instrument over a short timeline of just a few months. The NRAO interacts regularly with CASPER and its users, and GUPPI stands as an example of what reconfigurable computing and open-source development can do for radio astronomy. GUPPI is modular for portability, and the NRAO provides the results of development as an open-source resource.
Practical research on the teaching of Optical Design
NASA Astrophysics Data System (ADS)
Fan, Changjiang; Ren, Zhijun; Ying, Chaofu; Peng, Baojin
2017-08-01
Optical design, together with applied optics, forms a complete system from basic theory to application theory, and it plays a very important role in professional education. In order to improve senior undergraduates' understanding of optical design, this course is divided into three parts: theoretical knowledge, software design and product processing. Through learning theoretical knowledge, students can master the aberration theory and the design principles of typical optical system. By using ZEMAX(an imaging design software), TRACEPRO(a lighting optical design software), SOLIDWORKS or PROE( mechanical design software), student can establish a complete model of optical system. Student can use carving machine located in lab or cooperative units to process the model. Through the above three parts, student can learn necessary practical knowledge and get improved in their learning and analysis abilities, thus they can also get enough practice to prompt their creative abilities, then they could gradually change from scientific theory learners to an Optics Engineers.
j5 DNA assembly design automation.
Hillson, Nathan J
2014-01-01
Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers construct DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software integrated with j5 add significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk
2014-10-20
Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the study of Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
Computer generated animation and movie production at LARC: A case study
NASA Technical Reports Server (NTRS)
Gates, R. L.; Matthews, C. G.; Vonofenheim, W. H.; Randall, D. P.; Jones, K. H.
1984-01-01
The process of producing computer generated 16mm movies using the MOVIE.BYU software package developed by Brigham Young University and the currently available hardware technology at the Langley Research Center is described. A general overview relates the procedures to a specific application. Details are provided which describe the data used, preparation of a storyboard, key frame generation, the actual animation, title generation, filming, and processing/developing the final product. Problems encountered in each of these areas are identified. Both hardware and software problems are discussed along with proposed solutions and recommendations.
Software Development in the Water Sciences: a view from the divide (Invited)
NASA Astrophysics Data System (ADS)
Miles, B.; Band, L. E.
2013-12-01
While training in statistical methods is an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.
Yang, Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu, Yu; Goddu, S Murty; Mutic, Sasa; Deasy, Joseph O; Low, Daniel A
2011-01-01
Recent years have witnessed tremendous progress in image guide radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated for adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms to provide consistent displacement vector field in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a good amount of options for DIR results visualization, evaluation, and validation. By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research. 0 2011 Ameri-
Numerical research of the optimal control problem in the semi-Markov inventory model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gorshenin, Andrey K.; Belousov, Vasily V.; Shnourkoff, Peter V.
2015-03-10
This paper is devoted to the numerical simulation of stochastic system for inventory management products using controlled semi-Markov process. The results of a special software for the system’s research and finding the optimal control are presented.
NASA Astrophysics Data System (ADS)
Dunn, Michael
2008-10-01
For over 30 years, the Oak Ridge National Laboratory (ORNL) has performed research and development to provide more accurate nuclear cross-section data in the resonance region. The ORNL Nuclear Data (ND) Program consists of four complementary areas of research: (1) cross-section measurements at the Oak Ridge Electron Linear Accelerator; (2) resonance analysis methods development with the SAMMY R-matrix analysis software; (3) cross-section evaluation development; and (4) cross-section processing methods development with the AMPX software system. The ND Program is tightly coupled with nuclear fuel cycle analyses and radiation transport methods development efforts at ORNL. Thus, nuclear data work is performed in concert with nuclear science and technology needs and requirements. Recent advances in each component of the ORNL ND Program have led to improvements in resonance region measurements, R-matrix analyses, cross-section evaluations, and processing capabilities that directly support radiation transport research and development. Of particular importance are the improvements in cross-section covariance data evaluation and processing capabilities. The benefit of these advances to nuclear science and technology research and development will be discussed during the symposium on Nuclear Physics Research Connections to Nuclear Energy.
Georadar and geoelectricity method to identify the determine zone of sliding landslide
NASA Astrophysics Data System (ADS)
Dalimunthe, Y. K.; Hamid, A.
2018-01-01
The aim of this research is to determine the contrast between the sliding plane by observing the parameters of rock types, fractures, and faults that could potentially land slides in Bandar Baru, Lampung Barat, Indonesia by both methods of georadar and geoelectricity. This research uses radar reflection profiling configuration for georadar and dipole-dipole configuration for geoelectricity. For georadar data processing has been done with Reflexwave software and for geoelectricity, data processing has been done with Earthimager 2DINV software to interpret subsurface section. Results of research by both methods of georadar and geoelectricity shows the area of contact between the sand stone with resistivity value of 200-1449 Ωm and clay stone with a resistivity value of 32-100 Ωm at the limit depth of 9 m as a potential zone of sliding landslides where the physical properties of clay stone easily derail massive material on it.
Munoz, Maria Isabel; Bouldi, Nadia; Barcellini, Flore; Nascimento, Adelaide
2012-01-01
This communication deals with the involvement of ergonomists in a research-action design process of a software platform in radiotherapy. The goal of the design project is to enhance patient safety by designing a workflow software that supports cooperation between professionals producing treatment in radiotherapy. The general framework of our approach is the ergonomics management of a design process, which is based in activity analysis and grounded in participatory design. Two fields are concerned by the present action: a design environment which is a participatory design process that involves software designers, caregivers as future users and ergonomists; and a reference real work setting in radiotherapy. Observations, semi-structured interviews and participatory workshops allow the characterization of activity in radiotherapy dealing with uses of cooperative tools, sources of variability and non-ruled strategies to manage the variability of the situations. This production of knowledge about work searches to enhance the articulation between technocentric and anthropocentric approaches, and helps in clarifying design requirements. An issue of this research-action is to develop a framework to define the parameters of the workflow tool, and the conditions of its deployment.
NASA Astrophysics Data System (ADS)
Weber, Walter H.; Mair, H. Douglas; Jansen, Dion
2003-03-01
A suite of basic signal processors has been developed. These basic building blocks can be cascaded together to form more complex processors without the need for programming. The data structures between each of the processors are handled automatically. This allows a processor built for one purpose to be applied to any type of data such as images, waveform arrays and single values. The processors are part of Winspect Data Acquisition software. The new processors are fast enough to work on A-scan signals live while scanning. Their primary use is to extract features, reduce noise or to calculate material properties. The cascaded processors work equally well on live A-scan displays, live gated data or as a post-processing engine on saved data. Researchers are able to call their own MATLAB or C-code from anywhere within the processor structure. A built-in formula node processor that uses a simple algebraic editor may make external user programs unnecessary. This paper also discusses the problems associated with ad hoc software development and how graphical programming languages can tie up researchers writing software rather than designing experiments.
Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa
2016-01-01
Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT-purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software, and improve use outcomes. The ISO/IEC25000 standard is shown as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.
ERIC Educational Resources Information Center
Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.
Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts (1) that number crunching is usually carried out using software that was developed before information technology existed, and (2) that the educational research is to a great extent trapped…
Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper demonstrates a case study on simulation, modelling and analysis for X1 Motorcycles Model. In this research, a motorcycle assembly plant has been selected as a main place of research study. Simulation techniques by using Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find out the significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line was identified. All components are evaluated to validate whether the data is significance for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. Average total cycle time for given workstations is used as criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.
Carlton, Connor D; Mitchell, Samantha; Lewis, Patrick
2018-01-01
Over the past decade, Structure from Motion (SfM) has increasingly been used as a means of digital preservation and for documenting archaeological excavations, architecture, and cultural material. However, few studies have tapped the potential of using SfM to document and analyze taphonomic processes affecting burials for forensic sciences purposes. This project utilizes SfM models to elucidate specific post-depositional events that affected a series of three human cadavers deposited at the South East Texas Applied Forensic Science Facility (STAFS). The aim of this research was to test the ability for untrained researchers to employ spatial software and photogrammetry for data collection purposes. For a series of three months a single lens reflex (SLR) camera was used to capture a series of overlapping images at periodic stages in the decomposition process of each cadaver. These images are processed through photogrammetric software that creates a 3D model that can be measured, manipulated, and viewed. This project used photogrammetric and geospatial software to map changes in decomposition and movement of the body from original deposition points. Project results indicate SfM and GIS as a useful tool for documenting decomposition and taphonomic processes. Results indicate photogrammetry is an efficient, relatively simple, and affordable tool for the documentation of decomposition. Copyright © 2017 Elsevier B.V. All rights reserved.
Undergraduate Research Opportunities in OSS
NASA Astrophysics Data System (ADS)
Boldyreff, Cornelia; Capiluppi, Andrea; Knowles, Thomas; Munro, James
Using Open Source Software (OSS) in undergraduate teaching in universities is now commonplace. Students use OSS applications and systems in their courses on programming, operating systems, DBMS, web development to name but a few. Studying OSS projects from both a product and a process view also forms part of the software engineering curriculum at various universities. Many students have taken part in OSS projects as well as developers.
Computational methods and software systems for dynamics and control of large space structures
NASA Technical Reports Server (NTRS)
Park, K. C.; Felippa, C. A.; Farhat, C.; Pramono, E.
1990-01-01
Two key areas of crucial importance to the computer-based simulation of large space structures are discussed. The first area involves multibody dynamics (MBD) of flexible space structures, with applications directed to deployment, construction, and maneuvering. The second area deals with advanced software systems, with emphasis on parallel processing. The latest research thrust in the second area involves massively parallel computers.
Moore, Sarah; Kailasapathy, Kasipathy; Phillips, Michael; Jones, Mark R
2015-07-01
Microencapsulation is proposed to protect probiotic strains from food processing procedures and to maintain probiotic viability. Little research has described the in situ viability of microencapsulated probiotics. This study successfully developed a real-time viability standard curve for microencapsulated bacteria using confocal microscopy, fluorescent dyes and image analysis software. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Fryling, Meg
2010-01-01
Enterprise Research Planning (ERP) software is advertised as the product that will "run the enterprise", improving data access and accuracy as well as enhancing business process efficiency. Unfortunately, organizations often make implementation decisions with little consideration for the maintenance phase of an ERP, resulting in significant…
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
Unified Software Solution for Efficient SPR Data Analysis in Drug Research
Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan
2016-01-01
Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to immediately quality control the results. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754
Simple and Efficient Technique for Spatial/Temporal Composite Imagery
2007-08-01
visible spectrum between 412nm and 869nm, three bands at 500m and two bands at 250m. The MODIS data was processed using the Automated Processing System2...Version 3.6 developed by the Naval Research Labo- ratory (NRL). The Automated Processing System (APS) is a collection of software programs assembled
Process Improvement in a Radically Changing Organization
NASA Technical Reports Server (NTRS)
Varga, Denise M.; Wilson, Barbara M.
2007-01-01
This presentation describes how the NASA Glenn Research Center planned and implemented a process improvement effort in response to a radically changing environment. As a result of a presidential decision to redefine the Agency's mission, many ongoing projects were canceled and future workload would be awarded based on relevance to the Exploration Initiative. NASA imposed a new Procedural Requirements standard on all future software development, and the Center needed to redesign its processes from CMM Level 2 objectives to meet the new standard and position itself for CMMI. The intended audience for this presentation is systems/software developers and managers in a large, research-oriented organization that may need to respond to imposed standards while also pursuing CMMI Maturity Level goals. A set of internally developed tools will be presented, including an overall Process Improvement Action Item database, a formal inspection/peer review tool, metrics collection spreadsheet, and other related technologies. The Center also found a need to charter Technical Working Groups (TWGs) to address particular Process Areas. In addition, a Marketing TWG was needed to communicate the process changes to the development community, including an innovative web site portal.
Issues in Defining Software Architectures in a GIS Environment
NASA Technical Reports Server (NTRS)
Acosta, Jesus; Alvorado, Lori
1997-01-01
The primary mission of the Pan-American Center for Earth and Environmental Studies (PACES) is to advance the research areas that are relevant to NASA's Mission to Planet Earth program. One of the activities at PACES is the establishment of a repository for geographical, geological and environmental information that covers various regions of Mexico and the southwest region of the U.S. and that is acquired from NASA and other sources through remote sensing, ground studies or paper-based maps. The center will be providing access of this information to other government entities in the U.S. and Mexico, and research groups from universities, national laboratories and industry. Geographical Information Systems(GIS) provide the means to manage, manipulate, analyze and display geographically referenced information that will be managed by PACES. Excellent off-the-shelf software exists for a complete GIS as well as software for storing and managing spatial databases, processing images, networking and viewing maps with layered information. This allows the user flexibility in combining systems to create a GIS or to mix these software packages with custom-built application programs. Software architectural languages provide the ability to specify the computational components and interactions among these components, an important topic in the domain of GIS because of the need to integrate numerous software packages. This paper discusses the characteristics that architectural languages address with respect to the issues relating to the data that must be communicated between software systems and components when systems interact. The paper presents a background on GIS in section 2. Section 3 gives an overview of software architecture and architectural languages. Section 4 suggests issues that may be of concern when defining the software architecture of a GIS. The last section discusses the future research effort and finishes with a summary.
Incorporating BDI Agents into Human-Agent Decision Making Research
NASA Astrophysics Data System (ADS)
Kamphorst, Bart; van Wissen, Arlette; Dignum, Virginia
Artificial agents, people, institutes and societies all have the ability to make decisions. Decision making as a research area therefore involves a broad spectrum of sciences, ranging from Artificial Intelligence to economics to psychology. The Colored Trails (CT) framework is designed to aid researchers in all fields in examining decision making processes. It is developed both to study interaction between multiple actors (humans or software agents) in a dynamic environment, and to study and model the decision making of these actors. However, agents in the current implementation of CT lack the explanatory power to help understand the reasoning processes involved in decision making. The BDI paradigm that has been proposed in the agent research area to describe rational agents, enables the specification of agents that reason in abstract concepts such as beliefs, goals, plans and events. In this paper, we present CTAPL: an extension to CT that allows BDI software agents that are written in the practical agent programming language 2APL to reason about and interact with a CT environment.
Process and information integration via hypermedia
NASA Technical Reports Server (NTRS)
Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.
1990-01-01
Success stories for advanced automation prototypes abound in the literature but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.
Open Source Clinical NLP – More than Any Single System
Masanz, James; Pakhomov, Serguei V.; Xu, Hua; Wu, Stephen T.; Chute, Christopher G.; Liu, Hongfang
2014-01-01
The number of Natural Language Processing (NLP) tools and systems for processing clinical free-text has grown as interest and processing capability have surged. Unfortunately any two systems typically cannot simply interoperate, even when both are built upon a framework designed to facilitate the creation of pluggable components. We present two ongoing activities promoting open source clinical NLP. The Open Health Natural Language Processing (OHNLP) Consortium was originally founded to foster a collaborative community around clinical NLP, releasing UIMA-based open source software. OHNLP’s mission currently includes maintaining a catalog of clinical NLP software and providing interfaces to simplify the interaction of NLP systems. Meanwhile, Apache cTAKES aims to integrate best-of-breed annotators, providing a world-class NLP system for accessing clinical information within free-text. These two activities are complementary. OHNLP promotes open source clinical NLP activities in the research community and Apache cTAKES bridges research to the health information technology (HIT) practice. PMID:25954581
TRACC: An open source software for processing sap flux data from thermal dissipation probes
Ward, Eric J.; Domec, Jean-Christophe; King, John; ...
2017-05-02
Here, thermal dissipation probes (TDPs) have become a widely used method of monitoring plant water use in recent years. The use of TDPs requires calibration to a theoretical zero-flow value (ΔT0); usually based upon the assumption that at least some nighttime measurements represent zero-flow conditions. Fully automating the processing of data from TDPs is made exceedingly difficult due to errors arising from many sources. However, it is desirable to minimize variation arising from different researchers’ processing data, and thus, a common platform for processing data, including editing raw data and determination of ΔT0, is useful and increases the transparency andmore » replicability of TDP-based research. Here, we present the TDP data processing software TRACC (Thermal dissipation Review Assessment Cleaning and Conversion) to serve this purpose. TRACC is an open-source software written in the language R, using graphical presentation of data and on screen prompts with yes/no or simple numerical responses. It allows the user to select several important options, such as calibration coefficients and the exclusion of nights when vapor pressure deficit does not approach zero. Although it is designed for users with no coding experience, the outputs of TRACC could be easily incorporated into more complex models or software.« less
TRACC: An open source software for processing sap flux data from thermal dissipation probes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Eric J.; Domec, Jean-Christophe; King, John
Here, thermal dissipation probes (TDPs) have become a widely used method of monitoring plant water use in recent years. The use of TDPs requires calibration to a theoretical zero-flow value (ΔT0); usually based upon the assumption that at least some nighttime measurements represent zero-flow conditions. Fully automating the processing of data from TDPs is made exceedingly difficult due to errors arising from many sources. However, it is desirable to minimize variation arising from different researchers’ processing data, and thus, a common platform for processing data, including editing raw data and determination of ΔT0, is useful and increases the transparency andmore » replicability of TDP-based research. Here, we present the TDP data processing software TRACC (Thermal dissipation Review Assessment Cleaning and Conversion) to serve this purpose. TRACC is an open-source software written in the language R, using graphical presentation of data and on screen prompts with yes/no or simple numerical responses. It allows the user to select several important options, such as calibration coefficients and the exclusion of nights when vapor pressure deficit does not approach zero. Although it is designed for users with no coding experience, the outputs of TRACC could be easily incorporated into more complex models or software.« less
Adding Processing Functionality to the Sensor Web
NASA Astrophysics Data System (ADS)
Stasch, Christoph; Pross, Benjamin; Jirka, Simon; Gräler, Benedikt
2017-04-01
The Sensor Web allows discovering, accessing and tasking different kinds of environmental sensors in the Web, ranging from simple in-situ sensors to remote sensing systems. However, (geo-)processing functionality needs to be applied to integrate data from different sensor sources and to generate higher level information products. Yet, a common standardized approach for processing sensor data in the Sensor Web is still missing and the integration differs from application to application. Standardizing not only the provision of sensor data, but also the processing facilitates sharing and re-use of processing modules, enables reproducibility of processing results, and provides a common way to integrate external scalable processing facilities or legacy software. In this presentation, we provide an overview on on-going research projects that develop concepts for coupling standardized geoprocessing technologies with Sensor Web technologies. At first, different architectures for coupling sensor data services with geoprocessing services are presented. Afterwards, profiles for linear regression and spatio-temporal interpolation of the OGC Web Processing Services that allow consuming sensor data coming from and uploading predictions to Sensor Observation Services are introduced. The profiles are implemented in processing services for the hydrological domain. Finally, we illustrate how the R software can be coupled with existing OGC Sensor Web and Geoprocessing Services and present an example, how a Web app can be built that allows exploring the results of environmental models in an interactive way using the R Shiny framework. All of the software presented is available as Open Source Software.
Software and hardware infrastructure for research in electrophysiology
Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Řondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Štěbeták, Jan
2014-01-01
As in other areas of experimental science, operation of electrophysiological laboratory, design and performance of electrophysiological experiments, collection, storage and sharing of experimental data and metadata, analysis and interpretation of these data, and publication of results are time consuming activities. If these activities are well organized and supported by a suitable infrastructure, work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has been also designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the whole architectural concept the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. Current data publication policy used in the portal is briefly introduced. The registration of the portal within Neuroscience Information Framework is described. Then the methods used for processing of electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researches are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storing and a hardware stimulator communicating with an EEG amplifier and recording software. PMID:24639646
Software and hardware infrastructure for research in electrophysiology.
Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Rondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Stěbeták, Jan
2014-01-01
As in other areas of experimental science, operation of electrophysiological laboratory, design and performance of electrophysiological experiments, collection, storage and sharing of experimental data and metadata, analysis and interpretation of these data, and publication of results are time consuming activities. If these activities are well organized and supported by a suitable infrastructure, work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has been also designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the whole architectural concept the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. Current data publication policy used in the portal is briefly introduced. The registration of the portal within Neuroscience Information Framework is described. Then the methods used for processing of electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researches are summarized; the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storing and a hardware stimulator communicating with an EEG amplifier and recording software.
A UML-based metamodel for software evolution process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing
2014-04-01
A software evolution process is a set of interrelated software processes under which the corresponding software is evolving. An object-oriented software evolution process meta-model (OO-EPMM), abstract syntax and formal OCL constraint of meta-model are presented in this paper. OO-EPMM can not only represent software development process, but also represent software evolution.
The New York City Subways: The First Ten Years. A Library Research Exercise Using a Computer.
ERIC Educational Resources Information Center
Machalow, Robert
This document presents a library research exercise developed at York College which uses the Apple IIe microcomputer and word processing software--the Applewriter--to teach library research skills. Unlike some other library research exercises on disk, this program allows the student to decide on alternative approaches to solving the given problem:…
SlicerRT: radiation therapy research toolkit for 3D Slicer.
Pinter, Csaba; Lasso, Andras; Wang, An; Jaffray, David; Fichtinger, Gabor
2012-10-01
Interest in adaptive radiation therapy research is constantly growing, but software tools available for researchers are mostly either expensive, closed proprietary applications, or free open-source packages with limited scope, extensibility, reliability, or user support. To address these limitations, we propose SlicerRT, a customizable, free, and open-source radiation therapy research toolkit. SlicerRT aspires to be an open-source toolkit for RT research, providing fast computations, convenient workflows for researchers, and a general image-guided therapy infrastructure to assist clinical translation of experimental therapeutic approaches. It is a medium into which RT researchers can integrate their methods and algorithms, and conduct comparative testing. SlicerRT was implemented as an extension for the widely used 3D Slicer medical image visualization and analysis application platform. SlicerRT provides functionality specifically designed for radiation therapy research, in addition to the powerful tools that 3D Slicer offers for visualization, registration, segmentation, and data management. The feature set of SlicerRT was defined through consensus discussions with a large pool of RT researchers, including both radiation oncologists and medical physicists. The development processes used were similar to those of 3D Slicer to ensure software quality. Standardized mechanisms of 3D Slicer were applied for documentation, distribution, and user support. The testing and validation environment was configured to automatically launch a regression test upon each software change and to perform comparison with ground truth results provided by other RT applications. Modules have been created for importing and loading DICOM-RT data, computing and displaying dose volume histograms, creating accumulated dose volumes, comparing dose volumes, and visualizing isodose lines and surfaces. The effectiveness of using 3D Slicer with the proposed SlicerRT extension for radiation therapy research was demonstrated on multiple use cases. A new open-source software toolkit has been developed for radiation therapy research. SlicerRT can import treatment plans from various sources into 3D Slicer for visualization, analysis, comparison, and processing. The provided algorithms are extensively tested and they are accessible through a convenient graphical user interface as well as a flexible application programming interface.
NASA Astrophysics Data System (ADS)
Wang, Qiang
2017-09-01
As an important part of software engineering, the software process decides the success or failure of software product. The design and development feature of security software process is discussed, so is the necessity and the present significance of using such process. Coordinating the function software, the process for security software and its testing are deeply discussed. The process includes requirement analysis, design, coding, debug and testing, submission and maintenance. In each process, the paper proposed the subprocesses to support software security. As an example, the paper introduces the above process into the power information platform.
Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.
2012-01-01
Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616
Software for keratometry measurements using portable devices
NASA Astrophysics Data System (ADS)
Iyomasa, C. M.; Ventura, L.; De Groote, J. J.
2010-02-01
In this work we present an image processing software for automatic astigmatism measurements developed for a hand held keratometer. The system projects 36 light spots, from LEDs, displayed in a precise circle at the lachrymal film of the examined cornea. The displacement, the size and deformation of the reflected image of these light spots are analyzed providing the keratometry. The purpose of this research is to develop a software that performs fast and precise calculations in mainstream mobile devices. In another words, a software that can be implemented in portable computer systems, which could be of low cost and easy to handle. This project allows portability for keratometers and is a previous work for a portable corneal topographer.
NASA Astrophysics Data System (ADS)
de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.
2014-10-01
Petroleum reservoir engineering is a complex and interesting field that requires large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking care out of possible interactions and extensibilities required by reservoir engineers. In this paper, we present a research work which it is characterized by the design and implementation based on a software product line model for a real distributed reservoir engineering environment. Experimental results indicate successfully the utilization of this approach for the design of distributed software architecture. In addition, all components from the proposal provided greater visibility of the organization and processes for the reservoir engineers.
A new software on TUG-T60 autonomous telescope for astronomical transient events
NASA Astrophysics Data System (ADS)
Dindar, Murat; Helhel, Selçuk; Esenoğlu, Hasan; Parmaksızoğlu, Murat
2015-03-01
Robotic telescopes usually run under the control of a scheduler, which provides high-level control by selecting astronomical targets for observation. TÜBİTAK (Scientific and Technological Research Council of Turkey) National Observatory (TUG)-T60 Robotic Telescope is controlled by open-source OCAAS software, formally named Talon. This study introduces new software which was designed for Talon to catch GRB, GAIA and transient alerts. The new GRB software module (daemon process) alertd is running with all other modules of Talon such as telescoped; focus, dome; camerad and telrun. Maximum slew velocity and acceleration limits of the T60 telescope are enough fast for the GRB and transient observations.
MATHEMATICAL METHODS IN MEDICAL IMAGE PROCESSING
ANGENENT, SIGURD; PICHON, ERIC; TANNENBAUM, ALLEN
2013-01-01
In this paper, we describe some central mathematical problems in medical imaging. The subject has been undergoing rapid changes driven by better hardware and software. Much of the software is based on novel methods utilizing geometric partial differential equations in conjunction with standard signal/image processing techniques as well as computer graphics facilitating man/machine interactions. As part of this enterprise, researchers have been trying to base biomedical engineering principles on rigorous mathematical foundations for the development of software methods to be integrated into complete therapy delivery systems. These systems support the more effective delivery of many image-guided procedures such as radiation therapy, biopsy, and minimally invasive surgery. We will show how mathematics may impact some of the main problems in this area, including image enhancement, registration, and segmentation. PMID:23645963
Discovering objects in a blood recipient information system.
Qiu, D; Junghans, G; Marquardt, K; Kroll, H; Mueller-Eckhardt, C; Dudeck, J
1995-01-01
Application of object-oriented (OO) methodologies has been generally considered as a solution to the problem of improving the software development process and managing the so-called software crisis. Among them, object-oriented analysis (OOA) is the most essential and is a vital prerequisite for the successful use of other OO methodologies. Though there are already a good deal of OOA methods published, the most important aspect common to all these methods: discovering objects classes truly relevant to the given problem domain, has remained a subject to be intensively researched. In this paper, using the successful development of a blood recipient information system as an example, we present our approach which is based on the conceptual framework of responsibility-driven OOA. In the discussion, we also suggest that it may be inadequate to simply attribute the software crisis to the waterfall model of the software development life-cycle. We are convinced that the real causes for the failure of some software and information systems should be sought in the methodologies used in some crucial phases of the software development process. Furthermore, a software system can also fail if object classes essential to the problem domain are not discovered, implemented and visualized, so that the real-world situation cannot be faithfully traced by it.
12 CFR 559.4 - What activities are preapproved for service corporations?
Code of Federal Regulations, 2011 CFR
2011-01-01
... personnel; (12) Research studies and surveys; (13) Software development and systems integration; and (14..., marketing research and other marketing; (3) Clerical; (4) Consulting; (5) Courier; (6) Data processing; (7... housing, small farms, or businesses that are local in character; (2) Investments designed primarily to...
12 CFR 559.4 - What activities are preapproved for service corporations?
Code of Federal Regulations, 2014 CFR
2014-01-01
... personnel; (12) Research studies and surveys; (13) Software development and systems integration; and (14..., marketing research and other marketing; (3) Clerical; (4) Consulting; (5) Courier; (6) Data processing; (7... housing, small farms, or businesses that are local in character; (2) Investments designed primarily to...
12 CFR 559.4 - What activities are preapproved for service corporations?
Code of Federal Regulations, 2010 CFR
2010-01-01
... personnel; (12) Research studies and surveys; (13) Software development and systems integration; and (14..., marketing research and other marketing; (3) Clerical; (4) Consulting; (5) Courier; (6) Data processing; (7... housing, small farms, or businesses that are local in character; (2) Investments designed primarily to...
12 CFR 559.4 - What activities are preapproved for service corporations?
Code of Federal Regulations, 2013 CFR
2013-01-01
... personnel; (12) Research studies and surveys; (13) Software development and systems integration; and (14..., marketing research and other marketing; (3) Clerical; (4) Consulting; (5) Courier; (6) Data processing; (7... housing, small farms, or businesses that are local in character; (2) Investments designed primarily to...
12 CFR 559.4 - What activities are preapproved for service corporations?
Code of Federal Regulations, 2012 CFR
2012-01-01
... personnel; (12) Research studies and surveys; (13) Software development and systems integration; and (14..., marketing research and other marketing; (3) Clerical; (4) Consulting; (5) Courier; (6) Data processing; (7... housing, small farms, or businesses that are local in character; (2) Investments designed primarily to...
Promoting Science Software Best Practices: A Scientist's Perspective (Invited)
NASA Astrophysics Data System (ADS)
Blanton, B. O.
2013-12-01
Software is at the core of most modern scientific activities, and as societal awareness of, and impacts from, extreme weather, disasters, and climate and global change continue to increase, the roles that scientific software play in analyses and decision-making are brought more to the forefront. Reproducibility of research results (particularly those that enter into the decision-making arena) and open access to the software is essential for scientific and scientists' credibility. This has been highlighted in a recent article by Joppa et al (Troubling Trends in Scientific Software Use, Science Magazine, May 2013) that describes reasons for particular software being chosen by scientists, including that the "developer is well-respected" and on "recommendation from a close colleague". This reliance on recommendation, Joppa et al conclude, is fraught with risks to both sciences and scientists. Scientists must frequently take software for granted, assuming that it performs as expected and advertised and that the software itself has been validated and results verified. This is largely due to the manner in which much software is written and developed; in an ad hoc manner, with an inconsistent funding stream, and with little application of core software engineering best practices. Insufficient documentation, limited test cases, and code unavailability are significant barriers to informed and intelligent science software usage. This situation is exacerbated when the scientist becomes the software developer out of necessity due to resource constraints. Adoption of, and adherence to, best practices in scientific software development will substantially increase intelligent software usage and promote a sustainable evolution of the science as encoded in the software. We describe a typical scientist's perspective on using and developing scientific software in the context of storm surge research and forecasting applications that have real-time objectives and regulatory constraints. This include perspectives on what scientists/users of software can contribute back to the software development process and examples of successful scientist/developer interactions, and the competition between "getting it done" and "getting it done right".
NASA Technical Reports Server (NTRS)
Martinez, Debbie; Davidson, Paul C.; Kenney, P. Sean; Hutchinson, Brian K.
2004-01-01
The Flight Simulation and Software Branch (FSSB) at NASA Langley Research Center (LaRC) maintains the unique national asset identified as the Transport Research Facility (TRF). The TRF is a group of facilities and integration laboratories utilized to support the LaRC's simulation-to-flight concept. This concept incorporates common software, hardware, and processes for both groundbased flight simulators and LaRC s B-757-200 flying laboratory identified as the Airborne Research Integrated Experiments System (ARIES). These assets provide Government, industry, and academia with an efficient way to develop and test new technology concepts to enhance the capacity, safety, and operational needs of the ever-changing national airspace system. The integration of the TRF enables a smooth continuous flow of the research from simulation to actual flight test.
Beyond Our Boundaries: Research and Technology
NASA Technical Reports Server (NTRS)
1996-01-01
Topics considered include: Propulsion and Fluid Management; Structures and Dynamics; Materials and Manufacturing Processes; Sensor Technology; Software Technology; Optical Systems; Microgravity Science; Earth System Science; Astrophysics; Solar Physics; and Technology Transfer.
The Future is Hera: Analyzing Astronomical Data Over the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Snowden, S.; Chai, P.; Shafer, R.
2009-01-01
Hera is the new data processing facility provided by the HEASARC at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the preinstalled software packages, local disk space, and computing resources needed to do general processing of FITS format data files residing on the user's local computer, and to do advanced research using the publicly available data from High Energy Astrophysics missions. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.
NASA Astrophysics Data System (ADS)
Preradović, D. M.; Mićić, Lj S.; Barz, C.
2017-05-01
Production conditions in today’s world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards such as usual metrics of quality, companies today are focused on other optional standards, such as CMMI (Capability Maturity Model Integrated) or prescribing they own standards. However, while there is intensive progress being made in the PM (project management), there is still a significant number of projects, at the global level, that are failures. These have failed to achieve their goals, within budget or timeframe. This paper focuses on checking the role of software tools through the rate of success in projects implemented in the case of internationally manufactured electrical equipment. The results of this research show the level of contribution of the project management software used to manage and develop new products to improve PM processes and PM functions, and how selection of the software tools affects the quality of PM processes and successfully completed projects.
The Future is Hera! Analyzing Astronomical Over the Internet
NASA Technical Reports Server (NTRS)
Valencic, L. A.; Chai, P.; Pence, W.; Shafer, R.; Snowden, S.
2008-01-01
Hera is the data processing facility provided by the High Energy Astrophysics Science Archive Research Center (HEASARC) at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the pre-installed software packages, local disk space, and computing resources need to do general processing of FITS format data files residing on the users local computer, and to do research using the publicly available data from the High ENergy Astrophysics Division. Qualified students, educators and researchers may freely use the Hera services over the internet of research and educational purposes.
A Survey of Usability Practices in Free/Libre/Open Source Software
NASA Astrophysics Data System (ADS)
Paul, Celeste Lyn
A review of case studies about usability in eight Free/Libre/Open Source Software (FLOSS) projects showed that an important issue regarding a usability initiative in the project was the lack of user research. User research is a key component in the user-centered design (UCD) process and a necessary step for creating usable products. Reasons why FLOSS projects suffered from a lack of user research included poor or unclear project leadership, cultural differences between developer and designers, and a lack of usability engineers. By identifying these critical issues, the FLOSS usability community can begin addressing problems in the efficacy of usability activities and work towards creating more usable FLOSS products.
Bringing macromolecular machinery to life using 3D animation.
Iwasa, Janet H
2015-04-01
Over the past decade, there has been a rapid rise in the use of three-dimensional (3D) animation to depict molecular and cellular processes. Much of the growth in molecular animation has been in the educational arena, but increasingly, 3D animation software is finding its way into research laboratories. In this review, I will discuss a number of ways in which 3d animation software can play a valuable role in visualizing and communicating macromolecular structures and dynamics. I will also consider the challenges of using animation tools within the research sphere. Copyright © 2015. Published by Elsevier Ltd.
Upper Atmosphere Research Satellite (UARS) science data processing center implementation history
NASA Technical Reports Server (NTRS)
Herring, Ellen L.; Taylor, K. David
1990-01-01
NASA-Goddard is responsible for the development of a ground system for the Upper Atmosphere Research Satellite (UARS) observatory, whose launch is scheduled for 1991. This ground system encompasses a dedicated Central Data Handling Facility (CDHF); attention is presently given to the management of software systems design and implementation phases for CDHF by the UARS organization. Also noted are integration and testing activities performed following software deliveries to the CDHF. The UARS project has an obvious requirement for a powerful and flexible data base management system; an off-the-shelf commercial system has been incorporated.
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 6
2006-06-01
improvement methods. The total volume of projects studied now exceeds 12,000. Software Productivity Research, LLC Phone: (877) 570-5459 (973) 273-5829...While performing quality con- sulting, Olson has helped organizations measurably improve quality and productivity , save millions of dollars in costs of...This article draws parallels between the outrageous events on the Jerry Springer Show and problems faced by process improvement programs. by Paul
Unified Engineering Software System
NASA Technical Reports Server (NTRS)
Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.
1989-01-01
Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.
ERIC Educational Resources Information Center
Campillo, Cristina; Herrera, Gerardo; Remírez de Ganuza, Conchi; Cuesta, José L.; Abellán, Raquel; Campos, Arturo; Navarro, Ignacio; Sevilla, Javier; Pardo, Carlos; Amati, Fabián
2014-01-01
Deficits in the perception of time and processing of changes across time are commonly observed in individuals with autism. This pilot study evaluated the efficacy of the use of the software tool Tic-Tac, designed to make time visual, in three adults with autism and learning difficulties. This research focused on applying the tool in waiting…
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge based system methods of implementation. The most suitable process for prototyping using rule based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge based system requirements. A quick prototype knowledge based system environment is researched and developed.
The Western Aeronautical Test Range. Chapter 10 Tools
NASA Technical Reports Server (NTRS)
Knudtson, Kevin; Park, Alice; Downing, Robert; Sheldon, Jack; Harvey, Robert; Norcross, April
2011-01-01
The Western Aeronautical Test Range (WATR) staff at the NASA Dryden Flight Research Center is developing a translation software called Chapter 10 Tools in response to challenges posed by post-flight processing data files originating from various on-board digital recorders that follow the Range Commanders Council Inter-Range Instrumentation Group (IRIG) 106 Chapter 10 Digital Recording Standard but use differing interpretations of the Standard. The software will read the date files regardless of the vendor implementation of the source recorder, displaying data, identifying and correcting errors, and producing a data file that can be successfully processed post-flight
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development. They can help developers increase their productivity and efficiency and also avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to Chang Gung Medical Research Project (CMRP) in the Nuclear medicine field. This application was intended to integrate user requirements, system design and testing of software development processes into three layers (Domain, Concept and Instance) model. Then, expressing in structural System Modeling Language (SysML) diagrams and converts part of the manual effort necessary for project management maintenance into computational effort, for example: (semi-) automatic delivery of traceability management. In this application, it supports establishing artifacts of "requirement specification document", "project execution plan document", "system design document" and "system test document", and can deliver a prototype of lightweight project management tool on the Nuclear Medicine software project. The results of this application can be a reference for other medical institutions in developing medical information systems and support of project management to achieve the aim of patient safety.
NASA Technical Reports Server (NTRS)
Buckner, J. D.; Council, H. W.; Edwards, T. R.
1974-01-01
Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.
Updates in metabolomics tools and resources: 2014-2015.
Misra, Biswapriya B; van der Hooft, Justin J J
2016-01-01
Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platforms (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely-available, and open-source, tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and researches for data processing, data annotation, and data visualization in MS and NMR-based metabolomics. Most in this review described tools are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described including their analytical and computational platform dependencies are summarized in an overview Table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Clinical research in a hospital--from the lone rider to teamwork.
Hannisdal, E
1996-01-01
Clinical research of high international standard is very demanding and requires clinical data of high quality, software, hardware and competence in research design and statistical treatment of data. Most busy clinicians have little time allocated for clinical research and this increases the need for a potent infrastructure. This paper describes how the Norwegian Radium Hospital, a specialized cancer hospital, has reorganized the clinical research process. This includes a new department, the Clinical Research Office, which serves the formal framework, a central Diagnosis Registry, clinical databases and multicentre studies. The department assists about 120 users, mainly clinicians. Installation of a network software package with over 10 programs has strongly provided an internal standardization, reduced the costs and saved clinicians a great deal of time. The hospital is building up about 40 diagnosis-specific clinical databases with up to 200 variables registered. These databases are shared by the treatment group and seem to be important tools for quality assurance. We conclude that the clinical research process benefits from a firm infrastructure facilitating teamwork through extensive use of modern information technology. We are now ready for the next phase, which is to work for a better external technical framework for cooperation with other institutions throughout the world.
SenseMyHeart: A cloud service and API for wearable heart monitors.
Pinto Silva, P M; Silva Cunha, J P
2015-01-01
In the era of ubiquitous computing, the growing adoption of wearable systems and body sensor networks is trailing the path for new research and software for cardiovascular intensity, energy expenditure and stress and fatigue detection through cardiovascular monitoring. Several systems have received clinical-certification and provide huge amounts of reliable heart-related data in a continuous basis. PhysioNet provides equally reliable open-source software tools for ECG processing and analysis that can be combined with these devices. However, this software remains difficult to use in a mobile environment and for researchers unfamiliar with Linux-based systems. In the present paper we present an approach that aims at tackling these limitations by developing a cloud service that provides an API for a PhysioNet-based pipeline for ECG processing and Heart Rate Variability measurement. We describe the proposed solution, along with its advantages and tradeoffs. We also present some client tools (windows and Android) and several projects where the developed cloud service has been used successfully as a standard for Heart Rate and Heart Rate Variability studies in different scenarios.
Data management in clinical research: Synthesizing stakeholder perspectives.
Johnson, Stephen B; Farach, Frank J; Pelphrey, Kevin; Rozenblit, Leon
2016-04-01
This study assesses data management needs in clinical research from the perspectives of researchers, software analysts and developers. This is a mixed-methods study that employs sublanguage analysis in an innovative manner to link the assessments. We performed content analysis using sublanguage theory on transcribed interviews conducted with researchers at four universities. A business analyst independently extracted potential software features from the transcriptions, which were translated into the sublanguage. This common sublanguage was then used to create survey questions for researchers, analysts and developers about the desirability and difficulty of features. Results were synthesized using the common sublanguage to compare stakeholder perceptions with the original content analysis. Individual researchers exhibited significant diversity of perspectives that did not correlate by role or site. Researchers had mixed feelings about their technologies, and sought improvements in integration, interoperability and interaction as well as engaging with study participants. Researchers and analysts agreed that data integration has higher desirability and mobile technology has lower desirability but disagreed on the desirability of data validation rules. Developers agreed that data integration and validation are the most difficult to implement. Researchers perceive tasks related to study execution, analysis and quality control as highly strategic, in contrast with tactical tasks related to data manipulation. Researchers have only partial technologic support for analysis and quality control, and poor support for study execution. Software for data integration and validation appears critical to support clinical research, but may be expensive to implement. Features to support study workflow, collaboration and engagement have been underappreciated, but may prove to be easy successes. Software developers should consider the strategic goals of researchers with regard to the overall coordination of research projects and teams, workflow connecting data collection with analysis and processes for improving data quality. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Wild, Christian; Eckhardt, Dave
1987-01-01
The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtably involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.
TCGA-assembler 2: software pipeline for retrieval and processing of TCGA/CPTAC data.
Wei, Lin; Jin, Zhilin; Yang, Shengjie; Xu, Yanxun; Zhu, Yitan; Ji, Yuan
2018-05-01
The Cancer Genome Atlas (TCGA) program has produced huge amounts of cancer genomics data providing unprecedented opportunities for research. In 2014, we developed TCGA-Assembler, a software pipeline for retrieval and processing of public TCGA data. In 2016, TCGA data were transferred from the TCGA data portal to the Genomic Data Commons (GDCs), which is supported by a different set of data storage and retrieval mechanisms. In addition, new proteomics data of TCGA samples have been generated by the Clinical Proteomic Tumor Analysis Consortium (CPTAC) program, which were not available for downloading through TCGA-Assembler. It is desirable to acquire and integrate data from both GDC and CPTAC. We develop TCGA-assembler 2 (TA2) to automatically download and integrate data from GDC and CPTAC. We make substantial improvement on the functionality of TA2 to enhance user experience and software performance. TA2 together with its previous version have helped more than 2000 researchers from 64 countries to access and utilize TCGA and CPTAC data in their research. Availability of TA2 will continue to allow existing and new users to conduct reproducible research based on TCGA and CPTAC data. http://www.compgenome.org/TCGA-Assembler/ or https://github.com/compgenome365/TCGA-Assembler-2. zhuyitan@gmail.com or koaeraser@gmail.com. Supplementary data are available at Bioinformatics online.
NASA Astrophysics Data System (ADS)
Wilcox, T.
2016-12-01
How quickly can students (and educators) get started using a "ready to fly" UAS and popular publicly available photogrammetric mapping software for student research at the undergraduate level? This poster presentation focuses on the challenges of starting up your own drone-mapping program for undergraduate research in a compressed timescale of three months. Particular focus will be given to learning the operation of the platforms, hardware and software interface challenges, and using these electronic systems in real-world field settings that pose a range of physical challenges to both operators and equipment. We will be using a combination of the popular DJI Phantom UAS and Pix4D mapping software to investigate mass wasting processes and potential hazards present in public lands popular with recreational users. Projects are aimed at characterizing active geological hazards that operate on short timescales and may include gully headwall erosion in Flaming Geyser State Park and potential landslide instability within Capital State Forest, both in the Puget Sound region of Washington State.
NASA Technical Reports Server (NTRS)
Fitz, Rhonda; Whitman, Gerek
2016-01-01
Research into complexities of software systems Fault Management (FM) and how architectural design decisions affect safety, preservation of assets, and maintenance of desired system functionality has coalesced into a technical reference (TR) suite that advances the provision of safety and mission assurance. The NASA Independent Verification and Validation (IVV) Program, with Software Assurance Research Program support, extracted FM architectures across the IVV portfolio to evaluate robustness, assess visibility for validation and test, and define software assurance methods applied to the architectures and designs. This investigation spanned IVV projects with seven different primary developers, a wide range of sizes and complexities, and encompassed Deep Space Robotic, Human Spaceflight, and Earth Orbiter mission FM architectures. The initiative continues with an expansion of the TR suite to include Launch Vehicles, adding the benefit of investigating differences intrinsic to model-based FM architectures and insight into complexities of FM within an Agile software development environment, in order to improve awareness of how nontraditional processes affect FM architectural design and system health management.
Oak Ridge Computerized Hierarchical Information System (ORCHIS) status report, July 1973
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, A.A.
1974-01-01
This report summarizes the concepts, software, and contents of the Oak Ridge Computerized Hierarchical Information System. This data analysis and text processing system was developed as an integrated, comprehensive information processing capability to meet the needs of an on-going multidisciplinary research and development organization. (auth)
Small but Pristine--Lessons for Small Library Automation.
ERIC Educational Resources Information Center
Clement, Russell; Robertson, Dane
1990-01-01
Compares the more positive library automation experiences of a small public library with those of a large research library. Topics addressed include collection size; computer size and the need for outside control of a data processing center; staff size; selection process for hardware and software; and accountability. (LRW)
Software With Strong Ties to Space
NASA Technical Reports Server (NTRS)
2003-01-01
TieFlow is a simple but powerful business process improvement solution. It can automate and simplify any generic or industry-specific work process, helping organizations to transform work inefficiencies and internal operations involving people, paper, and procedures into a streamlined, well-organized, electronicbased process. TieFlow increases business productivity by improving process cycle times. The software can expedite generic processes in the areas of product design and development, purchase orders, expense reports, benefits enrollment, budgeting, hiring, and sales. It can also shore up vertical market processes such as claims processing, loan application and processing, health care administration, contract management, and advertising agency traffic. The processes can be easily and rapidly captured in a graphical manner and enforced together with rules pertaining to assignments that need to be performed. Aside from boosting productivity, TieFlow also reduces organizational costs and errors. TieFlow was developed with Small Business Innovation Research (SBIR) assistance from Johnson. The SBIR support entitles all Federal Government agencies to utilize the TieFlow software technology free of charge. Tietronix emphasizes that TieFlow is an outstanding workflow resource that could produce dramatic productivity and cost improvements for all agencies, just as it has done and continues to do for NASA. The Space Agency is currently using the software throughout several mission-critical offices, including the Mission Operations Directorate and the Flight Director s Office, for worldwide participation of authorized users in NASA processes. At the Flight Director s Office, TieFlow allows personnel to electronically submit and review changes to the flight rules carried out during missions.
NASA Technical Reports Server (NTRS)
Perusich, Stephen; Moos, Thomas; Muscatello, Anthony
2011-01-01
This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not efficient for absorbing longwavelength infrared radiation and therefore will lose more heat to the environment compared to glass. The LGA unit uses a transparent polymer antechamber that surrounds part of the greenhouse and encases the SGGs, thereby minimizing infrared losses through the plastic windows. With ambient temperatures at the lunar poles at 50 C, the LGA should provide a substantial enhancement to currently conceived lunar greenhouses. Positive results obtained from this project could lead to a future large-scale system capable of running autonomously on the Moon, Mars, and beyond. The software for both applications needs to run the entire units and all subprocesses; however, throughout testing, many variables and parameters need to be changed as more is learned about the system operation. The software provides the versatility to permit the software operation to change as the user requirements evolve.
Configuration Management of an Optimization Application in a Research Environment
NASA Technical Reports Server (NTRS)
Townsend, James C.; Salas, Andrea O.; Schuler, M. Patricia
1999-01-01
Multidisciplinary design optimization (MDO) research aims to increase interdisciplinary communication and reduce design cycle time by combining system analyses (simulations) with design space search and decision making. The High Performance Computing and Communication Program's current High Speed Civil Transport application, HSCT4.0, at NASA Langley Research Center involves a highly complex analysis process with high-fidelity analyses that are more realistic than previous efforts at the Center. The multidisciplinary processes have been integrated to form a distributed application by using the Java language and Common Object Request Broker Architecture (CORBA) software techniques. HSCT4.0 is a research project in which both the application problem and the implementation strategy have evolved as the MDO and integration issues became better understood. Whereas earlier versions of the application and integrated system were developed with a simple, manual software configuration management (SCM) process, it was evident that this larger project required a more formal SCM procedure. This report briefly describes the HSCT4.0 analysis and its CORBA implementation and then discusses some SCM concepts and their application to this project. In anticipation that SCM will prove beneficial for other large research projects, the report concludes with some lessons learned in overcoming SCM implementation problems for HSCT4.0.
A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems
NASA Astrophysics Data System (ADS)
Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.
2017-05-01
Adopting the most appropriate technology for developing applications on an integrated software system for enterprises, may result in great savings both in cost and hours of work. This paper proposes a research study for the determination of a hierarchy between three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro -WD, Floorplan Manager - FPM and CRM WebClient UI - CRM WCUI are multi-criteria evaluated in terms of the obtained performances through the implementation of the same web business application. To establish the hierarchy a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software. This software is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the technologies hierarchy.
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed the use of an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs a multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this new software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization of this specific software design tool. Predicted performance is also compared to results from flight.
Software Defined Cyberinfrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foster, Ian; Blaiszik, Ben; Chard, Kyle
Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policiesmore » by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches.We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.« less
The Trial Software version for DEMETER power spectrum files visualization and mapping
NASA Astrophysics Data System (ADS)
Lozbin, Anatoliy; Inchin, Alexander; Shpadi, Maxim
2010-05-01
In the frame of Kazakhstan's Scientific Space System creation for earthquakes precursors research, the hardware and software of DEMETER satellite was investigated. The data processing Software of DEMETER is based on package SWAN under IDL Virtual machine and realizes many features, but we can't find an important tool for the spectrograms analysis - space-time visualization of power spectrum files from electromagnetic devices as ICE and IMSC. For elimination of this problem we have developed Software which is offered to use. The DeSS (DEMETER Spectrogram Software) - it is Software for visualization, analysis and a mapping of power spectrum data from electromagnetic devices ICE and IMSC. The Software primary goal is to give the researcher friendly tool for the analysis of electromagnetic data from DEMETER Satellite for earthquake precursors and other ionosphere events researches. The Input data for DeSS Software is a power spectrum files: - Power spectrum of 1 component of the electric field in the VLF range (APID 1132); - Power spectrum of 1 component of the electric field in the HF range (APID 1134); - Power spectrum of 1 component of the magnetic field in the VLF range (APID 1137). The main features and operations of the software is possible: - various time and frequency filtration; - visualization of time dependence of signal intensity on fixed frequency; - spectral density visualization for fixed frequency range; - spectrogram autosize and smooth spectrogram; - the information in each point of the spectrogram: time, frequency and intensity; - the spectrum information in the separate window, consisting of 4 blocks; - data mapping with 6 range scale. On the map we can browse next information: - satellite orbit; - conjugate point at the satellite altitude; - north conjugate point at the altitude 110 km; - south conjugate point at the altitude 110 km. This is only trial software version to help the researchers and we always ready collaborate with scientists for software improvement. References: 1. D.Lagoutte, J.Y. Brochot, D. de Carvalho, L.Madrias and M. Parrot. DEMETER Microsatellite. Scientific Mission Center. Data product description. DMT-SP-9-CM-6054-LPC. 2. D.Lagoutte, J.Y. Brochot, P.Latremoliere. SWAN - Software for Waveform Analysis. LPCE/NI/003.E - Part 1 (User's guide), Part 2 (Analysis tools), Part 3 (User's project interface).
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
nal Aeronautics and Space Administration (NASA) have tens of thousands of networked computer systems and applications. Software Security vulnerabilities present risks such as lost or corrupted data, information the3, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Trough an Integrated Approach '' offers, among its capabilities, formal verification of software security properties, through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provide automation of the mechanical portions of the analysis process. This paper will discuss: The need for formal analysis to assure software systems with respect to software and why testing alone cannot provide it. The means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task. An example of FMF style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of key success factors for quality software. Not only can the appropriate establishment but also the continuous improvement of integrated project management and of the software development process result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes and a software development maturity model advocating software process improvement. Besides, a prototype tool to support the framework is introduced.
The ImageJ ecosystem: an open platform for biomedical image analysis
Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large–but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368
The ImageJ ecosystem: An open platform for biomedical image analysis.
Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. © 2015 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
1998-01-01
Under SBIR (Small Business Innovative Research) contracts with Lewis Research Center, Nektonics, Inc., developed coating process simulation tools, known as Nekton. This powerful simulation software is used specifically for the modeling and analysis of a wide range of coating flows including thin film coating analysis, polymer processing, and glass melt flows. Polaroid, Xerox, 3M, Dow Corning, Mead Paper, BASF, Mitsubishi, Chugai, and Dupont Imaging Systems are only a few of the companies that presently use Nekton.
Faster Aerodynamic Simulation With Cart3D
NASA Technical Reports Server (NTRS)
2003-01-01
A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.
Ten recommendations for software engineering in research.
Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph
2014-01-01
Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.
Visualizer: 3D Gridded Data Visualization Software for Geoscience Education and Research
NASA Astrophysics Data System (ADS)
Harwood, C.; Billen, M. I.; Kreylos, O.; Jadamec, M.; Sumner, D. Y.; Kellogg, L. H.; Hamann, B.
2008-12-01
In both research and education learning is an interactive and iterative process of exploring and analyzing data or model results. However, visualization software often presents challenges on the path to learning because it assumes the user already knows the locations and types of features of interest, instead of enabling flexible and intuitive examination of results. We present examples of research and teaching using the software, Visualizer, specifically designed to create an effective and intuitive environment for interactive, scientific analysis of 3D gridded data. Visualizer runs in a range of 3D virtual reality environments (e.g., GeoWall, ImmersaDesk, or CAVE), but also provides a similar level of real-time interactivity on a desktop computer. When using Visualizer in a 3D-enabled environment, the software allows the user to interact with the data images as real objects, grabbing, rotating or walking around the data to gain insight and perspective. On the desktop, simple features, such as a set of cross-bars marking the plane of the screen, provide extra 3D spatial cues that allow the user to more quickly understand geometric relationships within the data. This platform portability allows the user to more easily integrate research results into classroom demonstrations and exercises, while the interactivity provides an engaging environment for self-directed and inquiry-based learning by students. Visualizer software is freely available for download (www.keckcaves.org) and runs on Mac OSX and Linux platforms.
Developments at the Advanced Design Technologies Testbed
NASA Technical Reports Server (NTRS)
VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.
2003-01-01
A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.
Scanning and Measuring Device for Diagnostic of Barrel Bore
NASA Astrophysics Data System (ADS)
Marvan, Ales; Hajek, Josef; Vana, Jan; Dvorak, Radim; Drahansky, Martin; Jankovych, Robert; Skvarek, Jozef
The article discusses the design, mechanical design, electronics and software for robot diagnosis of barrels with caliber of 120 mm to 155 mm. This diagnostic device is intended primarily for experimental research and verification of appropriate methods and technologies for the diagnosis of the main bore guns. Article also discusses the design of sensors and software, the issue of data processing and image reconstruction obtained by scanning of the surface of the bore.
Electromagnetic Field Effects in Semiconductor Crystal Growth
NASA Technical Reports Server (NTRS)
Dulikravich, George S.
1996-01-01
This proposed two-year research project was to involve development of an analytical model, a numerical algorithm for its integration, and a software for the analysis of a solidification process under the influence of electric and magnetic fields in microgravity. Due to the complexity of the analytical model that was developed and its boundary conditions, only a preliminary version of the numerical algorithm was developed while the development of the software package was not completed.
Blagec, Kathrin; Jungwirth, David; Haluza, Daniela; Samwald, Matthias
2018-01-01
Medical device regulations which aim to ensure safety standards do not only apply to hardware devices but also to standalone medical software, e.g. mobile apps. To explore the effects of these regulations on the development and distribution of medical standalone software. We invited a convenience sample of 130 domain experts to participate in an online survey about the impact of current regulations on the development and distribution of medical standalone software. 21 respondents completed the questionnaire. Participants reported slight positive effects on usability, reliability, and data security of their products, whereas the ability to modify already deployed software and customization by end users were negatively impacted. The additional time and costs needed to go through the regulatory process were perceived as the greatest obstacles in developing and distributing medical software. Further research is needed to compare positive effects on software quality with negative impacts on market access and innovation. Strategies for avoiding over-regulation while still ensuring safety standards need to be devised.
Evolution of International Space Station Program Safety Review Processes and Tools
NASA Technical Reports Server (NTRS)
Ratterman, Christian D.; Green, Collin; Guibert, Matt R.; McCracken, Kristle I.; Sang, Anthony C.; Sharpe, Matthew D.; Tollinger, Irene V.
2013-01-01
The International Space Station Program at NASA is constantly seeking to improve the processes and systems that support safe space operations. To that end, the ISS Program decided to upgrade their Safety and Hazard data systems with 3 goals: make safety and hazard data more accessible; better support the interconnection of different types of safety data; and increase the efficiency (and compliance) of safety-related processes. These goals are accomplished by moving data into a web-based structured data system that includes strong process support and supports integration with other information systems. Along with the data systems, ISS is evolving its submission requirements and safety process requirements to support the improved model. In contrast to existing operations (where paper processes and electronic file repositories are used for safety data management) the web-based solution provides the program with dramatically faster access to records, the ability to search for and reference specific data within records, reduced workload for hazard updates and approval, and process support including digital signatures and controlled record workflow. In addition, integration with other key data systems provides assistance with assessments of flight readiness, more efficient review and approval of operational controls and better tracking of international safety certifications. This approach will also provide new opportunities to streamline the sharing of data with ISS international partners while maintaining compliance with applicable laws and respecting restrictions on proprietary data. One goal of this paper is to outline the approach taken by the ISS Progrm to determine requirements for the new system and to devise a practical and efficient implementation strategy. From conception through implementation, ISS and NASA partners utilized a user-centered software development approach focused on user research and iterative design methods. The user-centered approach used on the new ISS hazard system utilized focused user research and iterative design methods employed by the Human Computer Interaction Group at NASA Ames Research Center. Particularly, the approach emphasized the reduction of workload associated with document and data management activities so more resources can be allocated to the operational use of data in problem solving, safety analysis, and recurrence control. The methods and techniques used to understand existing processes and systems, to recognize opportunities for improvement, and to design and review improvements are described with the intent that similar techniques can be employed elsewhere in safety operations. A second goal of this paper is to provide and overview of the web-based data system implemented by ISS. The software selected for the ISS hazard systemMission Assurance System (MAS)is a NASA-customized vairant of the open source software project Bugzilla. The origin and history of MAS as a NASA software project and the rationale for (and advantages of) using open-source software are documented elsewhere (Green, et al., 2009).
Software Formal Inspections Standard
NASA Technical Reports Server (NTRS)
1993-01-01
This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.
An experimental study of factors affecting the selective inhibition of sintering process
NASA Astrophysics Data System (ADS)
Asiabanpour, Bahram
Selective Inhibition of Sintering (SIS) is a new rapid prototyping method that builds parts in a layer-by-layer fabrication basis. SIS works by joining powder particles through sintering in the part's body, and by sintering inhibition of some selected powder areas. The objective of this research has been to improve the new SIS process, which has been invented at USC. The process improvement is based on statistical design of experiments. To conduct the needed experiments a working machine and related path generator software were needed. The machine and its control software were made available prior to this research. The path generator algorithms and software had to be created. This program should obtain model geometry data from a CAD file and generate an appropriate path file for the printer nozzle. Also, the program should generate a simulation file for path file inspection using virtual prototyping. The activities related to path generator constitute the first part of this research, which has resulted in an efficient path generator. In addition, to reach an acceptable level of accuracy, strength, and surface quality in the fabricated parts, all effective factors in the SIS process should be identified and controlled. Simultaneous analytical and experimental studies were conducted to recognize effective factors and to control the SIS process. Also, it was known that polystyrene was the most appropriate polymer powder and saturated potassium iodide was the most effective inhibitor among the available candidate materials. In addition, statistical tools were applied to improve the desirable properties of the parts fabricated by the SIS process. An investigation of part strength was conducted using the Response Surface Methodology (RSM) and a region of acceptable operating conditions for the part strength was found. Then, through analysis of the experimental results, the impact of the factors on the final part surface quality and dimensional accuracy was modeled. After developing a desirability function model, process operating conditions for maximum desirability were identified. Finally, the desirability model was validated.
BEANS - a software package for distributed Big Data analysis
NASA Astrophysics Data System (ADS)
Hypki, Arkadiusz
2018-07-01
BEANS software is a web-based, easy to install and maintain, new tool to store and analyse in a distributed way a massive amount of data. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of data sets. Its main purpose is to simplify the process of storing, examining, and finding new relations in huge data sets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse, and compare the complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and it is ready to use in any other research field. It can be used as a building block for other open-source software too.
BEANS - a software package for distributed Big Data analysis
NASA Astrophysics Data System (ADS)
Hypki, Arkadiusz
2018-03-01
BEANS software is a web based, easy to install and maintain, new tool to store and analyse in a distributed way a massive amount of data. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to a growing need of the astronomical community to have a versatile tool to store, analyse and compare the complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and it is ready to use in any other research field. It can be used as a building block for other open source software too.
Research on infrared small-target tracking technology under complex background
NASA Astrophysics Data System (ADS)
Liu, Lei; Wang, Xin; Chen, Jilu; Pan, Tao
2012-10-01
In this paper, some basic principles and the implementing flow charts of a series of algorithms for target tracking are described. On the foundation of above works, a moving target tracking software base on the OpenCV is developed by the software developing platform MFC. Three kinds of tracking algorithms are integrated in this software. These two tracking algorithms are Kalman Filter tracking method and Camshift tracking method. In order to explain the software clearly, the framework and the function are described in this paper. At last, the implementing processes and results are analyzed, and those algorithms for tracking targets are evaluated from the two aspects of subjective and objective. This paper is very significant in the application of the infrared target tracking technology.
Jiang, Yu; Guarino, Peter; Ma, Shuangge; Simon, Steve; Mayo, Matthew S; Raghavan, Rama; Gajewski, Byron J
2016-07-22
Subject recruitment for medical research is challenging. Slow patient accrual leads to increased costs and delays in treatment advances. Researchers need reliable tools to manage and predict the accrual rate. The previously developed Bayesian method integrates researchers' experience on former trials and data from an ongoing study, providing a reliable prediction of accrual rate for clinical studies. In this paper, we present a user-friendly graphical user interface program developed in R. A closed-form solution for the total subjects that can be recruited within a fixed time is derived. We also present a built-in Android system using Java for web browsers and mobile devices. Using the accrual software, we re-evaluated the Veteran Affairs Cooperative Studies Program 558- ROBOTICS study. The application of the software in monitoring and management of recruitment is illustrated for different stages of the trial. This developed accrual software provides a more convenient platform for estimation and prediction of the accrual process.
Laireiter, Anton Rupert
2017-01-01
Background In recent years, the assessment of mental disorders has become more and more personalized. Modern advancements such as Internet-enabled mobile phones and increased computing capacity make it possible to tap sources of information that have long been unavailable to mental health practitioners. Objective Software packages that combine algorithm-based treatment planning, process monitoring, and outcome monitoring are scarce. The objective of this study was to assess whether the DynAMo Web application can fill this gap by providing a software solution that can be used by both researchers to conduct state-of-the-art psychotherapy process research and clinicians to plan treatments and monitor psychotherapeutic processes. Methods In this paper, we report on the current state of a Web application that can be used for assessing the temporal structure of mental disorders using information on their temporal and synchronous associations. A treatment planning algorithm automatically interprets the data and delivers priority scores of symptoms to practitioners. The application is also capable of monitoring psychotherapeutic processes during therapy and of monitoring treatment outcomes. This application was developed using the R programming language (R Core Team, Vienna) and the Shiny Web application framework (RStudio, Inc, Boston). It is made entirely from open-source software packages and thus is easily extensible. Results The capabilities of the proposed application are demonstrated. Case illustrations are provided to exemplify its usefulness in clinical practice. Conclusions With the broad availability of Internet-enabled mobile phones and similar devices, collecting data on psychopathology and psychotherapeutic processes has become easier than ever. The proposed application is a valuable tool for capturing, processing, and visualizing these data. The combination of dynamic assessment and process- and outcome monitoring has the potential to improve the efficacy and effectiveness of psychotherapy. PMID:28729233
Workflow-Based Software Development Environment
NASA Technical Reports Server (NTRS)
Izygon, Michel E.
2013-01-01
The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process use to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment
ERIC Educational Resources Information Center
Mohammadi, Hadi
2014-01-01
Use of the Patch Vulnerability Management (PVM) process should be seriously considered for any networked computing system. The PVM process prevents the operating system (OS) and software applications from being attacked due to security vulnerabilities, which lead to system failures and critical data leakage. The purpose of this research is to…
Wind speed vector restoration algorithm
NASA Astrophysics Data System (ADS)
Baranov, Nikolay; Petrov, Gleb; Shiriaev, Ilia
2018-04-01
Impulse wind lidar (IWL) signal processing software developed by JSC «BANS» recovers full wind speed vector by radial projections and provides wind parameters information up to 2 km distance. Increasing accuracy and speed of wind parameters calculation signal processing technics have been studied in this research. Measurements results of IWL and continuous scanning lidar were compared. Also, IWL data processing modeling results have been analyzed.
European Science Notes Information Bulletin Reports on Current European and Middle Eastern Science
1992-01-01
Overcash MATERIALS Research and Development in the Abbey-Polymer Processing and Properties ................... 574 J. Magill Corrosion and Protection Centre...gressi• ely pursuing the development of powerful "* Software Engineering and microprocessors and communication chips. The Information Processing ...differential equations, processing , Europe has a number of fascinating weather forecasting) that are to be developed by a projects in distributed
Occupy Hard Drives: Making your work more valuable by giving it away
NASA Astrophysics Data System (ADS)
Weiner, Benjamin J.
2014-01-01
Astronomy is more than ever reliant on scientist-built software, but our systems of supporting research and giving credit for research work have failed to evolve with this reality. Both the perception of short term advantage, and an artificial distinction between "tools" and "science," lead to software and data remaining proprietary or unpublished. The lack of incentives to build and maintain software leads to both a decay of the software infrastructure, and a potential for growing class inequality, a pundit-technician divide. Top-down efforts to direct the field such as the recent US decadal survey have not adequately addressed this future. I argue that writing, freely releasing, and publishing your software is currently not adequately funded, rewarded, or credited, and that you should do it anyway. Writing your software as if you plan to release it is better for you and for the code. Releasing software can get credit from the rest of the community beyond your circle of collaborators or letter-writers, and it can benefit you and everyone else by making astronomy a better place to work. Building a culture of cooperation will be a more effective approach to reforming the system of credit than waiting for leadership from above or outside, but requires that each of us consciously encourage process, values, and behavior that support such a change.
A software framework for real-time multi-modal detection of microsleeps.
Knopp, Simon J; Bones, Philip J; Weddell, Stephen J; Jones, Richard D
2017-09-01
A software framework is described which was designed to process EEG, video of one eye, and head movement in real time, towards achieving early detection of microsleeps for prevention of fatal accidents, particularly in transport sectors. The framework is based around a pipeline structure with user-replaceable signal processing modules. This structure can encapsulate a wide variety of feature extraction and classification techniques and can be applied to detecting a variety of aspects of cognitive state. Users of the framework can implement signal processing plugins in C++ or Python. The framework also provides a graphical user interface and the ability to save and load data to and from arbitrary file formats. Two small studies are reported which demonstrate the capabilities of the framework in typical applications: monitoring eye closure and detecting simulated microsleeps. While specifically designed for microsleep detection/prediction, the software framework can be just as appropriately applied to (i) other measures of cognitive state and (ii) development of biomedical instruments for multi-modal real-time physiological monitoring and event detection in intensive care, anaesthesiology, cardiology, neurosurgery, etc. The software framework has been made freely available for researchers to use and modify under an open source licence.
New technologies for supporting real-time on-board software development
NASA Astrophysics Data System (ADS)
Kerridge, D.
1995-03-01
The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk assoicated with developing data management software development as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA37150 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly support the process of hard real-time analysis for software targeting the MA31750 processor. With further development, this HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750 based embedded systems can meet the processing needs for the next generation of on-board data management systems.
Electronic and software systems of an automated portable static mass spectrometer
NASA Astrophysics Data System (ADS)
Chichagov, Yu. V.; Bogdanov, A. A.; Lebedev, D. S.; Kogan, V. T.; Tubol'tsev, Yu. V.; Kozlenok, A. V.; Moroshkin, V. S.; Berezina, A. V.
2017-01-01
The electronic systems of a small high-sensitivity static mass spectrometer and software and hardware tools, which allow one to determine trace concentrations of gases and volatile compounds in air and water samples in real time, have been characterized. These systems and tools have been used to set up the device, control the process of measurement, synchronize this process with accompanying measurements, maintain reliable operation of the device, process the obtained results automatically, and visualize and store them. The developed software and hardware tools allow one to conduct continuous measurements for up to 100 h and provide an opportunity for personnel with no special training to perform maintenance on the device. The test results showed that mobile mass spectrometers for geophysical and medical research, which were fitted with these systems, had a determination limit for target compounds as low as several ppb(m) and a mass resolving power (depending on the current task) as high as 250.
Fuzzy Logic Enhanced Digital PIV Processing Software
NASA Technical Reports Server (NTRS)
Wernet, Mark P.
1999-01-01
Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.
Metadata-driven Clinical Data Loading into i2b2 for Clinical and Translational Science Institutes.
Post, Andrew R; Pai, Akshatha K; Willard, Richard; May, Bradley J; West, Andrew C; Agravat, Sanjay; Granite, Stephen J; Winslow, Raimond L; Stephens, David S
2016-01-01
Clinical and Translational Science Award (CTSA) recipients have a need to create research data marts from their clinical data warehouses, through research data networks and the use of i2b2 and SHRINE technologies. These data marts may have different data requirements and representations, thus necessitating separate extract, transform and load (ETL) processes for populating each mart. Maintaining duplicative procedural logic for each ETL process is onerous. We have created an entirely metadata-driven ETL process that can be customized for different data marts through separate configurations, each stored in an extension of i2b2 's ontology database schema. We extended our previously reported and open source Eureka! Clinical Analytics software with this capability. The same software has created i2b2 data marts for several projects, the largest being the nascent Accrual for Clinical Trials (ACT) network, for which it has loaded over 147 million facts about 1.2 million patients.
Metadata-driven Clinical Data Loading into i2b2 for Clinical and Translational Science Institutes
Post, Andrew R.; Pai, Akshatha K.; Willard, Richard; May, Bradley J.; West, Andrew C.; Agravat, Sanjay; Granite, Stephen J.; Winslow, Raimond L.; Stephens, David S.
2016-01-01
Clinical and Translational Science Award (CTSA) recipients have a need to create research data marts from their clinical data warehouses, through research data networks and the use of i2b2 and SHRINE technologies. These data marts may have different data requirements and representations, thus necessitating separate extract, transform and load (ETL) processes for populating each mart. Maintaining duplicative procedural logic for each ETL process is onerous. We have created an entirely metadata-driven ETL process that can be customized for different data marts through separate configurations, each stored in an extension of i2b2 ‘s ontology database schema. We extended our previously reported and open source Eureka! Clinical Analytics software with this capability. The same software has created i2b2 data marts for several projects, the largest being the nascent Accrual for Clinical Trials (ACT) network, for which it has loaded over 147 million facts about 1.2 million patients. PMID:27570667
Four simple recommendations to encourage best practices in research software
Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965
Four simple recommendations to encourage best practices in research software.
Jiménez, Rafael C; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.
NASA Astrophysics Data System (ADS)
Löwe, Peter; Klump, Jens; Robertson, Jesse
2015-04-01
Text mining is commonly employed as a tool in data science to investigate and chart emergent information from corpora of research abstracts, such as the Geophysical Research Abstracts (GRA) published by Copernicus. In this context current standards, such as persistent identifiers like DOI and ORCID, allow us to trace, cite and map links between journal publications, the underlying research data and scientific software. This network can be expressed as a directed graph which enables us to chart networks of cooperation and innovation, thematic foci and the locations of research communities in time and space. However, this approach of data science, focusing on the research process in a self-referential manner, rather than the topical work, is still in a developing stage. Scientific work presented at the EGU General Assembly is often the first step towards new approaches and innovative ideas to the geospatial community. It represents a rich, deep and heterogeneous source of geoscientific thought. This corpus is a significant data source for data science, which has not been analysed on this scale previously. In this work, the corpus of the Geophysical Research Abstracts is used for the first time as a data base for analyses of topical text mining. For this, we used a sturdy and customizable software framework, based on the work of Schmitt et al. [1]. For the analysis we used the High Performance Computing infrastructure of the German Research Centre for Geosciences GFZ in Potsdam, Germany. Here, we report on the first results from the analysis of the continuous spreading the of use of Free and Open Source Software Tools (FOSS) within the EGU communities, mapping the general increase of FOSS-themed GRA articles in the last decade and the developing spatial patterns of involved parties and FOSS topics. References: [1] Schmitt, L. M., Christianson, K.T, Gupta R..: Linguistic Computing with UNIX Tools, in Kao, A., Poteet S.R. (Eds.): Natural Language processing and Text Mining, Springer, 2007. doi:10.1007/978-1-84628-754-1_12.
Software Tools | Office of Cancer Clinical Proteomics Research
The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
Researching the Use of Voice Recognition Writing Software.
ERIC Educational Resources Information Center
Honeycutt, Lee
2003-01-01
Notes that voice recognition technology (VRT) has become accurate and fast enough to be useful in a variety of writing scenarios. Contends that little is known about how this technology might affect writing process or perceptions of silent writing. Explores future use of VRT by examining past research in the technology of dictation. (PM)
2018-01-01
Background Structural and functional brain images are essential imaging modalities for medical experts to study brain anatomy. These images are typically visually inspected by experts. To analyze images without any bias, they must be first converted to numeric values. Many software packages are available to process the images, but they are complex and difficult to use. The software packages are also hardware intensive. The results obtained after processing vary depending on the native operating system used and its associated software libraries; data processed in one system cannot typically be combined with data on another system. Objective The aim of this study was to fulfill the neuroimaging community’s need for a common platform to store, process, explore, and visualize their neuroimaging data and results using Neuroimaging Web Services Interface: a series of processing pipelines designed as a cyber physical system for neuroimaging and clinical data in brain research. Methods Neuroimaging Web Services Interface accepts magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and functional magnetic resonance imaging. These images are processed using existing and custom software packages. The output is then stored as image files, tabulated files, and MySQL tables. The system, made up of a series of interconnected servers, is password-protected and is securely accessible through a Web interface and allows (1) visualization of results and (2) downloading of tabulated data. Results All results were obtained using our processing servers in order to maintain data validity and consistency. The design is responsive and scalable. The processing pipeline started from a FreeSurfer reconstruction of Structural magnetic resonance imaging images. The FreeSurfer and regional standardized uptake value ratio calculations were validated using Alzheimer’s Disease Neuroimaging Initiative input images, and the results were posted at the Laboratory of Neuro Imaging data archive. Notable leading researchers in the field of Alzheimer’s Disease and epilepsy have used the interface to access and process the data and visualize the results. Tabulated results with unique visualization mechanisms help guide more informed diagnosis and expert rating, providing a truly unique multimodal imaging platform that combines magnetic resonance imaging, positron emission tomography, diffusion tensor imaging, and resting state functional magnetic resonance imaging. A quality control component was reinforced through expert visual rating involving at least 2 experts. Conclusions To our knowledge, there is no validated Web-based system offering all the services that Neuroimaging Web Services Interface offers. The intent of Neuroimaging Web Services Interface is to create a tool for clinicians and researchers with keen interest on multimodal neuroimaging. More importantly, Neuroimaging Web Services Interface significantly augments the Alzheimer’s Disease Neuroimaging Initiative data, especially since our data contain a large cohort of Hispanic normal controls and Alzheimer’s Disease patients. The obtained results could be scrutinized visually or through the tabulated forms, informing researchers on subtle changes that characterize the different stages of the disease. PMID:29699962
Villoria, Eduardo M; Lenzi, Antônio R; Soares, Rodrigo V; Souki, Bernardo Q; Sigurdsson, Asgeir; Marques, Alexandre P; Fidel, Sandra R
2017-01-01
To describe the use of open-source software for the post-processing of CBCT imaging for the assessment of periapical lesions development after endodontic treatment. CBCT scans were retrieved from endodontic records of two patients. Three-dimensional virtual models, voxel counting, volumetric measurement (mm 3 ) and mean intensity of the periapical lesion were performed with ITK-SNAP v. 3.0 software. Three-dimensional models of the lesions were aligned and overlapped through the MeshLab software, which performed an automatic recording of the anatomical structures, based on the best fit. Qualitative and quantitative analyses of the changes in lesions size after treatment were performed with the 3DMeshMetric software. The ITK-SNAP v. 3.0 showed the smaller value corresponding to the voxel count and the volume of the lesion segmented in yellow, indicating reduction in volume of the lesion after the treatment. A higher value of the mean intensity of the segmented image in yellow was also observed, which suggested new bone formation. Colour mapping and "point value" tool allowed the visualization of the reduction of periapical lesions in several regions. Researchers and clinicians in the monitoring of endodontic periapical lesions have the opportunity to use open-source software.
Hardware development process for Human Research facility applications
NASA Astrophysics Data System (ADS)
Bauer, Liz
2000-01-01
The simple goal of the Human Research Facility (HRF) is to conduct human research experiments on the International Space Station (ISS) astronauts during long-duration missions. This is accomplished by providing integration and operation of the necessary hardware and software capabilities. A typical hardware development flow consists of five stages: functional inputs and requirements definition, market research, design life cycle through hardware delivery, crew training, and mission support. The purpose of this presentation is to guide the audience through the early hardware development process: requirement definition through selecting a development path. Specific HRF equipment is used to illustrate the hardware development paths. .
Hera: High Energy Astronomical Data Analysis via the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Chai, P.; Pence, W.; Snowden, S.
2011-09-01
The HEASARC at NASA Goddard Space Flight Center has developed Hera, a data processing facility for analyzing high energy astronomical data over the internet. Hera provides all the software packages, disk space, and computing resources needed to do general processing of and advanced research on publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. This service is provided for free to students, educators, and researchers for educational and research purposes.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system on the research computer system is presented. The MASS data base management and analysis system was implemented on the research computer system which provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The overall tasks performed concerning the software, data base management and display capabilities of the research computer system in terms of providing a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data is described.
Chapter 16: text mining for translational bioinformatics.
Cohen, K Bretonnel; Hunter, Lawrence E
2013-04-01
Text mining for translational bioinformatics is a new field with tremendous research potential. It is a subfield of biomedical natural language processing that concerns itself directly with the problem of relating basic biomedical research to clinical practice, and vice versa. Applications of text mining fall both into the category of T1 translational research-translating basic science results into new interventions-and T2 translational research, or translational research for public health. Potential use cases include better phenotyping of research subjects, and pharmacogenomic research. A variety of methods for evaluating text mining applications exist, including corpora, structured test suites, and post hoc judging. Two basic principles of linguistic structure are relevant for building text mining applications. One is that linguistic structure consists of multiple levels. The other is that every level of linguistic structure is characterized by ambiguity. There are two basic approaches to text mining: rule-based, also known as knowledge-based; and machine-learning-based, also known as statistical. Many systems are hybrids of the two approaches. Shared tasks have had a strong effect on the direction of the field. Like all translational bioinformatics software, text mining software for translational bioinformatics can be considered health-critical and should be subject to the strictest standards of quality assurance and software testing.
NASA Astrophysics Data System (ADS)
Saroinsong, T.; A. S Kondoj, M.; Kandiyoh, G.; Pontoh, G.
2018-01-01
The State Polytechnic of Manado (Polimdo) is one of the reliable institutions in North Sulawesi that first implemented ISO 9001. But the accreditation of the institution has not been satisfactory, it means there is still much to be prepared to achieve the expected target. One of the criteria of assessment of institutional accreditation is related to research activities and social work in accordance with the standard seven. Data documentation systems related to research activities and social work are not well integrated and well documented in all existing work units. This causes the process of gathering information related to the activities and the results of research and social work in order to support the accreditation activities of the institution is still not efficient. This study aims to build an integrated software in all work units in Polimdo to obtain documentation and data synchronization in support of activities or reporting of documents accreditation institution in accordance with standard seven specifically in terms of submission of research proposal and dedication. The software will be developed using RUP method with analysis using data flow diagram and ERM so that the result of this research is documentation and synchronization of data and information of research activity and community service which can be used in preparing documents report for accreditation institution.
Proceedings of the Workshop on software tools for distributed intelligent control systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herget, C.J.
1990-09-01
The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to the identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can formmore » the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and compute science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.« less
User-Design: A Case Application in Health Care Training.
ERIC Educational Resources Information Center
Carr-Chellman, Alison; Cuyar, Craig; Breman, Jeroen
1998-01-01
Discussion of user-design as a theoretical process for creating training, software, and computer systems focuses on a case study that describes user-design methodology in home health care through the context of diffusing a new laptop patient-record-keeping system with home nurses. Research needs and implications for the design process are…
A flexible continuous-variable QKD system using off-the-shelf components
NASA Astrophysics Data System (ADS)
Comandar, Lucian C.; Brunner, Hans H.; Bettelli, Stefano; Fung, Fred; Karinou, Fotini; Hillerkuss, David; Mikroulis, Spiros; Wang, Dawei; Kuschnerov, Maxim; Xie, Changsong; Poppe, Andreas; Peev, Momtchil
2017-10-01
We present the development of a robust and versatile CV-QKD architecture based on commercially available optical and electronic components. The system uses a pilot tone for phase synchronization with a local oscillator, as well as local feedback loops to mitigate frequency and polarization drifts. Transmit and receive-side digital signal processing is performed fully in software, allowing for rapid protocol reconfiguration. The quantum link is complemented with a software stack for secure-key processing, key storage and encrypted communication. All these features allow for the system to be at the same time a prototype for a future commercial product and a research platform.
NASA Technical Reports Server (NTRS)
Leach, Ronald J.
1997-01-01
The purpose of this project was to study the feasibility of reusing major components of a software system that had been used to control the operations of a spacecraft launched in the 1980s. The study was done in the context of a ground data processing system that was to be rehosted from a large mainframe to an inexpensive workstation. The study concluded that a systematic approach using inexpensive tools could aid in the reengineering process by identifying a set of certified reusable components. The study also developed procedures for determining duplicate versions of software, which were created because of inadequate naming conventions. Such procedures reduced reengineering costs by approximately 19.4 percent.
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
Verification and Validation of Neural Networks for Aerospace Systems
NASA Technical Reports Server (NTRS)
Mackall, Dale; Nelson, Stacy; Schumman, Johann; Clancy, Daniel (Technical Monitor)
2002-01-01
The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: 1) Overview of Adaptive Systems; and 2) V&V Processes/Methods.
Verification and Validation of Neural Networks for Aerospace Systems
NASA Technical Reports Server (NTRS)
Mackall, Dale; Nelson, Stacy; Schumann, Johann
2002-01-01
The Dryden Flight Research Center V&V working group and NASA Ames Research Center Automated Software Engineering (ASE) group collaborated to prepare this report. The purpose is to describe V&V processes and methods for certification of neural networks for aerospace applications, particularly adaptive flight control systems like Intelligent Flight Control Systems (IFCS) that use neural networks. This report is divided into the following two sections: Overview of Adaptive Systems and V&V Processes/Methods.
NASA Astrophysics Data System (ADS)
Garov, A. S.; Karachevtseva, I. P.; Matveev, E. V.; Zubarev, A. E.; Florinsky, I. V.
2016-06-01
We are developing a unified distributed communication environment for processing of spatial data which integrates web-, desktop- and mobile platforms and combines volunteer computing model and public cloud possibilities. The main idea is to create a flexible working environment for research groups, which may be scaled according to required data volume and computing power, while keeping infrastructure costs at minimum. It is based upon the "single window" principle, which combines data access via geoportal functionality, processing possibilities and communication between researchers. Using an innovative software environment the recently developed planetary information system (http://cartsrv.mexlab.ru/geoportal) will be updated. The new system will provide spatial data processing, analysis and 3D-visualization and will be tested based on freely available Earth remote sensing data as well as Solar system planetary images from various missions. Based on this approach it will be possible to organize the research and representation of results on a new technology level, which provides more possibilities for immediate and direct reuse of research materials, including data, algorithms, methodology, and components. The new software environment is targeted at remote scientific teams, and will provide access to existing spatial distributed information for which we suggest implementation of a user interface as an advanced front-end, e.g., for virtual globe system.
Tautomerism in chemical information management systems
NASA Astrophysics Data System (ADS)
Warr, Wendy A.
2010-06-01
Tautomerism has an impact on many of the processes in chemical information management systems including novelty checking during registration into chemical structure databases; storage of structures; exact and substructure searching in chemical structure databases; and depiction of structures retrieved by a search. The approaches taken by 27 different software vendors and database producers are compared. It is hoped that this comparison will act as a discussion document that could ultimately improve databases and software for researchers in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salama, A.; Mikhail, M.
Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation, (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, evaluation of density and size partition characteristics and attrition curves, and (3) generation of graphics output. The Separation ChARacteristics Estimation software packages (SCARE) are developed to balance raw density or size separation data. The cases of density and size separation data are considered. Themore » generated balanced data can take the balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The computer softwares described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).« less
Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets
Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi
2013-01-01
Selective or Multiple Reaction monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred to development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution, thus this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368
NASA Technical Reports Server (NTRS)
Jakeman, Hali L.
2013-01-01
The Ka-Band Object Observation and Monitoring, or KaBOOM, project is designed mainly to track and characterize near Earth objects. However, a smaller goal of the project would be to monitor pulsars and study their radio frequency signals for use as a clock in interstellar travel. The use of pulsars and their timing accuracy has been studied for decades, but never in the Ka-band of the radio frequency spectrum. In order to begin the use of KaBOOM for this research, the control systems need to be analyzed to ensure its capability. Flaws in the control documentation leave it unclear as to whether the control software processes coordinates from the J200 epoch. This experiment will examine the control software of the Intertronic 12m antennas used for the KaBOOM project and detail its capabilities in its "equatorial mode." The antennas will be pointed at 4 chosen points in the sky on several days while probing the virtual azimuth and elevation (horizon coordinate) registers. The input right ascension and declination coordinates will then be converted separately from the control software to horizontal coordinates and compared, thus determining the ability of the control software to process equatorial coordinates.
Constraints and Opportunities in GCM Model Development
NASA Technical Reports Server (NTRS)
Schmidt, Gavin; Clune, Thomas
2010-01-01
Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from bottom of the ocean to the mesopause and are used for seasonal to multi-million year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by the model developers to incrementally and systematically improve underlying implementations and reverse the long term trend of increasing development overhead. However, these methodologies cannot be applied blindly, but rather must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.
NASA Technical Reports Server (NTRS)
Wilson, Larry
1991-01-01
There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
NASA's computer science research program
NASA Technical Reports Server (NTRS)
Larsen, R. L.
1983-01-01
Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
End-to-end observatory software modeling using domain specific languages
NASA Astrophysics Data System (ADS)
Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José
2014-07-01
The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, that facilitate the construction of technical specifications in a uniform way, that facilitate communication between developers and domain experts and that provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean; Plaisant, Catherine; Whiting, Mark A.
The evaluation of visual analytics environments was a topic in Illuminating the Path [Thomas 2005] as a critical aspect of moving research into practice. For a thorough understanding of the utility of the systems available, evaluation not only involves assessing the visualizations, interactions or data processing algorithms themselves, but also the complex processes that a tool is meant to support (such as exploratory data analysis and reasoning, communication through visualization, or collaborative data analysis [Lam 2012; Carpendale 2007]). Researchers and practitioners in the field have long identified many of the challenges faced when planning, conducting, and executing an evaluation ofmore » a visualization tool or system [Plaisant 2004]. Evaluation is needed to verify that algorithms and software systems work correctly and that they represent improvements over the current infrastructure. Additionally to effectively transfer new software into a working environment, it is necessary to ensure that the software has utility for the end-users and that the software can be incorporated into the end-user’s infrastructure and work practices. Evaluation test beds require datasets, tasks, metrics and evaluation methodologies. As noted in [Thomas 2005] it is difficult and expensive for any one researcher to setup an evaluation test bed so in many cases evaluation is setup for communities of researchers or for various research projects or programs. Examples of successful community evaluations can be found [Chinchor 1993; Voorhees 2007; FRGC 2012]. As visual analytics environments are intended to facilitate the work of human analysts, one aspect of evaluation needs to focus on the utility of the software to the end-user. This requires representative users, representative tasks, and metrics that measure the utility to the end-user. This is even more difficult as now one aspect of the test methodology is access to representative end-users to participate in the evaluation. In many cases the sensitive nature of data and tasks and difficult access to busy analysts puts even more of a burden on researchers to complete this type of evaluation. User-centered design goes beyond evaluation and starts with the user [Beyer 1997, Shneiderman 2009]. Having some knowledge of the type of data, tasks, and work practices helps researchers and developers know the correct paths to pursue in their work. When access to the end-users is problematic at best and impossible at worst, user-centered design becomes difficult. Researchers are unlikely to go to work on the type of problems faced by inaccessible users. Commercial vendors have difficulties evaluating and improving their products when they cannot observe real users working with their products. In well-established fields such as web site design or office software design, user-interface guidelines have been developed based on the results of empirical studies or the experience of experts. Guidelines can speed up the design process and replace some of the need for observation of actual users [heuristics review references]. In 2006 when the visual analytics community was initially getting organized, no such guidelines existed. Therefore, we were faced with the problem of developing an evaluation framework for the field of visual analytics that would provide representative situations and datasets, representative tasks and utility metrics, and finally a test methodology which would include a surrogate for representative users, increase interest in conducting research in the field, and provide sufficient feedback to the researchers so that they could improve their systems.« less
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of 154 hazardous conditions could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Free and Open Source Software for Geospatial in the field of planetary science
NASA Astrophysics Data System (ADS)
Frigeri, A.
2012-12-01
Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects increased the interest on tools, procedures and methods to handle spatially-related information. Free Open Source Software projects devoted to geospatial data handling are gaining a good success as the use of interoperable formats and protocols allow the user to choose what pipeline of tools and libraries is needed to solve a particular task, adapting the software scene to his specific problem. In particular, the Free Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part to the development process of these software projects, as this represent a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are being distributed along with the source code and the interaction between user and developer is often very strict, creating a continuum between these two figures. A very widely spread library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions enabled the support to other popular data formats used in planetary science, as the Vicar one. The use of Geographic Information System in planetary science is now diffused, and Free Open Source GIS, open GIS formats and network protocols allow to extend existing tools and methods developed to solve Earth based problems, also to the case of the study of solar system bodies. A day in the working life of a researcher using Free Open Source Software for geospatial will be presented, as well as benefits and solutions to possible detriments coming from the effort required by using, supporting and contributing.
A Matrix Approach to Software Process Definition
NASA Technical Reports Server (NTRS)
Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)
2000-01-01
The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
The Simple Video Coder: A free tool for efficiently coding social video data.
Barto, Daniel; Bird, Clark W; Hamilton, Derek A; Fink, Brandi C
2017-08-01
Videotaping of experimental sessions is a common practice across many disciplines of psychology, ranging from clinical therapy, to developmental science, to animal research. Audio-visual data are a rich source of information that can be easily recorded; however, analysis of the recordings presents a major obstacle to project completion. Coding behavior is time-consuming and often requires ad-hoc training of a student coder. In addition, existing software is either prohibitively expensive or cumbersome, which leaves researchers with inadequate tools to quickly process video data. We offer the Simple Video Coder-free, open-source software for behavior coding that is flexible in accommodating different experimental designs, is intuitive for students to use, and produces outcome measures of event timing, frequency, and duration. Finally, the software also offers extraction tools to splice video into coded segments suitable for training future human coders or for use as input for pattern classification algorithms.
DBCC Software as Database for Collisional Cross-Sections
NASA Astrophysics Data System (ADS)
Moroz, Daniel; Moroz, Paul
2014-10-01
Interactions of species, such as atoms, radicals, molecules, electrons, and photons, in plasmas used for materials processing could be very complex, and many of them could be described in terms of collisional cross-sections. Researchers involved in plasma simulations must select reasonable cross-sections for collisional processes for implementing them into their simulation codes to be able to correctly simulate plasmas. However, collisional cross-section data are difficult to obtain, and, for some collisional processes, the cross-sections are still not known. Data on collisional cross-sections can be obtained from numerous sources including numerical calculations, experiments, journal articles, conference proceedings, scientific reports, various universities' websites, national labs and centers specifically devoted to collecting data on cross-sections. The cross-sections data received from different sources could be partial, corresponding to limited energy ranges, or could even not be in agreement. The DBCC software package was designed to help researchers in collecting, comparing, and selecting cross-sections, some of which could be constructed from others or chosen as defaults. This is important as different researchers may place trust in different cross-sections or in different sources. We will discuss the details of DBCC and demonstrate how it works and why it is beneficial to researchers working on plasma simulations.
Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement
NASA Technical Reports Server (NTRS)
Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.
2010-01-01
The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain the high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post processing are also presented. Several challenges of extremely-fast, high volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring channel data in real-time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.
Evaluation of Visualization Software
NASA Technical Reports Server (NTRS)
Globus, Al; Uselton, Sam
1995-01-01
Visualization software is widely used in scientific and engineering research. But computed visualizations can be very misleading, and the errors are easy to miss. We feel that the software producing the visualizations must be thoroughly evaluated and the evaluation process as well as the results must be made available. Testing and evaluation of visualization software is not a trivial problem. Several methods used in testing other software are helpful, but these methods are (apparently) often not used. When they are used, the description and results are generally not available to the end user. Additional evaluation methods specific to visualization must also be developed. We present several useful approaches to evaluation, ranging from numerical analysis of mathematical portions of algorithms to measurement of human performance while using visualization systems. Along with this brief survey, we present arguments for the importance of evaluations and discussions of appropriate use of some methods.
Towards a controlled vocabulary on software engineering education
NASA Astrophysics Data System (ADS)
Pizard, Sebastián; Vallespir, Diego
2017-11-01
Software engineering is the discipline that develops all the aspects of the production of software. Although there are guidelines about what topics to include in a software engineering curricula, it is usually unclear which are the best methods to teach them. In any science discipline the construction of a classification schema is a common approach to understand a thematic area. This study examines previous publications in software engineering education to obtain a first controlled vocabulary (a more formal definition of a classification schema) in the field. Publications from 1988 to 2014 were collected and processed using automatic clustering techniques and the outcomes were analysed manually. The result is an initial controlled vocabulary with a taxonomy form with 43 concepts that were identified as the most used in the research publications. We present the classification of the concepts in three facets: 'what to teach', 'how to teach' and 'where to teach' and the evolution of concepts over time.
GIS Facility and Services at the Ronald Greeley Center for Planetary Studies
NASA Astrophysics Data System (ADS)
Nelson, D. M.; Williams, D. A.
2017-06-01
At the RGCPS, we established a Geographic Information Systems (GIS) computer laboratory, where we instruct researchers how to use GIS and image processing software. Seminars demonstrate viewing, integrating, and digitally mapping planetary data.
Optimized mobile retroreflectivity unit data processing algorithms : [project summary].
DOT National Transportation Integrated Search
2017-06-01
Researchers examined both hardware and : software components of the MRU to determine : where improvements could be made. The MRUs laser makes one-meter sweeps, which : detect retroreflective striping and measure its reflectivity. The MRU also dete...
Software Engineering Guidebook
NASA Technical Reports Server (NTRS)
Connell, John; Wenneson, Greg
1993-01-01
The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.
Basic to Advanced InSAR Processing: GMTSAR
NASA Astrophysics Data System (ADS)
Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.
2017-12-01
Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
A study about teaching quadratic functions using mathematical models and free software
NASA Astrophysics Data System (ADS)
Nepomucena, T. V.; da Silva, A. C.; Jardim, D. F.; da Silva, J. M.
2017-12-01
In the face of the reality of teaching Mathematics in Basic Education in Brazil, specially relating teach functions focusing their relevance to the student’s academic development in Basic and Superior Education, this work proposes the use of educational software to help the teaching of functions in Basic Education since the computers and software show as an outstanding option to help the teaching and learning processes. On the other hand, the study also proposes the use of Didactic Transposition as a methodology investigation and research. Along with this survey, some teaching interventions were applied to detect the main difficulties in the teaching process of functions in the Basic Education, analyzing the results obtained along the interventions in a qualitative form. Considering the discussion of the results at the end of the didactic interventions, it was verified that the results obtained were satisfactory.
Security Risks of Cloud Computing and Its Emergence as 5th Utility Service
NASA Astrophysics Data System (ADS)
Ahmad, Mushtaq
Cloud Computing is being projected by the major cloud services provider IT companies such as IBM, Google, Yahoo, Amazon and others as fifth utility where clients will have access for processing those applications and or software projects which need very high processing speed for compute intensive and huge data capacity for scientific, engineering research problems and also e- business and data content network applications. These services for different types of clients are provided under DASM-Direct Access Service Management based on virtualization of hardware, software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments for Cloud Computing and Hardware/Software configuration of the cloud paradigm. The paper also examines the vital aspects of security risks projected by IT Industry experts, cloud clients. The paper also highlights the cloud provider's response to cloud security risks.
Matching software practitioner needs to researcher activities
NASA Technical Reports Server (NTRS)
Feather, M. S.; Menzies, T.; Connelly, J. R.
2003-01-01
We present an approach to matching software practitioners' needs to software researchers' activities. It uses an accepted taxonomical software classfication scheme as intermediary, in terms of which practitioners express needs, and researchers express activities.
Computational science: shifting the focus from tools to models
Hinsen, Konrad
2014-01-01
Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.
1993-01-01
A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.
NASA Astrophysics Data System (ADS)
Comyn-Wattiau, Isabelle; Thalheim, Bernhard
Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects including quality definition and quality models, and practical/empirical aspects such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance is also depending on the application area and the specific requirements in applications such as health sector, logistics, public sector, financial sector, manufacturing, services, e-commerce, software, etc. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance is typically requiring resources and has therefore beside its benefits a computational and economical trade-off. It is therefore also based on compromising between the value of quality data and the cost for quality assurance.
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to make the software which on the basis on image of greenhouse tomato allows for the extraction of its characteristics. Data gathered during the image analysis and processing were used to build learning sets of artificial neural networks. Program enables to process pictures in jpeg format, acquisition of statistical information of the picture and export them to an external file. Produced software is intended to batch analyze collected research material and obtained information saved as a csv file. Program allows for analysis of 33 independent parameters implicitly to describe tested image. The application is dedicated to processing and image analysis of greenhouse tomatoes. The program can be used for analysis of other fruits and vegetables of a spherical shape.
Mousetrap: An integrated, open-source mouse-tracking package.
Kieslich, Pascal J; Henninger, Felix
2017-10-01
Mouse-tracking - the analysis of mouse movements in computerized experiments - is becoming increasingly popular in the cognitive sciences. Mouse movements are taken as an indicator of commitment to or conflict between choice options during the decision process. Using mouse-tracking, researchers have gained insight into the temporal development of cognitive processes across a growing number of psychological domains. In the current article, we present software that offers easy and convenient means of recording and analyzing mouse movements in computerized laboratory experiments. In particular, we introduce and demonstrate the mousetrap plugin that adds mouse-tracking to OpenSesame, a popular general-purpose graphical experiment builder. By integrating with this existing experimental software, mousetrap allows for the creation of mouse-tracking studies through a graphical interface, without requiring programming skills. Thus, researchers can benefit from the core features of a validated software package and the many extensions available for it (e.g., the integration with auxiliary hardware such as eye-tracking, or the support of interactive experiments). In addition, the recorded data can be imported directly into the statistical programming language R using the mousetrap package, which greatly facilitates analysis. Mousetrap is cross-platform, open-source and available free of charge from https://github.com/pascalkieslich/mousetrap-os .
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
ERIC Educational Resources Information Center
Read, Sue; Nte, Sol; Corcoran, Patsy; Stephens, Richard
2013-01-01
Background: Loss is a universal experience and death is perceived as the ultimate loss. The overarching aim of this research is to produce a qualitative, flexible, interactive, computerised tool to support the facilitation of emotional expressions around loss for people with intellectual disabilities. This paper explores the process of using…
Organizational Analysis of the United States Army Evaluation Center
2014-12-01
analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing , metrics review... Research Development Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as “ one of the... research , design , development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis
A multimedia perioperative record keeper for clinical research.
Perrino, A C; Luther, M A; Phillips, D B; Levin, F L
1996-05-01
To develop a multimedia perioperative recordkeeper that provides: 1. synchronous, real-time acquisition of multimedia data, 2. on-line access to the patient's chart data, and 3. advanced data analysis capabilities through integrated, multimedia database and analysis applications. To minimize cost and development time, the system design utilized industry standard hardware components and graphical. software development tools. The system was configured to use a Pentium PC complemented with a variety of hardware interfaces to external data sources. These sources included physiologic monitors with data in digital, analog, video, and audio as well as paper-based formats. The development process was guided by trials in over 80 clinical cases and by the critiques from numerous users. As a result of this process, a suite of custom software applications were created to meet the design goals. The Perioperative Data Acquisition application manages data collection from a variety of physiological monitors. The Charter application provides for rapid creation of an electronic medical record from the patient's paper-based chart and investigator's notes. The Multimedia Medical Database application provides a relational database for the organization and management of multimedia data. The Triscreen application provides an integrated data analysis environment with simultaneous, full-motion data display. With recent technological advances in PC power, data acquisition hardware, and software development tools, the clinical researcher now has the ability to collect and examine a more complete perioperative record. It is hoped that the description of the MPR and its development process will assist and encourage others to advance these tools for perioperative research.
CNC Machining Of The Complex Copper Electrodes
NASA Astrophysics Data System (ADS)
Popan, Ioan Alexandru; Balc, Nicolae; Popan, Alina
2015-07-01
This paper presents the machining process of the complex copper electrodes. Machining of the complex shapes in copper is difficult because this material is soft and sticky. This research presents the main steps for processing those copper electrodes at a high dimensional accuracy and a good surface quality. Special tooling solutions are required for this machining process and optimal process parameters have been found for the accurate CNC equipment, using smart CAD/CAM software.
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 10, October 2008
2008-10-01
proprietary modeling offerings, there is considerable conver- gence around Business Process Modeling Notation ( BPMN ). The research also found strong...support across vendors for the Business Process Execution Language standard, though there is also emerging support for direct execution of BPMN through...the use of the XML Process Definition Language, an XML serialization of BPMN . Many vendors also provide the needed moni- toring of those processes at
2010-08-14
Jeffrey Beyon, left, and Paul Joseph Petzar, right, from NASA's Langley Research Center, work with DAWN Air Data Acquisition and Processing software aboard NASA's DC-8 research aircraft, Sunday, Aug. 15, 2010, in support of the GRIP experiment at Fort Lauderdale International Airport in Fort Lauderdale, Fla. The Genesis and Rapid Intensification Processes (GRIP) experiment is a NASA Earth science field experiment in 2010 that is being conducted to better understand how tropical storms form and develop into major hurricanes. Photo Credit: (NASA/Paul E. Alers)
NASA Technical Reports Server (NTRS)
Baez, Marivell; Vickerman, Mary; Choo, Yung
2000-01-01
SmaggIce (Surface Modeling And Grid Generation for Iced Airfoils) is one of NASNs aircraft icing research codes developed at the Glenn Research Center. It is a software toolkit used in the process of aerodynamic performance prediction of iced airfoils. It includes tools which complement the 2D grid-based Computational Fluid Dynamics (CFD) process: geometry probing; surface preparation for gridding: smoothing and re-discretization of geometry. Future releases will also include support for all aspects of gridding: domain decomposition; perimeter discretization; grid generation and modification.
Brown, Jason L; Bennett, Joseph R; French, Connor M
2017-01-01
SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenant facilitates careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes carefully processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates-to only name a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exists for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.
nStudy: A System for Researching Information Problem Solving
ERIC Educational Resources Information Center
Winne, Philip H.; Nesbit, John C.; Popowich, Fred
2017-01-01
A bottleneck in gathering big data about learning is instrumentation designed to record data about processes students use to learn and information on which those processes operate. The software system nStudy fills this gap. nStudy is an extension to the Chrome web browser plus a server side database for logged trace data plus peripheral modules…
Analysing Culture and Interculture in Saudi EFL Textbooks: A Corpus Linguistic Approach
ERIC Educational Resources Information Center
Almujaiwel, Sultan
2018-01-01
This paper combines corpus processing tools to investigate the cultural elements of Saudi education of English as a foreign language (EFL). The latest Saudi EFL textbooks (2016 onwards) are available in researchable PDF formats. This helps process them through corpus search software tools. The method adopted is based on analysing 20 cultural…
ERIC Educational Resources Information Center
Gebremedhin, Mewcha Amha; Fenta, Ayele Almaw
2015-01-01
Rapid growth and improvement in ICT have led to the diffusion of technology in education. The purpose of this study is to investigate teachers' perception on integrating ICT in teaching-learning process. The research questions sought to measure teachers' software usage as well as other instructional tools and materials, preferences for…
ERIC Educational Resources Information Center
Monk, Ellen F.; Lycett, Mark
2016-01-01
Enterprise Resource Planning Systems (ERP) are very large and complex software packages that run every aspect of an organization. Increasingly, ERP systems are being used in higher education as one way to teach business processes, essential knowledge for students competing in today's business environment. Past research attempting to measure…
Systems Engineering Processes at NASA/SR-71 Pratt and Whitney J58 Engine
NASA Technical Reports Server (NTRS)
Donastorg, Cristina
2010-01-01
This summer I was given several opportunities at NASA's Dryden Flight Research Center (DFRC). The first opportunity was given to me by a Senior Propulsion Engineer, Kurtt Kloesel, to work in a specialized engineering discipline. My task was to research the Pratt & Whitney J58 engine that was used on the SR-71 Blackbird. I entered the data I collected into engine modeling software programs in order to receive certain outputs, such as net thrust. I also had to take a "crash course" in propulsion in order to better understand the research I was performing. To facilitate my understanding of propulsion principals and formulas, I worked many problems out of thermodynamics and propulsion textbooks and entered the given values of various situations into the modeling software.
Application of Neutron Tomography in Culture Heritage research.
Mongy, T
2014-02-01
Neutron Tomography (NT) investigation of Culture Heritages (CH) is an efficient tool for understanding the culture of ancient civilizations. Neutron imaging (NI) is a-state-of-the-art non-destructive tool in the area of CH and plays an important role in the modern archeology. The NI technology can be widely utilized in the field of elemental analysis. At Egypt Second Research Reactor (ETRR-2), a collimated Neutron Radiography (NR) beam is employed for neutron imaging purposes. A digital CCD camera is utilized for recording the beam attenuation in the sample. This helps for the detection of hidden objects and characterization of material properties. Research activity can be extended to use computer software for quantitative neutron measurement. Development of image processing algorithms can be used to obtain high quality images. In this work, full description of ETRR-2 was introduced with up to date neutron imaging system as well. Tomographic investigation of a clay forged artifact represents CH object was studied by neutron imaging methods in order to obtain some hidden information and highlight some attractive quantitative measurements. Computer software was used for imaging processing and enhancement. Also the Astra Image 3.0 Pro software was employed for high precise measurements and imaging enhancement using advanced algorithms. This work increased the effective utilization of the ETRR-2 Neutron Radiography/Tomography (NR/T) technique in Culture Heritages activities. © 2013 Elsevier Ltd. All rights reserved.
Heat and Mass Transfer Processes in Scrubber of Flue Gas Heat Recovery Device
NASA Astrophysics Data System (ADS)
Veidenbergs, Ivars; Blumberga, Dagnija; Vigants, Edgars; Kozuhars, Grigorijs
2010-01-01
The paper deals with the heat and mass transfer process research in a flue gas heat recovery device, where complicated cooling, evaporation and condensation processes are taking place simultaneously. The analogy between heat and mass transfer is used during the process of analysis. In order to prepare a detailed process analysis based on heat and mass process descriptive equations, as well as the correlation for wet gas parameter calculation, software in the
Automated Software Vulnerability Analysis
NASA Astrophysics Data System (ADS)
Sezer, Emre C.; Kil, Chongkyung; Ning, Peng
Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effortinvolved and the slow analysis process are unfavorable for timely defensive measure to be deployed. The problem is exacerbated by zero-day attacks.
Software for visualization, analysis, and manipulation of laser scan images
NASA Astrophysics Data System (ADS)
Burnsides, Dennis B.
1997-03-01
The recent introduction of laser surface scanning to scientific applications presents a challenge to computer scientists and engineers. Full utilization of this two- dimensional (2-D) and three-dimensional (3-D) data requires advances in techniques and methods for data processing and visualization. This paper explores the development of software to support the visualization, analysis and manipulation of laser scan images. Specific examples presented are from on-going efforts at the Air Force Computerized Anthropometric Research and Design (CARD) Laboratory.
NASA Astrophysics Data System (ADS)
Liang, Likai; Bi, Yushen
Considered on the distributed network management system's demand of high distributives, extensibility and reusability, a framework model of Three-tier distributed network management system based on COM/COM+ and DNA is proposed, which adopts software component technology and N-tier application software framework design idea. We also give the concrete design plan of each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.
Validation of vision-based obstacle detection algorithms for low-altitude helicopter flight
NASA Technical Reports Server (NTRS)
Suorsa, Raymond; Sridhar, Banavar
1991-01-01
A validation facility being used at the NASA Ames Research Center is described which is aimed at testing vision based obstacle detection and range estimation algorithms suitable for low level helicopter flight. The facility is capable of processing hundreds of frames of calibrated multicamera 6 degree-of-freedom motion image sequencies, generating calibrated multicamera laboratory images using convenient window-based software, and viewing range estimation results from different algorithms along with truth data using powerful window-based visualization software.
Software Suite to Support In-Flight Characterization of Remote Sensing Systems
NASA Technical Reports Server (NTRS)
Stanley, Thomas; Holekamp, Kara; Gasser, Gerald; Tabor, Wes; Vaughan, Ronald; Ryan, Robert; Pagnutti, Mary; Blonski, Slawomir; Kenton, Ross
2014-01-01
A characterization software suite was developed to facilitate NASA's in-flight characterization of commercial remote sensing systems. Characterization of aerial and satellite systems requires knowledge of ground characteristics, or ground truth. This information is typically obtained with instruments taking measurements prior to or during a remote sensing system overpass. Acquired ground-truth data, which can consist of hundreds of measurements with different data formats, must be processed before it can be used in the characterization. Accurate in-flight characterization of remote sensing systems relies on multiple field data acquisitions that are efficiently processed, with minimal error. To address the need for timely, reproducible ground-truth data, a characterization software suite was developed to automate the data processing methods. The characterization software suite is engineering code, requiring some prior knowledge and expertise to run. The suite consists of component scripts for each of the three main in-flight characterization types: radiometric, geometric, and spatial. The component scripts for the radiometric characterization operate primarily by reading the raw data acquired by the field instruments, combining it with other applicable information, and then reducing it to a format that is appropriate for input into MODTRAN (MODerate resolution atmospheric TRANsmission), an Air Force Research Laboratory-developed radiative transport code used to predict at-sensor measurements. The geometric scripts operate by comparing identified target locations from the remote sensing image to known target locations, producing circular error statistics defined by the Federal Geographic Data Committee Standards. The spatial scripts analyze a target edge within the image, and produce estimates of Relative Edge Response and the value of the Modulation Transfer Function at the Nyquist frequency. The software suite enables rapid, efficient, automated processing of ground truth data, which has been used to provide reproducible characterizations on a number of commercial remote sensing systems. Overall, this characterization software suite improves the reliability of ground-truth data processing techniques that are required for remote sensing system in-flight characterizations.
Design and Testing of Space Telemetry SCA Waveform
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Handler, Louis M.; Quinn, Todd M.
2006-01-01
A Software Communications Architecture (SCA) Waveform for space telemetry is being developed at the NASA Glenn Research Center (GRC). The space telemetry waveform is implemented in a laboratory testbed consisting of general purpose processors, field programmable gate arrays (FPGAs), analog-to-digital converters (ADCs), and digital-to-analog converters (DACs). The radio hardware is integrated with an SCA Core Framework and other software development tools. The waveform design is described from both the bottom-up signal processing and top-down software component perspectives. Simulations and model-based design techniques used for signal processing subsystems are presented. Testing with legacy hardware-based modems verifies proper design implementation and dynamic waveform operations. The waveform development is part of an effort by NASA to define an open architecture for space based reconfigurable transceivers. Use of the SCA as a reference has increased understanding of software defined radio architectures. However, since space requirements put a premium on size, mass, and power, the SCA may be impractical for today s space ready technology. Specific requirements for an SCA waveform and other lessons learned from this development are discussed.
Combining analysis with optimization at Langley Research Center. An evolutionary process
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1982-01-01
The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. This effort results in the software system used that is controlled with a special-purpose language, communicates with a data management system, and is easily modified for adding new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.
Collected software engineering papers, volume 2
NASA Technical Reports Server (NTRS)
1983-01-01
Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.
Research on radiation exposure from CT part of hybrid camera and diagnostic CT
NASA Astrophysics Data System (ADS)
Solný, Pavel; Zimák, Jaroslav
2014-11-01
Research on radiation exposure from CT part of hybrid camera in seven different Departments of Nuclear Medicine (DNM) was conducted. Processed data and effective dose (E) estimations led to the idea of phantom verification and comparison of absorbed doses and software estimation. Anonymous data from about 100 examinations from each DNM was gathered. Acquired data was processed and utilized by dose estimation programs (ExPACT, ImPACT, ImpactDose) with respect to the type of examination and examination procedures. Individual effective doses were calculated using enlisted programs. Preserving the same procedure in dose estimation process allows us to compare the resulting E. Some differences and disproportions during dose estimation led to the idea of estimated E verification. Consequently, two different sets of about 100 of TLD 100H detectors were calibrated for measurement inside the Aldersnon RANDO Anthropomorphic Phantom. Standard examination protocols were examined using a 2 Slice CT- part of hybrid SPECT/CT. Moreover, phantom exposure from body examining protocol for 32 Slice and 64 Slice diagnostic CT scanner was also verified. Absorbed dose (DT,R) measured using TLD detectors was compared with software estimation of equivalent dose HT values, computed by E estimation software. Though, only limited number of cavities for detectors enabled measurement within the regions of lung, liver, thyroid and spleen-pancreas region, some basic comparison is possible.
Crowell, Kevin L; Slysz, Gordon W; Baker, Erin S; LaMarche, Brian L; Monroe, Matthew E; Ibrahim, Yehia M; Payne, Samuel H; Anderson, Gordon A; Smith, Richard D
2013-11-01
The addition of ion mobility spectrometry to liquid chromatography-mass spectrometry experiments requires new, or updated, software tools to facilitate data processing. We introduce a command line software application LC-IMS-MS Feature Finder that searches for molecular ion signatures in multidimensional liquid chromatography-ion mobility spectrometry-mass spectrometry (LC-IMS-MS) data by clustering deisotoped peaks with similar monoisotopic mass, charge state, LC elution time and ion mobility drift time values. The software application includes an algorithm for detecting and quantifying co-eluting chemical species, including species that exist in multiple conformations that may have been separated in the IMS dimension. LC-IMS-MS Feature Finder is available as a command-line tool for download at http://omics.pnl.gov/software/LC-IMS-MS_Feature_Finder.php. The Microsoft.NET Framework 4.0 is required to run the software. All other dependencies are included with the software package. Usage of this software is limited to non-profit research to use (see README). rds@pnnl.gov. Supplementary data are available at Bioinformatics online.
Instrumentation complex for Langley Research Center's National Transonic Facility
NASA Technical Reports Server (NTRS)
Russell, C. H.; Bryant, C. S.
1977-01-01
The instrumentation discussed in the present paper was developed to ensure reliable operation for a 2.5-meter cryogenic high-Reynolds-number fan-driven transonic wind tunnel. It will incorporate four CPU's and associated analog and digital input/output equipment, necessary for acquiring research data, controlling the tunnel parameters, and monitoring the process conditions. Connected in a multipoint distributed network, the CPU's will support data base management and processing; research measurement data acquisition and display; process monitoring; and communication control. The design will allow essential processes to continue, in the case of major hardware failures, by switching input/output equipment to alternate CPU's and by eliminating nonessential functions. It will also permit software modularization by CPU activity and thereby reduce complexity and development time.
Bietz, Stefan; Inhester, Therese; Lauck, Florian; Sommer, Kai; von Behren, Mathias M; Fährrolfes, Rainer; Flachsenberg, Florian; Meyder, Agnes; Nittinger, Eva; Otto, Thomas; Hilbig, Matthias; Schomburg, Karen T; Volkamer, Andrea; Rarey, Matthias
2017-11-10
Nowadays, computational approaches are an integral part of life science research. Problems related to interpretation of experimental results, data analysis, or visualization tasks highly benefit from the achievements of the digital era. Simulation methods facilitate predictions of physicochemical properties and can assist in understanding macromolecular phenomena. Here, we will give an overview of the methods developed in our group that aim at supporting researchers from all life science areas. Based on state-of-the-art approaches from structural bioinformatics and cheminformatics, we provide software covering a wide range of research questions. Our all-in-one web service platform ProteinsPlus (http://proteins.plus) offers solutions for pocket and druggability prediction, hydrogen placement, structure quality assessment, ensemble generation, protein-protein interaction classification, and 2D-interaction visualization. Additionally, we provide a software package that contains tools targeting cheminformatics problems like file format conversion, molecule data set processing, SMARTS editing, fragment space enumeration, and ligand-based virtual screening. Furthermore, it also includes structural bioinformatics solutions for inverse screening, binding site alignment, and searching interaction patterns across structure libraries. The software package is available at http://software.zbh.uni-hamburg.de. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Intelligent systems for KSC ground processing
NASA Technical Reports Server (NTRS)
Heard, Astrid E.
1992-01-01
The ground processing and launch of Shuttle vehicles and their payloads is the primary task of Kennedy Space Center. It is a process which is largely manual and contains little inherent automation. Business is conducted today much as it was during previous NASA programs such as Apollo. In light of new programs and decreasing budgets, NASA must find more cost effective ways in which to do business while retaining the quality and safety of activities. Advanced technologies including artificial intelligence could cut manpower and processing time. This paper is an overview of the research and development in Al technology at KSC with descriptions of the systems which have been implemented, as well as a few under development which are promising additions to ground processing software. Projects discussed cover many facets of ground processing activities, including computer sustaining engineering, subsystem monitor and diagnosis tools and launch team assistants. The deployed Al applications have proven an effectiveness which has helped to demonstrate the benefits of utilizing intelligent software in the ground processing task.