NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.
A Role-Playing Game for a Software Engineering Lab: Developing a Product Line
ERIC Educational Resources Information Center
Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio
2012-01-01
Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…
A first-generation software product line for data acquisition systems in astronomy
NASA Astrophysics Data System (ADS)
López-Ruiz, J. C.; Heradio, Rubén; Cerrada Somolinos, José Antonio; Coz Fernandez, José Ramón; López Ramos, Pablo
2008-07-01
This article presents a case study on developing a software product line for data acquisition systems in astronomy based on the Exemplar Driven Development methodology and the Exemplar Flexibilization Language tool. The main strategies to build the software product line are based on the domain commonality and variability, the incremental scope and the use of existing artifacts. It consists of a lean methodology with little impact on the organization, suitable for small projects, which reduces product line start-up time. Software product lines focus on creating a family of products instead of individual products. This approach offers spectacular benefits: it reduces the time to market, preserves the know-how, reduces development costs and increases the quality of new products. The maintenance of the products is also enhanced since all the data acquisition systems share the same product line architecture.
Safeguarding End-User Military Software
2014-12-04
product lines using compositional symbolic execution [17]. Software product lines are families of products defined by feature commonality and variability, with a well-managed asset base. Recent work in testing of software product lines has exploited similarities across development phases to reuse ... feature dependence graph to extract the set of possible interaction trees in a product family. It composes these to incrementally and symbolically ...
Managing the Evolution of an Enterprise Architecture using a MAS-Product-Line Approach
NASA Technical Reports Server (NTRS)
Pena, Joaquin; Hinchey, Michael G.; Resinas, Manuel; Sterritt, Roy; Rash, James L.
2006-01-01
We view an evolutionary system as being a software product line. The core architecture is the unchanging part of the system, and each version of the system may be viewed as a product from the product line. Each "product" may be described as the core architecture with some agent-based additions. The result is a multiagent system software product line. We describe such a Software Product Line-based approach using the MaCMAS agent-oriented methodology. The approach scales to enterprise architectures, as a multiagent system is an appropriate means of representing a changing enterprise architecture and the interaction between components in it.
Software Product Lines: Report of the 2010 US Army Software Product Line Workshop
2010-06-01
requirements and statement of work (SOW) tasks can be included in the request for proposal (RFP) and the contract. 2.2.1 Basic Product Line Acquisition ... SOW tasks in Figure 1. Two additional tasks (at the third tier level) account for sustaining the production capability over the life cycle and ... Acquisition Strategy, RFP and SOW, Initial Product Line Scope, Product Line Business Case, Capability Description Document, Teaming, Product Line ...
Managing Variation in Services in a Software Product Line Context
2010-05-01
Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Software Engineering Institute, Carnegie Mellon University, 1990 ... the systems in the product line, and a plan for building the systems. Product line scope and product line analysis define the boundaries and ... systems, as well as expected ways in which they may vary. Product line analysis applies established modeling techniques to engineer the common and ...
Warth, Benedikt; Rajkai, György; Mandenius, Carl-Fredrik
2010-05-03
Software sensors for monitoring and on-line estimation of critical bioprocess variables have mainly been used with standard bioreactor sensors, such as electrodes and gas analyzers, where algorithms in the software model have generated the desired state variables. In this article we propose that other on-line instruments, such as NIR probes and on-line HPLC, should be used to make more reliable and flexible software sensors. Five software sensor architectures were compared and evaluated: (1) biomass concentration from an on-line NIR probe, (2) biomass concentration from titrant addition, (3) specific growth rate from titrant addition, (4) specific growth rate from the NIR probe, and (5) specific substrate uptake rate and by-product rate from on-line HPLC and NIR probe signals. The software sensors were demonstrated on an Escherichia coli cultivation expressing a recombinant protein, green fluorescent protein (GFP), but the results could be extrapolated to other production organisms and product proteins. We conclude that well-maintained on-line instrumentation (hardware sensors) can increase the potential of software sensors. This would also strongly support the intentions with process analytical technology and quality-by-design concepts. 2010 Elsevier B.V. All rights reserved.
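As an illustration of the fourth sensor architecture described above (specific growth rate from an on-line biomass signal), the following Python sketch estimates mu(t) as the time derivative of ln(biomass). The data are hypothetical, and the simple finite-difference estimator stands in for whatever filtering the authors' actual software sensors use.

```python
import numpy as np

def specific_growth_rate(time_h, biomass_gL):
    """Estimate mu(t) = d(ln X)/dt from a biomass time series.

    Finite differences on the log-transformed biomass; a real software
    sensor would also smooth/filter the noisy on-line signal.
    """
    ln_x = np.log(np.asarray(biomass_gL, dtype=float))
    return np.gradient(ln_x, np.asarray(time_h, dtype=float))

# Hypothetical exponential-growth data (mu = 0.4 1/h) with mild noise
t = np.linspace(0.0, 10.0, 41)
x = 0.1 * np.exp(0.4 * t) * (1.0 + 0.01 * np.random.randn(t.size))
mu = specific_growth_rate(t, x)
print(f"mean estimated mu = {mu.mean():.3f} 1/h")
```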
A Scientific Software Product Line for the Bioinformatics domain.
Costa, Gabriella Castro B; Braga, Regina; David, José Maria N; Campos, Fernanda
2015-08-01
Most specialized users (scientists) who use bioinformatics applications do not have suitable training in software development. Software Product Line (SPL) engineering employs the concept of reuse, in that an SPL is defined as a set of systems that are developed from a common set of base artifacts. In some contexts, such as bioinformatics applications, it is advantageous to develop a collection of related software products using an SPL approach. If software products are similar enough, it is possible to predict their commonalities and differences and then reuse the common features to support the development of new applications in the bioinformatics area. This paper presents the PL-Science approach, which combines SPL and ontologies in order to assist scientists in defining a scientific experiment and specifying a workflow that encompasses the bioinformatics applications of a given experiment. This paper also focuses on the use of ontologies to enable the use of Software Product Lines in biological domains. In the context of this paper, a Scientific Software Product Line (SSPL) differs from a Software Product Line in that the SSPL uses an abstract scientific workflow model. This workflow is defined according to a scientific domain, and using this abstract workflow model the products (scientific applications/algorithms) are instantiated. Through the use of ontology as a knowledge representation model, we can provide domain restrictions as well as add semantic aspects in order to facilitate the selection and organization of bioinformatics workflows in a Scientific Software Product Line. The use of ontologies enables not only the expression of formal restrictions but also inferences on these restrictions, considering that a scientific domain needs a formal specification. This paper presents the development of the PL-Science approach, encompassing a methodology and an infrastructure, and also presents an evaluation of the approach. The evaluation includes case studies in bioinformatics, which were conducted in two renowned research institutions in Brazil. Copyright © 2015 Elsevier Inc. All rights reserved.
Second Generation Product Line Engineering Takes Hold in the DoD
2014-01-01
Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Pittsburgh, PA: Software Engineering Institute ... software product line engineering and software architecture documentation and analysis. Clements is co-author of three practitioner-oriented books about ...
2016-01-06
of-breed software components and software product lines (SPLs) that are subject to different IP license and cybersecurity requirements. The ... commercially priced closed-source software components, to be used in the design, implementation, deployment, and evolution of open architecture (OA) ... breed software components and software product lines (SPLs) that are subject to different IP license and cybersecurity requirements. The Department ...
NASA Technical Reports Server (NTRS)
Pena, Joaquin; Hinchey, Michael G.; Ruiz-Cortes, Antonio
2006-01-01
The field of Software Product Lines (SPL) emphasizes building a core architecture for a family of software products from which concrete products can be derived rapidly. This helps to reduce time-to-market, costs, etc., and can result in improved software quality and safety. Current AOSE methodologies are concerned with developing a single Multiagent System. We propose an initial approach to developing the core architecture of a Multiagent Systems Product Line (MAS-PL), exemplifying our approach with reference to a concept NASA mission based on multiagent technology.
Software Product Lines: Report of the 2009 U.S. Army Software Product Line Workshop
2009-04-01
record system was fielded in 2008. One early challenge for Overwatch was coming up with a funding model that would support core asset development (a ... match the organizational model to the funding model. Product line architecture is essential. Address product line requirements up front. Put processes ... when trying to move from a customer-driven, product-specific funding model to one in which at least some of the funds are allocated to the creation and ...
Aspect-Oriented Model-Driven Software Product Line Engineering
NASA Astrophysics Data System (ADS)
Groher, Iris; Voelter, Markus
Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed with aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.
Assembly Line Efficiency Improvement by Using WITNESS Simulation Software
NASA Astrophysics Data System (ADS)
Yasir, A. S. H. M.; Mohamed, N. M. Z. N.
2018-03-01
In today's competitive world, the efficiency and productivity of the assembly line are essential to a manufacturing company. This paper demonstrates a study of the existing production line performance. The actual cycle time was observed and recorded during the working process. The current layout was designed and analysed using the Witness simulation software. The productivity and effectiveness of every single operator were measured to determine operator idle time and busy time. Two new alternative layouts were proposed and analysed using Witness simulation software to improve the performance of production activities. This research provided a valuable and better understanding of production effectiveness by adjusting the line balancing. After analysing the data, the simulation results from the current layout and the proposed plans were tabulated to compare the improvement in efficiency and productivity. The proposed design plan showed an increase in yield and productivity compared to the current arrangement. This research was carried out in company XYZ, one of the automotive premises in Pahang, Malaysia.
Formulation of a Production Strategy for a Software Product Line
2009-08-01
chooses to develop its products) as a series of scenarios • identifying the production factors critical to the success of the organization's ... line approach to achieve its business goals. AGM, a subsidiary of a multinational corporation, produces a series of software-intensive products deli ... days from time of request. The core assets mentioned in this example are available at http://www.sei.cmu.edu/productlines/ppl.
NASA Technical Reports Server (NTRS)
McComas, David; Stark, Michael; Leake, Stephen; White, Michael; Morisio, Maurizio; Travassos, Guilherme H.; Powers, Edward I. (Technical Monitor)
2000-01-01
The NASA Goddard Space Flight Center Flight Software Branch (FSB) is developing a Guidance, Navigation, and Control (GNC) Flight Software (FSW) product line. The demand for increasingly complex flight software in less time, while maintaining the same level of quality, has motivated us to look for better FSW development strategies. The GNC FSW product line has been planned to address the core GNC FSW functionality, which has been very similar across many low/near-Earth missions in the last ten years. Unfortunately, these missions have not achieved significant drops in development cost, since a systematic approach towards reuse has not been adopted. In addition, new demands are continually being placed upon the FSW, which means the FSB must become more adept at providing the core GNC FSW functionality so it can accommodate additional requirements. These domain features, together with engineering concepts, are influencing the specification, description and evaluation of the FSW product line. Domain engineering is the foundation for emerging product line software development approaches. A product line is 'a family of products designed to take advantage of their common aspects and predicted variabilities'. In our product line approach, domain engineering includes the engineering activities needed to produce reusable artifacts for a domain. Application engineering refers to developing an application in the domain starting from reusable artifacts. The focus of this paper is the software process, lessons learned, and how the GNC FSW product line manages variability. Existing domain engineering approaches do not enforce any specific notation for domain analysis or commonality and variability analysis. Usually, natural language text is the preferred tool. The advantage is the flexibility and adaptability of natural language. However, one also has to be ready to accept its well-known drawbacks, such as ambiguity, inconsistency, and contradictions. While most domain analysis approaches are functionally oriented, the idea of applying the object-oriented approach in domain analysis is not new. Some authors propose to use UML as the notation underlying domain analysis. Our work is based on the same idea of merging UML and domain analysis. Further, we propose a few extensions to UML in order to express variability, and we define their semantics precisely so that a tool can support them. The extensions are designed to be implemented on the API of a popular industrial CASE tool, with obvious advantages in cost and availability of tool support. The paper outlines the product line processes and identifies where variability must be addressed. Then it describes the product line products with respect to how they accommodate variability. The Celestial Body subdomain is used as a working example. Our results to date are summarized and plans for the future are described.
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
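The abstract does not reproduce the counting rules for object-oriented effort points, so the sketch below uses purely hypothetical weights per class, attribute, service, and message; it only illustrates the idea of counting effort points from a preliminary object-oriented analysis rather than from SLOC.

```python
from dataclasses import dataclass

# Hypothetical weights; the paper's actual counting rules are not reproduced here.
WEIGHTS = {"class": 4, "attribute": 1, "service": 2, "message": 2}

@dataclass
class AnalysisObject:
    name: str
    attributes: int = 0
    services: int = 0
    messages: int = 0

def effort_points(objects):
    """Sum weighted counts over the objects found in a preliminary OO analysis."""
    total = 0
    for obj in objects:
        total += WEIGHTS["class"]
        total += WEIGHTS["attribute"] * obj.attributes
        total += WEIGHTS["service"] * obj.services
        total += WEIGHTS["message"] * obj.messages
    return total

model = [AnalysisObject("Sensor", attributes=5, services=3, messages=4),
         AnalysisObject("Telemetry", attributes=8, services=6, messages=7)]
print(effort_points(model))
```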
Commonality and Variability Analysis for Xenon Family of Separation Virtual Machine Monitors (CVAX)
2017-07-18
technical approach is a systematic application of Software Product Line Engineering (SPLE). A systematic application requires describing the family and ... by the evolving open-source Xen hypervisor. The technical approach is a systematic application of Software Product Line Engineering (SPLE). A ...
Architecture-Based Unit Testing of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; McComas, David; Bartholomew, Maureen; Slegel, Steve; Medina, Barbara
2010-01-01
This paper presents an analysis of the unit testing approach developed and used by the Core Flight Software (CFS) product line team at the NASA GSFC. The goal of the analysis is to understand, review, and recommend strategies for improving the existing unit testing infrastructure as well as to capture lessons learned and best practices that can be used by other product line teams for their unit testing. The CFS unit testing framework is designed and implemented as a set of variation points, and thus testing support is built into the product line architecture. The analysis found that the CFS unit testing approach has many practical and good solutions that are worth considering when deciding how to design the testing architecture for a product line, which are documented in this paper along with some suggested improvements.
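The CFS framework itself is written in C and is not reproduced here; the following Python sketch only illustrates the general idea of a variation point used for testing: the component obtains its platform services through an injectable table, so a unit test can bind stub implementations at that point.

```python
import unittest

# A variation point: the component obtains its OS services through an
# injectable table rather than calling the platform directly, so unit tests
# can bind a stub implementation. This mirrors the idea, not the CFS code.
class Scheduler:
    def __init__(self, os_services):
        self.os = os_services

    def tick(self):
        now = self.os["get_time"]()
        if now % 10 == 0:
            self.os["send_event"]("SYNC")
            return True
        return False

class SchedulerTest(unittest.TestCase):
    def test_sync_event_emitted_on_multiple_of_ten(self):
        events = []
        stub_os = {"get_time": lambda: 20,
                   "send_event": events.append}  # stubs bound at the variation point
        self.assertTrue(Scheduler(stub_os).tick())
        self.assertEqual(events, ["SYNC"])

if __name__ == "__main__":
    unittest.main()
```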
Testing Product Generation in Software Product Lines Using Pairwise for Features Coverage
NASA Astrophysics Data System (ADS)
Pérez Lamancha, Beatriz; Polo Usaola, Macario
A Software Product Line (SPL) is "a set of software-intensive systems sharing a common, managed set of features that satisfy the specific needs of a particular market segment or mission and that are developed from a common set of core assets in a prescribed way". Variability is a central concept that permits the generation of different products of the family by reusing core assets. It is captured through features which, for an SPL, define its scope. Features are represented in a feature model, which is later used to generate the products from the line. From the testing point of view, testing all the possible combinations in feature models is not practical because: (1) the number of possible combinations (i.e., combinations of features for composing products) may be intractable, and (2) some combinations may contain incompatible features. Thus, this paper addresses the problem by implementing combinatorial testing techniques adapted to the SPL context.
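A minimal sketch of pairwise feature coverage in this spirit, assuming a small hypothetical feature model with one constraint (hd requires video): all achievable feature-value pairs are enumerated and a greedy loop picks product configurations until every pair is covered. Real SPL test generators work on far larger feature models and avoid full enumeration.

```python
from itertools import combinations, product

FEATURES = ["video", "audio", "hd", "encryption"]

# Hypothetical constraint: "hd" requires "video".
def is_valid(cfg):
    return not (cfg["hd"] and not cfg["video"])

def uncovered_pairs(cfg, pending):
    """Pairs (f1, v1, f2, v2) realised by cfg that are still uncovered."""
    return {p for p in pending if cfg[p[0]] == p[1] and cfg[p[2]] == p[3]}

# All pairwise value combinations to cover.
pending = {(f1, v1, f2, v2)
           for f1, f2 in combinations(FEATURES, 2)
           for v1 in (True, False) for v2 in (True, False)}
# Drop pairs impossible under the constraint (hd=True together with video=False).
pending = {p for p in pending
           if not (("hd", True) in ((p[0], p[1]), (p[2], p[3]))
                   and ("video", False) in ((p[0], p[1]), (p[2], p[3])))}

valid_cfgs = [dict(zip(FEATURES, vals))
              for vals in product((True, False), repeat=len(FEATURES))
              if is_valid(dict(zip(FEATURES, vals)))]

suite = []
while pending:
    best = max(valid_cfgs, key=lambda c: len(uncovered_pairs(c, pending)))
    gained = uncovered_pairs(best, pending)
    if not gained:
        break  # remaining pairs are unreachable
    suite.append(best)
    pending -= gained

for cfg in suite:          # the generated product configurations to test
    print(cfg)
```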
IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009
2011-03-01
capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to ...
Calculation and use of an environment's characteristic software metric set
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
Since both cost/quality and production environments differ, this study presents an approach for customizing a characteristic set of software metrics to an environment. The approach is applied in the Software Engineering Laboratory (SEL), a NASA Goddard production environment, to 49 candidate process and product metrics of 652 modules from six (51,000 to 112,000 lines) projects. For this particular environment, the method yielded the characteristic metric set (source lines, fault correction effort per executable statement, design effort, code effort, number of I/O parameters, number of versions). The uses examined for a characteristic metric set include forecasting the effort for development, modification, and fault correction of modules based on historical data.
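The SEL study derives its characteristic metric set from real project data and a more elaborate analysis; the sketch below only illustrates the flavor of the idea with synthetic data, ranking candidate metrics by their correlation with fault-correction effort and keeping the strongest few.

```python
import numpy as np

rng = np.random.default_rng(0)
n_modules = 60
# Hypothetical per-module data; the SEL study used real project measurements.
effort = rng.gamma(shape=2.0, scale=10.0, size=n_modules)
candidates = {
    "source_lines":  effort * 12 + rng.normal(0, 40, n_modules),
    "design_effort": effort * 0.5 + rng.normal(0, 4, n_modules),
    "io_parameters": rng.poisson(5, n_modules).astype(float),
    "num_versions":  effort * 0.1 + rng.normal(0, 2, n_modules),
}

def characteristic_set(metrics, target, k=3):
    """Rank candidate metrics by |correlation| with the target and keep the top k."""
    ranked = sorted(metrics,
                    key=lambda m: abs(np.corrcoef(metrics[m], target)[0, 1]),
                    reverse=True)
    return ranked[:k]

print(characteristic_set(candidates, effort))
```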
NASA Astrophysics Data System (ADS)
de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.
2014-10-01
Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present research characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.
Introduction to Software Product Line Adoption
2005-09-01
plans • improvement suggestions • risks and mitigation strategies • progress reports • adoption plan • funding ... model • organization chart • product line concept of operations (CONOPS) • marketing plan • product proposals • acquisition strategy • organization risk
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identify architecturally significant deviations that eluded code reviews, b) clarify the design rules to the team, and c) assess the overall implementation quality. Furthermore, it helps connect business goals to architectural principles and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
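The actual CFS design rules come from the developer's guide and are not reproduced here; the following sketch shows consistency checking of the same general kind, using a hypothetical layering rule checked against an extracted module-dependency map.

```python
# Hypothetical layering rule in the spirit of the consistency checks described
# above: application modules may depend on library and OS-abstraction layers,
# but libraries must never depend back on applications.
ALLOWED = {
    "app":  {"app", "lib", "osal"},
    "lib":  {"lib", "osal"},
    "osal": {"osal"},
}

# Hypothetical extracted dependencies: module -> (layer, modules it uses)
MODULES = {
    "sch_app":   ("app",  ["time_lib", "osal_core"]),
    "time_lib":  ("lib",  ["osal_core"]),
    "osal_core": ("osal", []),
    "bad_lib":   ("lib",  ["sch_app"]),   # violates the rule
}

def check_layering(modules, allowed):
    violations = []
    for name, (layer, uses) in modules.items():
        for dep in uses:
            dep_layer = modules[dep][0]
            if dep_layer not in allowed[layer]:
                violations.append((name, dep, layer, dep_layer))
    return violations

for v in check_layering(MODULES, ALLOWED):
    print("rule violation: %s (%s) -> %s (%s)" % v)
```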
Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study
NASA Astrophysics Data System (ADS)
Sarang, Nita; Sanglikar, Mukund A.
Project management decisions are the primary basis for project success (or failure). Mostly, such decisions are based on an intuitive understanding of the underlying software engineering and management process and are therefore prone to misjudgment. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and supports project managers in assessing the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules for software evolution, planning and management.
Type Safe Extensible Programming
NASA Astrophysics Data System (ADS)
Chae, Wonseok
2009-10-01
Software products evolve over time. Sometimes they evolve by adding new features, and sometimes by either fixing bugs or replacing outdated implementations with new ones. When software engineers fail to anticipate such evolution during development, they will eventually be forced to re-architect or re-build from scratch. Therefore, it has been common practice to prepare for changes so that software products are extensible over their lifetimes. However, making software extensible is challenging because it is difficult to anticipate successive changes and to provide adequate abstraction mechanisms over potential changes. Such extensibility mechanisms, furthermore, should not compromise any existing functionality during extension. Software engineers would benefit from a tool that provides a way to add extensions in a reliable way. It is natural to expect programming languages to serve this role. Extensible programming is one effort to address these issues. In this thesis, we present type safe extensible programming using the MLPolyR language. MLPolyR is an ML-like functional language whose type system provides type-safe extensibility mechanisms at several levels. After presenting the language, we will show how these extensibility mechanisms can be put to good use in the context of product line engineering. Product line engineering is an emerging software engineering paradigm that aims to manage variations, which originate from successive changes in software.
[Simulation and Design of Infant Incubator Assembly Line].
Ke, Huqi; Hu, Xiaoyong; Ge, Xia; Hu, Yanhai; Chen, Zaihong
2015-11-01
Based on the current assembly situation of the infant incubator at company A, basic industrial engineering methods such as time study were used to analyze the actual product assembly process, and an assembly line was designed. The assembly line was modeled and simulated with the Flexsim software. The problems of the assembly line were identified by comparing simulation results with actual data, and the line was then optimized to obtain a high-efficiency assembly line.
NASA Technical Reports Server (NTRS)
Gaffney, J. E., Jr.; Judge, R. W.
1981-01-01
A model of a software development process is described. The software development process is seen to consist of a sequence of activities, such as 'program design' and 'module development' (or coding). A manpower estimate is made by multiplying code size by the rates (man months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing up the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values for each of nine principal software development activities was assessed. Four factors were identified which account for 39% of the observed productivity variation.
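A minimal sketch of the estimation formula described above (effort = code size multiplied by an activity rate, summed over the relevant activities), using hypothetical rates; the paper derives its rates empirically and adjusts them for the four factors it lists.

```python
# Hypothetical productivity rates (person-months per KSLOC) per activity;
# the paper's rates are derived from observed project data and adjusted by
# organization, product type, computer type, and code type.
RATES_PM_PER_KSLOC = {
    "requirements": 0.4,
    "program_design": 0.8,
    "module_development": 2.0,
    "integration_test": 1.2,
    "documentation": 0.6,
}

def manpower_estimate(ksloc, rates, activities=None):
    """Effort = code size (KSLOC) x rate, summed over the relevant activities."""
    activities = activities or rates.keys()
    return sum(ksloc * rates[a] for a in activities)

print(f"{manpower_estimate(35.0, RATES_PM_PER_KSLOC):.1f} person-months")
```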
NASA Technical Reports Server (NTRS)
2003-01-01
When NASA needed a real-time, online database system capable of tracking documentation changes in its propulsion test facilities, engineers at Stennis Space Center joined with ECT International, of Brookfield, Wisconsin, to create a solution. Through NASA's Dual-Use Program, ECT developed Exdata, a software program that works within the company's existing Promise software. Exdata not only satisfied NASA's requirements, but also expanded ECT's commercial product line. Promise, ECT's primary product, is an intelligent software program with specialized functions for designing and documenting electrical control systems. An add-on to AutoCAD software, Promise generates control system schematics, panel layouts, bills of material, wire lists, and terminal plans. The drawing functions include symbol libraries, macros, and automatic line breaking. Primary Promise customers include manufacturing companies, utilities, and other organizations with complex processes to control.
Agile Methods: Selected DoD Management and Acquisition Concerns
2011-10-01
SIDRE Software Intensive Innovative Development and Reengineering/Evolution; SLIM Software Lifecycle Management-Estimate; SLOC source lines of code ... ISBN #0321502752, Coaching Agile Teams, Lyssa Adkins, ISBN #0321637704, Agile Project Management: Creating Innovative Products, Second Edition, Jim ... Accessed July 13, 2011. [Highsmith 2009] Highsmith, J. Agile Project Management: Creating Innovative Products, 2nd ed. Addison-Wesley, 2009
The software product assurance metrics study: JPL's software systems quality and productivity
NASA Technical Reports Server (NTRS)
Bush, Marilyn W.
1989-01-01
The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.
1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-01
Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.
Lessons from 30 Years of Flight Software
NASA Technical Reports Server (NTRS)
McComas, David C.
2015-01-01
This presentation takes a brief historical look at flight software over the past 30 years, extracts lessons learned and shows how many of the lessons learned are embodied in the Flight Software product line called the core Flight System (cFS). It also captures the lessons learned from developing and applying the cFS.
A Study of Integrated Instruction for Flexible Manufacturing Systems.
ERIC Educational Resources Information Center
Fang, Rong-Jyue; Kuo, Shin-Gia
A study was undertaken to develop hardware and software to help students learn the operation of real production lines and imitate them without disturbing the actual working of the production line. The study also identified major research topics according to the list of eight major technologies targeted by the Taiwanese government and considered…
10th Annual CMMI Technology Conference and User Group Tutorial Session
2010-11-15
Reuse That Pays Off: Software Product Lines. BUSINESS GOALS / APPLICATION DOMAIN / ARCHITECTURE / COMPONENTS and SERVICES ... PRODUCT LINES = STRATEGIC REUSE ... product component; the quality attribute can sometimes be partitioned for unique allocation to each product component as a derived ...
The Need for V&V in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to the entire domain or product line rather than to critical processing alone, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini
2018-08-01
Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and mitigate CO 2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT), with high-selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper will also include a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
Tailoring a software production environment for a large project
NASA Technical Reports Server (NTRS)
Levine, D. R.
1984-01-01
A software production environment was constructed to meet the specific goals of a particular large programming project. These goals, the specific solutions as implemented, and the experiences on a project of over 100,000 lines of source code are discussed. The base development environment for this project was an ordinary PWB Unix (tm) system. Several important aspects of the development process required support not available in the existing tool set.
Time cycle analysis and simulation of material flow in MOX process layout
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chakraborty, S.; Saraswat, A.; Danny, K.M.
The (U,Pu)O2 MOX fuel is the driver fuel for the upcoming PFBR (Prototype Fast Breeder Reactor). The fuel contains around 30% PuO2. The presence of a high percentage of reprocessed PuO2 necessitates the design of an optimized fuel fabrication process line that addresses both production needs and regulatory norms regarding radiological safety criteria. The powder-pellet route has a highly unbalanced time cycle. This difficulty can be overcome by optimizing the process layout in terms of equipment redundancy and scheduling of input powder batches. Different schemes are tested before implementation in the process line with the help of a software tool. This software simulates the material movement through the optimized process layout. Different material processing schemes have been devised and the validity of the schemes is tested with the software. Schemes in which production batches meet at any glove box location are considered invalid. A valid scheme ensures adequate spacing between the production batches and at the same time meets the production target. This software can be further improved by accurately calculating material movement time through the glove box train. One important factor is considering material handling time with automation systems in place.
Global Situational Awareness with Free Tools
2015-01-15
Client Technical Solutions • Software Engineering Measurement and Analysis • Architecture Practices • Product Line Practice • Team Software Process ... multiple data sources • Snort (Snorby on Security Onion) • Nagios • SharePoint RSS • Flow • Others • Leverage standard data formats • Keyhole Markup Language
Security Requirements Management in Software Product Line Engineering
NASA Astrophysics Data System (ADS)
Mellado, Daniel; Fernández-Medina, Eduardo; Piattini, Mario
Security requirements engineering is both a central task and a critical success factor in product line development due to the complexity and extensive nature of product lines. However, most of the current product line practices in requirements engineering do not adequately address security requirements engineering. Therefore, in this chapter we will propose a security requirements engineering process (SREPPLine) driven by security standards and based on a security requirements decision model along with a security variability model to manage the variability of the artefacts related to security requirements. The aim of this approach is to deal with security requirements from the early stages of the product line development in a systematic way, in order to facilitate conformance with the most relevant security standards with regard to the management of security requirements, such as ISO/IEC 27001 and ISO/IEC 15408.
Technology for Manufacturing Efficiency
NASA Technical Reports Server (NTRS)
1995-01-01
The Ground Processing Scheduling System (GPSS) was developed by Ames Research Center, Kennedy Space Center and divisions of the Lockheed Company to maintain the scheduling for preparing a Space Shuttle Orbiter for a mission. Red Pepper Software Company, now part of PeopleSoft, Inc., commercialized the software as their ResponseAgent product line. The software enables users to monitor manufacturing variables, report issues and develop solutions to existing problems.
NASA Technical Reports Server (NTRS)
Steib, Michael
1991-01-01
The APD software features include on-line help; a three-level architecture (logic environments, setup/application environment, data environment); an explanation capability; and file handling. The kinds of experimentation and record keeping that lead to effective expert systems are facilitated by: (1) a library of inferencing modules (in the logic environment); (2) an explanation capability which reveals logic strategies to users; (3) automated file naming conventions; (4) an information retrieval system; and (5) on-line help. These aid effective use of knowledge, debugging and experimentation. Since the APD software anticipates the logical rules becoming complicated, it is embedded in a production system language (CLIPS) to ensure the full power of the production system paradigm of CLIPS and the availability of the procedural language C. The development of the APD software is discussed, along with three example applications: a toy application, an experimental application, and an operational prototype for submarine maintenance predictions.
Performing Verification and Validation in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1999-01-01
The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.
Productivity improvement using discrete events simulation
NASA Astrophysics Data System (ADS)
Hazza, M. H. F. Al; Elbishari, E. M. Y.; Ismail, M. Y. Bin; Adesta, E. Y. T.; Rahman, Nur Salihah Binti Abdul
2018-01-01
The increasing complexity of manufacturing systems has increased the cost of investment in many industries. Furthermore, theoretical feasibility studies alone are not enough to support investment decisions in a particular area. Therefore, the development of new advanced software protects the manufacturer from investing money in production lines that may not be sufficient and effective with respect to machine utilization and productivity requirements. Conducting a simulation using an accurate model reduces and eliminates the risk associated with a new investment. The aim of this research is to prove and highlight the importance of simulation in the decision-making process. Delmia Quest software was used as a simulation program to run a simulation of the production line. A simulation was first done for the existing production line and showed that the estimated production rate is 261 units/day. The results were analysed based on utilization percentage and idle time. Two different scenarios were proposed based on different objectives. The first scenario focuses on low-utilization machines and their idle time; it resulted in reducing the number of machines used by three, along with the workers who maintain them, without affecting the production rate. The second scenario increases the production rate by upgrading the curing machine, which led to an increase in daily productivity of 7%, from 261 units to 281 units.
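Delmia Quest models are not shown here; the sketch below is a generic, much-simplified serial-line simulation in Python that produces the same kinds of outputs discussed in the abstract (units per shift and per-station utilization) from hypothetical cycle times.

```python
import random

def simulate_line(cycle_means, shift_minutes=480.0, seed=1):
    """Minimal serial-line simulation: each unit visits the stations in order,
    starting at a station as soon as both the unit and the station are ready.
    Service times are exponential around the given mean cycle times (minutes)."""
    random.seed(seed)
    free_at = [0.0] * len(cycle_means)   # time at which each station becomes free
    busy = [0.0] * len(cycle_means)      # accumulated processing time per station
    completed, ready = 0, 0.0            # 'ready' = earliest start for the next unit
    while ready <= shift_minutes:
        t = ready
        for i, mean in enumerate(cycle_means):
            start = max(t, free_at[i])
            service = random.expovariate(1.0 / mean)
            free_at[i] = start + service
            busy[i] += service
            t = free_at[i]
        if t <= shift_minutes:
            completed += 1
        ready = free_at[0]               # next unit can enter once station 1 is free
    utilisation = [min(b / shift_minutes, 1.0) for b in busy]
    return completed, utilisation

units, util = simulate_line([3.0, 4.5, 2.5])
print("units per shift:", units)
print("station utilisation:", [round(u, 2) for u in util])
```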
Quality Assurance in the Presence of Variability
NASA Astrophysics Data System (ADS)
Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus
Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of those strategies, the so called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique that we have developed to implement the comprehensive strategy that addresses this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
Comparing Acquisition Strategies: Open Architecture versus Product Lines
2010-04-30
software • New SOW language for accepting software deliveries – Enables third-party reuse • Additional SOW language regarding conducting software code walkthroughs and for using integrated development environments ... change the business environment must be the primary factor that drives the technical approach. Accordingly, there are business case decisions to be ... elements of a system design should be made available to the customer to observe throughout the design process. Electronic access to the design environment
1985-11-01
McAuto) Transaction Manager Subsystem during the 1984/1985 period. On-Line Software: Responsible for programming the International (OSI) Communications ... Network Transaction Manager (NTM) in the 1981/1984 period. Software Performance Engineering (SPE): Responsible for directing the work on performance ... computer software. Contained herein are theoretical and/or SCAN Project priority references that in no way reflect Air Force-owned or -developed ...
Video library for video imaging detection at intersection stop lines.
DOT National Transportation Integrated Search
2010-04-01
The objective of this activity was to record video that could be used for controlled evaluation of video image vehicle detection system (VIVDS) products and software upgrades to existing products based on a list of conditions that might be diffic...
Di Benedetto, Raffaele; Fanti, Michele
2012-01-01
This paper presents an integrated approach to Line Balancing and Risk Assessment, together with a software tool named ErgoAnalysis that makes it easy to control the whole production process and produces a Risk Index for the actual work tasks in an Assembly Line. Assembly Line Balancing, or simply Line Balancing, is the problem of assigning operations to workstations along an assembly line in such a way that the assignment is optimal in some sense. Assembly lines are characterized by production constraints and restrictions due to several aspects, such as the nature of the product and the flow of orders. To be able to respond effectively to the needs of production, companies need to change the workload and production models frequently. Each manufacturing process may be quite different from another. To optimize very specific operations, assembly line balancing may draw on a number of methods, and the engineer must consider ergonomic constraints in order to reduce the risk of WMSDs. Risk Assessment can become very expensive because the engineer must repeat the evaluation at every change. ErgoAnalysis can reduce cost and improve effectiveness in Risk Assessment during Line Balancing.
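ErgoAnalysis itself is not described in enough detail to reproduce, so the sketch below shows only the line-balancing half of the problem with a largest-candidate-rule heuristic: tasks with hypothetical times are packed into workstations without exceeding the takt time; precedence and ergonomic constraints are deliberately omitted.

```python
# Largest-candidate-rule sketch of assembly line balancing: assign tasks to
# workstations without exceeding the takt (cycle) time. Task times and the
# takt time are hypothetical.
TASKS = {"T1": 40, "T2": 30, "T3": 25, "T4": 20, "T5": 15, "T6": 10}  # seconds
TAKT = 60.0

def balance(tasks, takt):
    stations, remaining = [], sorted(tasks, key=tasks.get, reverse=True)
    while remaining:
        load, assigned = 0.0, []
        for t in list(remaining):
            if load + tasks[t] <= takt:
                assigned.append(t)
                load += tasks[t]
                remaining.remove(t)
        stations.append((assigned, load))
    return stations

for idx, (assigned, load) in enumerate(balance(TASKS, TAKT), start=1):
    print(f"station {idx}: {assigned} load={load:.0f}s ({load / TAKT:.0%})")
```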
Comprehensive analysis of NMR data using advanced line shape fitting.
Niklasson, Markus; Otten, Renee; Ahlner, Alexandra; Andresen, Cecilia; Schlagnitweit, Judith; Petzold, Katja; Lundström, Patrik
2017-10-01
NMR spectroscopy is uniquely suited for atomic resolution studies of biomolecules such as proteins, nucleic acids and metabolites, since detailed information on structure and dynamics are encoded in positions and line shapes of peaks in NMR spectra. Unfortunately, accurate determination of these parameters is often complicated and time consuming, in part due to the need for different software at the various analysis steps and for validating the results. Here, we present an integrated, cross-platform and open-source software that is significantly more versatile than the typical line shape fitting application. The software is a completely redesigned version of PINT ( https://pint-nmr.github.io/PINT/ ). It features a graphical user interface and includes functionality for peak picking, editing of peak lists and line shape fitting. In addition, the obtained peak intensities can be used directly to extract, for instance, relaxation rates, heteronuclear NOE values and exchange parameters. In contrast to most available software the entire process from spectral visualization to preparation of publication-ready figures is done solely using PINT and often within minutes, thereby, increasing productivity for users of all experience levels. Unique to the software are also the outstanding tools for evaluating the quality of the fitting results and extensive, but easy-to-use, customization of the fitting protocol and graphical output. In this communication, we describe the features of the new version of PINT and benchmark its performance.
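PINT's own fitting models are not reproduced here; as a generic illustration of the kind of line shape fitting the software automates, the sketch below fits a single Lorentzian peak to a synthetic 1D slice with scipy's curve_fit and reports the fitted position and width with their uncertainties.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, fwhm, baseline):
    """1D Lorentzian line shape commonly used for NMR peaks."""
    hwhm = fwhm / 2.0
    return amp * hwhm**2 / ((x - x0) ** 2 + hwhm**2) + baseline

# Hypothetical 1D slice through a peak (ppm axis) with noise.
x = np.linspace(7.0, 9.0, 200)
true = lorentzian(x, amp=1.0, x0=8.1, fwhm=0.05, baseline=0.02)
rng = np.random.default_rng(7)
y = true + rng.normal(0.0, 0.01, x.size)

p0 = [y.max(), x[np.argmax(y)], 0.1, 0.0]     # rough initial guess
popt, pcov = curve_fit(lorentzian, x, y, p0=p0)
perr = np.sqrt(np.diag(pcov))                  # 1-sigma uncertainties
print("position = %.4f +/- %.4f ppm, FWHM = %.4f ppm" % (popt[1], perr[1], popt[2]))
```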
Variability extraction and modeling for product variants.
Linsbauer, Lukas; Lopez-Herrejon, Roberto Erick; Egyed, Alexander
2017-01-01
Fast-changing hardware and software technologies in addition to larger and more specialized customer bases demand software tailored to meet very diverse requirements. Software development approaches that aim at capturing this diversity on a single consolidated platform often require large upfront investments, e.g., time or budget. Alternatively, companies resort to developing one variant of a software product at a time by reusing as much as possible from already-existing product variants. However, identifying and extracting the parts to reuse is an error-prone and inefficient task compounded by the typically large number of product variants. Hence, more disciplined and systematic approaches are needed to cope with the complexity of developing and maintaining sets of product variants. Such approaches require detailed information about the product variants, the features they provide and their relations. In this paper, we present an approach to extract such variability information from product variants. It identifies traces from features and feature interactions to their implementation artifacts, and computes their dependencies. This work can be useful in many scenarios ranging from ad hoc development approaches such as clone-and-own to systematic reuse approaches such as software product lines. We applied our variability extraction approach to six case studies and provide a detailed evaluation. The results show that the extracted variability information is consistent with the variability in our six case study systems given by their variability models and available product variants.
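A minimal sketch of the diffing idea behind such variability extraction, on a hypothetical set of three product variants: the trace of a feature is approximated as the artifacts common to all products containing it minus the artifacts of products without it. The interaction artifact (mix.c) stays untraced under this naive rule, which is exactly the kind of case the paper's approach handles with explicit feature-interaction traces.

```python
# Products described by their selected features and their implementation artifacts.
PRODUCTS = {
    "P1": ({"base", "video"},          {"core.c", "video.c"}),
    "P2": ({"base", "audio"},          {"core.c", "audio.c"}),
    "P3": ({"base", "video", "audio"}, {"core.c", "video.c", "audio.c", "mix.c"}),
}

def feature_traces(products):
    features = set().union(*(f for f, _ in products.values()))
    traces = {}
    for feat in features:
        with_f = [a for f, a in products.values() if feat in f]
        without_f = [a for f, a in products.values() if feat not in f]
        common = set.intersection(*with_f)
        others = set().union(*without_f) if without_f else set()
        traces[feat] = common - others   # artifacts unique to products with feat
    return traces

for feat, artifacts in sorted(feature_traces(PRODUCTS).items()):
    print(f"{feat:6s} -> {sorted(artifacts)}")
```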
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' major space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.
2007-12-01
price dispersion at least as large as the dispersion for traditional retailers for books, music CDs, and software offered through 52 Internet and ... dispersion differences. For instance, for 22 old-hit albums, average price percentage differences are 31% on-line, compared to 11% off-line. But for 21 ... current-hit albums, differences are smaller at 18% on-line and 19% off-line. This suggests price dispersion levels are related to product
Meeting the Challenge of Distributed Real-Time & Embedded (DRE) Systems
2012-05-10
IP, RTOS, middleware, middleware services, DRE applications, operating systems & protocols, hardware & networks ... COTS & standards-based middleware, language, OS, network, & hardware platforms • Real-time CORBA (TAO) middleware • ADAPTIVE Communication ... (SPLs): F-15 product variant, AV-8B product variant, F/A-18 product variant, UCAV product variant; software product-line hardware (CPU, memory, I/O), OS
Detecting Inconsistencies in Multi-View Models with Variability
NASA Astrophysics Data System (ADS)
Lopez-Herrejon, Roberto Erick; Egyed, Alexander
Multi-View Modeling (MVM) is a common modeling practice that advocates the use of multiple, different and yet related models to represent the needs of diverse stakeholders. Of crucial importance in MVM is consistency checking - the description and verification of semantic relationships amongst the views. Variability is the capacity of software artifacts to vary, and its effective management is a core tenet of the research in Software Product Lines (SPL). MVM has proven useful for developing one-of-a-kind systems; however, to reap the potential benefits of MVM in SPL it is vital to provide consistency checking mechanisms that cope with variability. In this paper we describe how to address this need by applying Safe Composition - the guarantee that all programs of a product line are type safe. We evaluate our approach with a case study.
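To make the idea of safe composition concrete, the following minimal sketch (not the authors' tooling; the feature names, constraints, and provides/requires maps are all invented) brute-forces every configuration of a toy feature model and checks that each valid product only references program elements that some selected feature provides:

```python
# Minimal sketch of a "safe composition" style check over a toy feature model.
# Feature names, constraints, and provides/requires maps are hypothetical.
from itertools import product

FEATURES = ["Base", "Reporting", "Export"]

def valid(cfg):
    # Toy feature-model constraints: Base is mandatory, Export requires Reporting.
    return cfg["Base"] and (not cfg["Export"] or cfg["Reporting"])

# Which program elements each feature provides / references (hypothetical).
PROVIDES = {"Base": {"Core"}, "Reporting": {"Report"}, "Export": set()}
REQUIRES = {"Base": set(), "Reporting": {"Core"}, "Export": {"Report"}}

def safe(cfg):
    # Every element referenced by a selected feature must be provided by some
    # selected feature (a crude stand-in for type safety of the composed product).
    selected = [f for f in FEATURES if cfg[f]]
    provided = set().union(*(PROVIDES[f] for f in selected))
    required = set().union(*(REQUIRES[f] for f in selected))
    return required <= provided

all_configs = (dict(zip(FEATURES, bits))
               for bits in product([False, True], repeat=len(FEATURES)))
violations = [cfg for cfg in all_configs if valid(cfg) and not safe(cfg)]
print("safe composition holds" if not violations else violations)
```

Real SPL consistency checkers typically encode this as a satisfiability problem rather than enumerating configurations, but the property being checked is the same.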
LAMPAT and LAMPATNL User’s Manual
2012-09-01
nonlinearity. These tools are implemented as subroutines in the finite element software ABAQUS. This user's manual provides information on the proper...model either through the General tab of the Edit Job dialog box in Abaqus/CAE or the command line with user=(subroutine filename). Table 1...Selection of software product and subroutine. Static Analysis With Abaqus/Standard, Dynamic Analysis With Abaqus/Explicit, Linear, uncoupled
The new meaning of quality in the information age.
Prahalad, C K; Krishnan, M S
1999-01-01
Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.
A Comparison of Product Realization Frameworks
1993-10-01
software (integrated FrameMaker). Also included are BOLD for on-line documentation delivery, printer/plotter support, and network licensing support. AMPLE...are built with DSS. Documentation tools include an on-line information system (BOLD), text editing (Notepad), word processing (integrated FrameMaker)...within an application. FrameMaker is fully integrated with the Falcon Framework to provide consistent documentation capabilities within engineering
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI Capability Maturity Model, a staged process improvement based upon assessment with regard to a set of key process areas until you reach level 5, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value added' activities and the elimination or reduction of 'non-value-added' activities.
2008-07-01
cycle Evolution of a system, product, service, project or other human-made entity from conception through retirement [ISO 12207]. Logical line of...012 [ISO 1995] International Organization for Standardization. ISO/IEC 12207:1995—Information technology—Software life cycle processes. http...definitions, authors were asked to use or align with already existing standards such as those available through ISO and IEEE when possible. Literature
Cargo Movement Operations System (CMOS) Draft Software Product Specification, Increment I
1990-12-13
COMMENT DISPOSITION: ACCEPT [ ] REJECT [ ]. COMMENT STATUS: OPEN [ ] CLOSED [ ]. Comments (comment no., page, paragraph): 1. (p. 2, para. 1.3) Delete "Software Product Specification" from lines 4 & 5 of the first paragraph. 2. (p. 11, para. 3.4) Change "Table 3.4" to "Appendix E". 3. (App. B, all) Change the term "Deskview" used to describe the development language to "DESQview". DATA ITEM DISCREPANCY WORKSHEET: ORIGINATOR CONTROL NUMBER: SPS1-0002; CDRL NUMBER: A014-02; DATE: 12/13/90.
The advanced software development workstation project
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Pitman, Charles L.
1991-01-01
The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.
Dynamic bottleneck elimination in mattress manufacturing line using theory of constraints.
Gundogar, Emin; Sari, Murat; Kokcam, Abdullah H
2016-01-01
Competition in the furniture sector, as in other sectors, is tough. Along with the varying product range, the production system should also be renewed on a regular basis and the production costs should be kept under control. In this study, the spring mattress manufacturing line of a furniture manufacturing company is analyzed. The company wants to increase its production output with new investments. The objective is to find the bottlenecks in the production line in order to balance the semi-finished material flow. These bottlenecks are investigated and several different scenarios are tested to improve the current manufacturing system. The problem, whose main theme is the elimination of the bottleneck, is solved using Goldratt and Cox's theory of constraints with a simulation-based heuristic method. Near-optimal alternatives are determined by system models built in Arena 13.5 simulation software. Results show that capacity enhancements of approximately 46%, together with two buffer stocks, increased average production by 88.8%.
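As a rough illustration of the bottleneck idea behind the theory of constraints (the station names and cycle times below are hypothetical, not the paper's data), in a serial line the slowest station bounds the throughput of the whole line:

```python
# Minimal illustrative sketch: the station with the longest cycle time is the
# bottleneck and limits line throughput. All values are invented.
cycle_times_min = {            # minutes per mattress at each station (hypothetical)
    "spring_assembly": 4.2,
    "foam_lamination": 3.1,
    "quilting": 5.0,
    "border_closing": 3.8,
}

bottleneck = max(cycle_times_min, key=cycle_times_min.get)
throughput_per_hour = 60.0 / cycle_times_min[bottleneck]
print(f"bottleneck: {bottleneck}, line throughput approx {throughput_per_hour:.1f} units/h")
```

Quantifying the additional effect of buffer stocks requires the kind of discrete-event simulation the paper describes; this sketch only identifies the constraint.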
CubeIndexer: Indexer for regions of interest in data cubes
NASA Astrophysics Data System (ADS)
Chilean Virtual Observatory; Araya, Mauricio; Candia, Gabriel; Gregorio, Rodrigo; Mendoza, Marcelo; Solar, Mauricio
2015-12-01
CubeIndexer indexes regions of interest (ROIs) in data cubes reducing the necessary storage space. The software can process data cubes containing megabytes of data in fractions of a second without human supervision, thus allowing it to be incorporated into a production line for displaying objects in a virtual observatory. The software forms part of the Chilean Virtual Observatory (ChiVO) and provides the capability of content-based searches on data cubes to the astronomical community.
Command Center Library Model Document. Comprehensive Approach to Reusable Defense Software (CARDS)
1992-05-31
system, and functionality for specifying the layout of the document. 3.7.16.1 FrameMaker. FrameMaker is a Commercial Off The Shelf (COTS) component...facilitating WYSIWYG creation of formatted reports with embedded graphics. FrameMaker is an advanced publishing tool that integrates word processing...available for the component FrameMaker: • Product evaluation reports in ASCII and PostScript formats • Product assessment on line in model • Product
NASA Astrophysics Data System (ADS)
Jaffrey, V.; Mohamed, N. M. Z. N.; Rose, A. N. M.
2017-10-01
In almost all manufacturing industries, increased productivity and better efficiency of the production line are the most important goals. Most factories, especially small-scale ones, have little awareness or knowledge of manufacturing system optimization and rely on traditional ways of management. Problems commonly identified in such factories are high labour idle time and low production output. This study is done in a Small and Medium Enterprises (SME) low-volume production company. Data are collected and problems affecting productivity and efficiency are identified. In this study, Witness simulation software is used to simulate the layout, and the output focuses on the improvement of the layout in terms of productivity and efficiency. The layout is rearranged by reducing the travel time from one workstation to another. Then, the improved layout is modelled and the machine and labour statistics of both the original and improved layouts are taken. Productivity and efficiency are calculated for both layouts and then compared.
Prospective comparison of speckle tracking longitudinal bidimensional strain between two vendors.
Castel, Anne-Laure; Szymanski, Catherine; Delelis, François; Levy, Franck; Menet, Aymeric; Mailliet, Amandine; Marotte, Nathalie; Graux, Pierre; Tribouilloy, Christophe; Maréchaux, Sylvestre
2014-02-01
Speckle tracking is a relatively new, largely angle-independent technique used for the evaluation of myocardial longitudinal strain (LS). However, significant differences have been reported between LS values obtained by speckle tracking with the first generation of software products. To compare LS values obtained with the most recently released equipment from two manufacturers. Systematic scanning with head-to-head acquisition with no modification of the patient's position was performed in 64 patients with equipment from two different manufacturers, with subsequent off-line post-processing for speckle tracking LS assessment (Philips QLAB 9.0 and General Electric [GE] EchoPAC BT12). The interobserver variability of each software product was tested on a randomly selected set of 20 echocardiograms from the study population. GE and Philips interobserver coefficients of variation (CVs) for global LS (GLS) were 6.63% and 5.87%, respectively, indicating good reproducibility. Reproducibility was very variable for regional and segmental LS values, with CVs ranging from 7.58% to 49.21% with both software products. The concordance correlation coefficient (CCC) between GLS values was high at 0.95, indicating substantial agreement between the two methods. While good agreement was observed between midwall and apical regional strains with the two software products, basal regional strains were poorly correlated. The agreement between the two software products at a segmental level was very variable; the highest correlation was obtained for the apical cap (CCC 0.90) and the poorest for basal segments (CCC range 0.31-0.56). A high level of agreement and reproducibility for global but not for basal regional or segmental LS was found with two vendor-dependent software products. This finding may help to reinforce clinical acceptance of GLS in everyday clinical practice. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
1992-12-01
V.3.3. Coefficient of Determination ... 37. V.3.4. F-Ratio ... 37. V.3.5... Instructions are defined as lines of code or card images. Thus, a line containing two or more source statements counts as one instruction; a...understand the productivity paradox, recall the concept of virtual machines. When a higher-level machine groups together many instructions of a lower level
A Software Product Line Process to Develop Agents for the IoT.
Ayala, Inmaculada; Amor, Mercedes; Fuentes, Lidia; Troya, José M
2015-07-01
One of the most important challenges of this decade is the Internet of Things (IoT), which aims to enable things to be connected anytime, anyplace, with anything and anyone, ideally using any path/network and any service. IoT systems are usually composed of heterogeneous and interconnected lightweight devices that support applications that are subject to change in their external environment and in the functioning of these devices. The management of the variability of these changes, autonomously, is a challenge in the development of these systems. Agents are a good option for developing self-managed IoT systems due to their distributed nature, context-awareness and self-adaptation. Our goal is to enhance the development of IoT applications using agents and software product lines (SPL). Specifically, we propose to use Self-StarMAS (multi-agent system) agents and to define an SPL process using the Common Variability Language. In this contribution, we propose an SPL process for Self-StarMAS, paying particular attention to agents embedded in sensor motes.
Community tools for cartographic and photogrammetric processing of Mars Express HRSC images
Kirk, Randolph L.; Howington-Kraus, Elpitha; Edmundson, Kenneth L.; Redding, Bonnie L.; Galuszka, Donna M.; Hare, Trent M.; Gwinner, K.; Wu, B.; Di, K.; Oberst, J.; Karachevtseva, I.
2017-01-01
The High Resolution Stereo Camera (HRSC) on the Mars Express orbiter (Neukum et al. 2004) is a multi-line pushbroom scanner that can obtain stereo and color coverage of targets in a single overpass, with pixel scales as small as 10 m at periapsis. Since commencing operations in 2004, it has imaged ~77% of Mars at 20 m/pixel or better. The instrument team uses the Video Image Communication And Retrieval (VICAR) software to produce and archive a range of data products from uncalibrated and radiometrically calibrated images to controlled digital topographic models (DTMs) and orthoimages and regional mosaics of DTM and orthophoto data (Gwinner et al. 2009; 2010b; 2016). Alternatives to this highly effective standard processing pipeline are nevertheless of interest to researchers who do not have access to the full VICAR suite and may wish to make topographic products or perform other (e.g., spectrophotometric) analyses prior to the release of the highest level products. We have therefore developed software to ingest HRSC images and model their geometry in the USGS Integrated Software for Imagers and Spectrometers (ISIS3), which can be used for data preparation, geodetic control, and analysis, and the commercial photogrammetric software SOCET SET (® BAE Systems; Miller and Walker 1993; 1995) which can be used for independent production of DTMs and orthoimages. The initial implementation of this capability utilized the then-current ISIS2 system and the generic pushbroom sensor model of SOCET SET, and was described in the DTM comparison of independent photogrammetric processing by different elements of the HRSC team (Heipke et al. 2007). A major drawback of this prototype was that neither software system then allowed for pushbroom images in which the exposure time changes from line to line. Except at periapsis, HRSC makes such timing changes every few hundred lines to accommodate changes of altitude and velocity in its elliptical orbit. As a result, it was necessary to split observations into blocks of constant exposure time, greatly increasing the effort needed to control the images and collect DTMs. Here, we describe a substantially improved HRSC processing capability that incorporates sensor models with varying line timing in the current ISIS3 system (Sides 2017) and SOCET SET. This enormously reduces the work effort for processing most images and eliminates the artifacts that arose from segmenting them. In addition, the software takes advantage of the continuously evolving capabilities of ISIS3 and the improved image matching module NGATE (Next Generation Automatic Terrain Extraction, incorporating area and feature based algorithms, multi-image and multi-direction matching) of SOCET SET, thus greatly reducing the need for manual editing of DTM errors. We have also developed a procedure for geodetically controlling the images to Mars Orbiter Laser Altimeter (MOLA) data by registering a preliminary stereo topographic model to MOLA by using the point cloud alignment (pc_align) function of the NASA Ames Stereo Pipeline (ASP; Moratto et al. 2010). This effectively converts inter-image tiepoints into ground control points in the MOLA coordinate system. The result is improved absolute accuracy and a significant reduction in work effort relative to manual measurement of ground control. The ISIS and ASP software used are freely available; SOCET SET is a commercial product.
By the end of 2017 we expect to have ported our SOCET SET HRSC sensor model to the Community Sensor Model (CSM; Community Sensor Model Working Group 2010; Hare and Kirk 2017) standard utilized by the successor photogrammetric system SOCET GXP that is currently offered by BAE. In early 2018, we are also working with BAE to release the CSM source code under a BSD or MIT open source license.
1979-12-01
the functional management level, a real-time production control system and an order processing system at the operational level. SIDMS was designed...at any one time. An overview of the major software systems in operation is listed below: a. Major Software Systems: • Order processing system • Order ... processing for the supply support center/AWP locker. • Order processing for the airwing squadron material controls. • Order processing for the IMA
NASA Technical Reports Server (NTRS)
Stark, Michael; Hennessy, Joseph F. (Technical Monitor)
2002-01-01
My assertion is that not only are product lines a relevant research topic, but that the tools used by empirical software engineering researchers can address observed practical problems. Our experience at NASA has been that there are often externally proposed solutions available, but that we have had difficulties applying them in our particular context. We have also focused on return-on-investment issues when evaluating product lines, and while these are important, one cannot attain objective data on success or failure until several applications from a product family have been deployed. The use of the Quality Improvement Paradigm (QIP) can address these issues: (1) Planning an adoption path from an organization's current state to a product line approach; (2) Constructing a development process to fit the organization's adoption path; (3) Evaluation of product line development processes as the project is being developed. The QIP consists of the following six steps: (1) Characterize the project and its environment; (2) Set quantifiable goals for successful project performance; (3) Choose the appropriate process models, supporting methods, and tools for the project; (4) Execute the process, analyze interim results, and provide real-time feedback for corrective action; (5) Analyze the results of completed projects and recommend improvements; and (6) Package the lessons learned as updated and refined process models. A figure shows the QIP in detail. The iterative nature of the QIP supports an incremental development approach to product lines, and the project learning and feedback provide the necessary early evaluations.
The study of production performance of water heater manufacturing by using simulation method
NASA Astrophysics Data System (ADS)
Iqbal, M.; Bamatraf, OAA; Tadjuddin, M.
2018-02-01
In industrial companies, as demand increases, decision-making to increase production becomes difficult due to the complexity of the model systems. Companies are trying to find the optimum methods to tackle such problems so that resources are utilized and production is increased. One line system of a manufacturing company in Malaysia was considered in this research. The company produces several types of water heater, and each type goes through many processes, divided into twenty-six sections. Each section has several operations. The main product type was the 10G water heater, which is produced in the largest volume compared to the other types; hence it was selected for study in this research. It was difficult to find the critical section that could improve the company's production. This research paper employed Delmia Quest software, Distribution Analyser software and Design of Experiments (DOE) software to simulate one model system taken from the company and to find the critical section that would improve the production system. As a result, the assembly of the inner and outer tank section was found to be the bottleneck section. Adding one section to the bottleneck increases the production rate by four products a day. The buffer size determined by the experiment was six items.
Identification and evaluation of software measures
NASA Technical Reports Server (NTRS)
Card, D. N.
1981-01-01
A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.
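A minimal sketch of the evaluation step described here, with invented module data: rate how strongly a candidate measure (lines of code) tracks an outcome of interest (development effort) using a simple correlation. The numbers below are hypothetical, not from the referenced projects:

```python
# Hedged sketch: correlate a candidate measure with development effort.
# The module data are invented for illustration only.
import numpy as np

lines_of_code = np.array([1200, 3400, 800, 5100, 2300, 4200])  # hypothetical modules
effort_hours = np.array([160, 410, 90, 640, 300, 500])

r = np.corrcoef(lines_of_code, effort_hours)[0, 1]
print(f"correlation between LOC and effort: {r:.2f}")
```

In practice the procedure described in the abstract is larger in scope (many measures, many projects, statistical screening), but the core question is of this form: does the measure meaningfully characterize the quality one wants to explain or predict?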
Large Scale Software Building with CMake in ATLAS
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
NASA Technical Reports Server (NTRS)
McNeill, Justin
1995-01-01
The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
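For illustration only, here is a hedged sketch of a parametric effort model of the general power-law form associated with SEL-style estimation (effort grows as a power of size). The coefficients, hours-per-month figure, and the 50,000-line example are placeholders, not SEL or JPL calibrated values:

```python
# Hedged sketch of a size-driven effort model, Effort = a * KSLOC^b.
# Coefficients below are placeholders for illustration, not calibrated values.
def estimate_effort(sloc, a=1.48, b=0.98, hours_per_month=152):
    """Return (staff-months, staff-hours) for `sloc` source lines of code."""
    ksloc = sloc / 1000.0
    staff_months = a * ksloc ** b
    return staff_months, staff_months * hours_per_month

print(estimate_effort(50_000))  # e.g., a hypothetical medium-sized transition
```

The abstract's point still applies: such size-driven models can under-estimate porting work when interface complexity, rather than raw size, dominates the effort.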
Flexible data registration and automation in semiconductor production
NASA Astrophysics Data System (ADS)
Dudde, Ralf; Staudt-Fischbach, Peter; Kraemer, Benedict
1997-08-01
The need for cost reduction and flexibility in semiconductor production will result in a wider application of computer-based automation systems. With the setup of a new and advanced CMOS semiconductor line at the Fraunhofer Institute for Silicon Technology [ISIT, Itzehoe (D)], a new line information system (LIS) was introduced, based on an advanced model for the underlying data structure. This data model was implemented in an ORACLE RDBMS. A cellworks-based system (JOSIS) was used for the integration of the production equipment, communication, and automated database bookings and information retrievals. During the ramp-up of the production line this new system is used for fab control. The data model and the cellworks-based system integration are explained. This system enables an on-line overview of the work in progress in the fab, lot order history, and equipment status and history. Based on these figures, improved production and cost monitoring and optimization are possible. First examples of the information gained by this system are presented. The modular set-up of the LIS system will allow easy data exchange with additional software tools like schedulers, different fab control systems like PROMIS, and accounting systems like SAP. Modifications necessary for the integration of PROMIS are described.
Culture shock: Improving software quality
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Jong, K.; Trauth, S.L.
1988-01-01
The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely ''self-taught'' and has been producing ''good'' software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, ''What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!'' Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist, can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a ''typical'' quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.
MOST: a software environment for constraint-based metabolic modeling and strain design.
Kelley, James J; Lane, Anatoliy; Li, Xiaowei; Mutthoju, Brahmaji; Maor, Shay; Egen, Dennis; Lun, Desmond S
2015-02-15
MOST (metabolic optimization and simulation tool) is a software package that implements GDBB (genetic design through branch and bound) in an intuitive user-friendly interface with Excel-like editing functionality, as well as implementing FBA (flux balance analysis), and supporting Systems Biology Markup Language and comma-separated values files. GDBB is currently the fastest algorithm for finding gene knockouts predicted by FBA to increase production of desired products, but until the release of MOST, GDBB was only available through a command-line interface, which is difficult to use for those without programming knowledge. MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
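To illustrate the kind of optimization FBA performs (this is a generic sketch, not MOST's or GDBB's implementation, and the two-metabolite network is invented), flux balance analysis can be posed as a linear program: maximize a target flux subject to steady-state mass balance S·v = 0 and flux bounds:

```python
# Minimal flux-balance-analysis sketch on a toy network (invented for illustration).
# Reactions: R1 uptake -> A, R2 A -> B, R3 B -> product.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1,  0],    # metabolite A balance
              [0,  1, -1]])   # metabolite B balance
bounds = [(0, 10), (0, 1000), (0, 1000)]   # flux bounds per reaction
c = [0, 0, -1]                 # linprog minimizes, so negate the product flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal product flux:", -res.fun, "flux vector:", res.x)
```

Knockout-design algorithms such as GDBB then search over which reactions to remove (tighten bounds to zero) so that the FBA optimum couples growth to production of the desired product.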
Proceedings of the First Workshop on Service-Oriented Architectures and Software Product Lines
2008-05-01
Addison-Wesley, Harlow, 2000. [8] Kang, K., Cohen, S., Hess, J., Novak, W., & Peterson, S. Feature-Oriented Domain Analysis (FODA) Feasibility...Intensive Systems-Description, 2000. [17] K. Kang, S. Cohen, J. Hess, W. Novak, and S. Peterson. Feature-Oriented Domain Analysis (FODA...product models. SPF modeling employs many approaches such as Feature-Oriented Domain Analysis and extensions to existing approaches such as UML
Directory of Assistive Technology: Data Sources.
ERIC Educational Resources Information Center
Council for Exceptional Children, Reston, VA. Center for Special Education Technology.
The annotated directory describes in detail both on-line and print databases in the area of assistive technology for individuals with disabilities. For each database, the directory provides the name, address, and telephone number of the sponsoring organization; disability areas served; number of hardware and software products; types of information…
1980-09-30
typography is voluminous and directly applicable. Research dealing directly with the line printer used in computer output is scanty, but consistent with...available to the researcher. While this may stimulate rapid software production, it often creates sets of chain-reaction problems. Accordingly
CD-ROM Networking: Navigating through VINES and NetWare and the New Software Technologies.
ERIC Educational Resources Information Center
Lieberman, Paula
1995-01-01
Provides an overview of developments in CD-ROM networking technology and describes products offered by Axis, Banyan (VINES--network operating environment), CD Connection, Celerity, Data/Ware, Document Imaging Systems Corporation (DISC), Imagery, Jodian, Meridian, Micro Design International, Microsoft, Microtest, Novell, OnLine Computer Systems,…
Analysis on flexible manufacturing system layout using arena simulation software
NASA Astrophysics Data System (ADS)
Fadzly, M. K.; Saad, Mohd Sazli; Shayfull, Z.
2017-09-01
A flexible manufacturing system (FMS) is defined as a highly automated group-technology machine cell, consisting of a group of processing stations interconnected by an automated material handling and storage system and controlled by an integrated computer system. An FMS can produce parts or products in the mid-volume, mid-variety production range. The layout is an important criterion in designing an FMS to produce a part or product. The facility layout of an FMS involves the positioning of cells within given boundaries so as to minimize the total projected travel time between cells. Defining the layout includes specifying the spatial coordinates of each cell, its orientation in either a horizontal or vertical position, and the location of its load or unload point. There are many types of FMS layout, such as in-line, loop, ladder and robot-centered cell layouts. This research concentrates on the design and optimization of the FMS layout. In conclusion, the objective of designing and optimizing the FMS layout in this study was achieved: based on ARENA simulation, the in-line layout is the best layout in terms of time and cost.
Park, Sophie Elizabeth; Thomas, James
2018-06-07
It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways. Here, a range of review conditions and software solutions are discussed: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
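As a hedged sketch of the kind of machine-learning step described (this is not EPPI-Reviewer's code; the training texts, labels, and query are invented), a simple TF-IDF plus logistic-regression pipeline can score new titles or abstracts for relevance to a review:

```python
# Illustrative relevance-screening sketch for a hypothetical review.
# All texts and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "randomised controlled trial of exercise for depression",
    "qualitative study of patient experience in primary care",
    "protein folding simulation on GPUs",
    "survey of compiler optimisation techniques",
]
train_labels = [1, 1, 0, 0]  # 1 = relevant to the (hypothetical) review

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)
print(model.predict(["trial of cognitive therapy for anxiety"]))
```

Production systems for 'living' reviews combine classifiers like this with information retrieval over bibliographic feeds and human screening of the uncertain middle band of scores.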
Software architecture of the III/FBI segment of the FBI's integrated automated identification system
NASA Astrophysics Data System (ADS)
Booker, Brian T.
1997-02-01
This paper will describe the software architecture of the Interstate Identification Index (III/FBI) Segment of the FBI's Integrated Automated Fingerprint Identification System (IAFIS). IAFIS is currently under development, with deployment to begin in 1998. III/FBI will provide the repository of criminal history and photographs for criminal subjects, as well as identification data for military and civilian federal employees. Services provided by III/FBI include maintenance of the criminal and civil data, subject search of the criminal and civil data, and response generation services for IAFIS. III/FBI software will be comprised of both COTS and an estimated 250,000 lines of developed C code. This paper will describe the following: (1) the high-level requirements of the III/FBI software; (2) the decomposition of the III/FBI software into Computer Software Configuration Items (CSCIs); (3) the top-level design of the III/FBI CSCIs; and (4) the relationships among the developed CSCIs and the COTS products that will comprise the III/FBI software.
User’s Manual for the Modular Analysis-Package Libraries ANAPAC and TRANL
1977-09-01
number) Computer software, Fourier transforms, Computer software library, Interpolation software, Digitized data...disregarded to give the user a simplified plot. (b) The last digit of ISPACE determines the type of line to be drawn, provided KODE is not...negative. If the last digit of ISPACE is 0, a solid line is drawn; if 1, a dashed line is drawn (- - -); if 2, a dotted line is drawn (....); if 3, a dash-dot line is
AIRSAR Web-Based Data Processing
NASA Technical Reports Server (NTRS)
Chu, Anhua; Van Zyl, Jakob; Kim, Yunjin; Hensley, Scott; Lou, Yunling; Madsen, Soren; Chapman, Bruce; Imel, David; Durden, Stephen; Tung, Wayne
2007-01-01
The AIRSAR automated, Web-based data processing and distribution system is an integrated, end-to-end synthetic aperture radar (SAR) processing system. Designed to function under limited resources and rigorous demands, AIRSAR eliminates operational errors and provides for paperless archiving. Also, it provides a yearly tune-up of the processor on flight missions, as well as quality assurance with new radar modes and anomalous data compensation. The software fully integrates a Web-based SAR data-user request subsystem, a data processing system to automatically generate co-registered multi-frequency images from both polarimetric and interferometric data collection modes in 80/40/20 MHz bandwidth, an automated verification quality assurance subsystem, and an automatic data distribution system for use in the remote-sensor community. Features include Survey Automation Processing in which the software can automatically generate a quick-look image from an entire 90-GB SAR raw data 32-MB/s tape overnight without operator intervention. Also, the software allows product ordering and distribution via a Web-based user request system. To make AIRSAR more user friendly, it has been designed to let users search by entering the desired mission flight line (Missions Searching), or to search for any mission flight line by entering the desired latitude and longitude (Map Searching). For precision image automation processing, the software generates the products according to each data processing request stored in the database via a Queue management system. Users are able to have automatic generation of coregistered multi-frequency images as the software generates polarimetric and/or interferometric SAR data processing in ground and/or slant projection according to user processing requests for one of the 12 radar modes.
2012-01-27
example is found in games converted to serve a purpose other than entertainment, such as the development and use of games for science, technology, and...These play-session histories can then be further modded via video editing or remixing with other media (e.g., adding music) to better enable cinematic...available OSS (e.g., the Linux Kernel on the Sony PS3 game console) that game system hackers seek to undo. Finally, games are one of the most commonly
NASA Technical Reports Server (NTRS)
McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy
2015-01-01
The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
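A minimal sketch in the spirit of the study (the module attributes, thresholds, and data below are invented, not the NASA data set): a shallow decision tree that flags modules likely to require high development effort from simple module attributes:

```python
# Hedged sketch: learn a small decision tree over invented module metrics.
from sklearn.tree import DecisionTreeClassifier, export_text

# attributes per module: [source lines, number of changes, fan-out] (hypothetical)
X = [[1200, 15, 8], [300, 2, 3], [5400, 40, 20], [800, 5, 4],
     [2500, 22, 11], [450, 3, 2], [3900, 31, 16], [600, 4, 5]]
y = [1, 0, 1, 0, 1, 0, 1, 0]  # 1 = high development effort

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["sloc", "changes", "fan_out"]))
```

The printed tree makes the classification rules inspectable, which is part of what made decision trees attractive for characterizing high-effort modules in this line of work.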
CLAES Product Improvement by Use of the GSFC Data Assimilation System (DAS)
NASA Technical Reports Server (NTRS)
Kumer, J. B.; Douglass, Anne (Technical Monitor)
2000-01-01
This report presents the Cryogenic Limb Array Etalon Spectrometer (CLAES) product improvement by use of the GSFC Data Assimilation System (DAS). The first task is to plug line of sight gradients derived from the CTM for 2/20/92 into the forward model of our retrieval software (RSW) in order to assess the impact on the retrieved quantities. The reporting period covers 12 May 2000 - 21 December 2000.
Optimization of Gate, Runner and Sprue in Two-Plate Family Plastic Injection Mould
NASA Astrophysics Data System (ADS)
Amran, M. A.; Hadzley, M.; Amri, S.; Izamshah, R.; Hassan, A.; Samsi, S.; Shahir, K.
2010-03-01
This paper describes the optimization of the size of the gate, runner and sprue in a two-plate family plastic injection mould. An Electronic Cash Register (ECR) plastic product was used in this study; it consists of three components: a top casing, a bottom casing and a paper holder. The objectives of this paper are to find the optimum size of the gate, runner and sprue, to locate the optimum layout of cavities, and to recognize the defect problems caused by the wrong size of gate, runner and sprue. Three types of software were used in this study: Unigraphics as the CAD tool to design the 3D model, Rhinoceros as the post-processing tool to design the gate, runner and sprue, and Moldex as the simulation tool to analyze the plastic flow. As a result, some modifications were made to the size of the feeding system and the location of the cavities to eliminate the short-shot, over-filling and welding line problems in the two-plate family plastic injection mould.
A recent Cleanroom success story: The Redwing project
NASA Technical Reports Server (NTRS)
Hausler, Philip A.
1992-01-01
Redwing is the largest completed Cleanroom software engineering project in IBM, both in terms of lines of code and project staffing. The product provides a decision-support facility that utilizes artificial intelligence (AI) technology for predicting and preventing complex operating problems in an MVS environment. The project used the Cleanroom process for development and realized a defect rate of 2.6 errors/KLOC, measured from first execution. This represents the total amount of errors that were found in testing and installation at three field test sites. Development productivity was 486 LOC/PM, which included all development labor expended in design specification through completion of incremental testing. In short, the Redwing team produced a complex systems software product with an extraordinarily low error rate, while maintaining high productivity. All of this was accomplished by a project team using Cleanroom for the first time. An 'introductory implementation' of Cleanroom was defined and used on Redwing. This paper describes the quality and productivity results, the Redwing project, and how Cleanroom was implemented.
Simulation of Assembly Line Balancing in Automotive Component Manufacturing
NASA Astrophysics Data System (ADS)
Jamil, Muthanna; Mohd Razali, Noraini
2016-02-01
This study focuses on the simulation of assembly line balancing for an automotive component in a vendor manufacturing company. A mixed-model assembly line for a charcoal canister product, which is used in an engine system as a fuel vapour filter, was observed, and it was found that the current production rate of the line does not meet customer demand even though the company keeps two days of buffer stock in advance. This study was carried out by performing detailed process flow and time studies along the line. To set up a model of the line by simulation, real data was taken from the factory floor and tested for distribution fit. The data gathered was then transformed into a simulation model. After verification of the model by comparing it with the actual system, it was found that the current line efficiency is not at its optimum condition due to blockage and idle time. Various what-if analyses were applied to eliminate the causes. The proposed layout shows that the line is balanced by adding buffers to avoid the blockage, while manpower is added to the stations to reduce process time, thereby reducing idle time. The simulation study was carried out using ProModel software.
Off-line programming motion and process commands for robotic welding of Space Shuttle main engines
NASA Technical Reports Server (NTRS)
Ruokangas, C. C.; Guthmiller, W. A.; Pierson, B. L.; Sliwinski, K. E.; Lee, J. M. F.
1987-01-01
The off-line-programming software and hardware being developed for robotic welding of the Space Shuttle main engine are described and illustrated with diagrams, drawings, graphs, and photographs. The menu-driven workstation-based interactive programming system is designed to permit generation of both motion and process commands for the robotic workcell by weld engineers (with only limited knowledge of programming or CAD systems) on the production floor. Consideration is given to the user interface, geometric-sources interfaces, overall menu structure, weld-parameter data base, and displays of run time and archived data. Ongoing efforts to address limitations related to automatic-downhand-configuration coordinated motion, a lack of source codes for the motion-control software, CAD data incompatibility, interfacing with the robotic workcell, and definition of the welding data base are discussed.
Evaluate the Usability of the Mobile Instant Messaging Software in the Elderly.
Wen, Tzu-Ning; Cheng, Po-Liang; Chang, Po-Lun
2017-01-01
Instant messaging (IM) is one kind of online chat that provides real-time text transmission over the Internet. It has become one of the most popular communication tools. Even though it is currently the era of smartphones, it is still a great challenge to teach and encourage the elderly to use a smartphone. Besides, the acceptance of IM by the elderly remains unknown. This study describes the usability and evaluates the acceptance of IM among elderly people who use a smartphone for the first time. This study is a quasi-experimental design study. The study period ran from October 2012 to December 2013. A total of 41 elderly participants were recruited in the study. All of them were using the LINE app on a smartphone for the first time. The usability was evaluated using the Technology Acceptance Model, which consists of four constructs: cognitive usefulness, cognitive ease of use, attitude, and willingness to use. Overall, the elderly had the best "attitude" towards the LINE app communication software, with the highest rating among the four constructs, averaging 4.07 points, followed by an average of 4 points on "cognitive usefulness". The scores for "cognitive ease of use" and "willingness to use" were equal, with an average of 3.86. It can be interpreted that (1) the elderly regarded the LINE app as an excellent communication tool; (2) they found the software useful; and (3) it was convenient for them to communicate. However, additional assistance and explanation of certain functions, such as the options, were necessary; this plays a great role in "willingness to use". The positive acceptance of the LINE app among the elderly suggests probable similar acceptance of other communication software. Encouraging the elderly's willingness to explore more technology products and understanding their behavior will provide the basic knowledge needed to develop further software.
The (mis)use of subjective process measures in software engineering
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.
1993-01-01
A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing of subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
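As a small illustration of one way to encode a subjective measure on an ordinal scale (the scale, the three-point categories, and the ratings below are invented for illustration; they are not the SEL's definitions), several raters' opinions of a team's relevant experience can be mapped to ordered values and aggregated rather than summing raw years:

```python
# Hedged sketch: aggregate subjective ratings of team experience on an ordinal scale.
# The scale and the ratings are hypothetical.
from statistics import median

SCALE = {"low": 1, "medium": 2, "high": 3}

def team_experience_score(ratings):
    """Combine several raters' subjective judgements into one ordinal score."""
    return median(SCALE[r] for r in ratings)

team_a = ["high", "medium", "high"]   # hypothetical judgements from three raters
team_b = ["low", "medium", "medium"]
print(team_experience_score(team_a), team_experience_score(team_b))  # 3 vs 2
```

The hard parts the abstract identifies remain: defining the measure precisely, choosing whose opinions count, and picking a granularity that supports meaningful comparison across teams.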
pyam: Python Implementation of YaM
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
Model for Simulating a Spiral Software-Development Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Curley, Charles; Nayak, Umanath
2010-01-01
A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
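The iterative structure can be pictured with the toy loop below, which treats each spiral iteration as a small waterfall pass driven by size, productivity, and defect-density inputs. The formulas and parameter values are invented for illustration; this is not the PATT/IEEE 12207-based model described above.

```python
# Toy sketch of a spiral process as repeated, modified waterfall passes.
# Parameter values and formulas are invented for illustration and are not
# taken from the PATT/IEEE 12207 model described above.

def simulate_spiral(iterations, loc_per_iter, loc_per_hour,
                    defects_per_kloc, detect_fraction, rework_hours_per_defect):
    total_effort = total_size = escaped = 0.0
    for i in range(1, iterations + 1):
        size = loc_per_iter                     # requirements grow each cycle
        dev_effort = size / loc_per_hour        # design/code/test effort
        injected = size / 1000.0 * defects_per_kloc
        detected = injected * detect_fraction
        rework = detected * rework_hours_per_defect
        total_effort += dev_effort + rework
        total_size += size
        escaped += injected - detected
        print(f"iter {i}: effort={dev_effort + rework:7.1f} h, "
              f"defects detected={detected:5.1f}")
    return total_size, total_effort, escaped

size, effort, escaped = simulate_spiral(
    iterations=4, loc_per_iter=5000, loc_per_hour=10,
    defects_per_kloc=20, detect_fraction=0.8, rework_hours_per_defect=2)
print(f"total: {size:.0f} LOC, {effort:.0f} hours, {escaped:.1f} escaped defects")
```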
NASA Technical Reports Server (NTRS)
Wolf, Stephen W. D.
1988-01-01
The Wall Adjustment Strategy (WAS) software provides successful on-line control of the 2-D flexible walled test section of the Langley 0.3-m Transonic Cryogenic Tunnel. This software package allows the level of operator intervention to be regulated as necessary for research and production type 2-D testing using an Adaptive Wall Test Section (AWTS). The software is designed to accept modification for future requirements, such as 3-D testing, with a minimum of complexity. The WAS software described is an attempt to provide a user-friendly package which could be used to control any flexible walled AWTS. Control system constraints influence the details of data transfer, not the data type; thus this entire software package could be used in different control systems, if suitable interface software is available. A complete overview of the software highlights the data flow paths, the modular architecture of the software and the various operating and analysis modes available. A detailed description of the software modules includes listings of the code. A user's manual is provided to explain task generation, operating environment, user options and what to expect at execution.
A Roadmap to Continuous Integration for ATLAS Software Development
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software, and for migrating to new platforms and compilers. The system evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.
Tool for Analysis and Reduction of Scientific Data
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN as a whole (up to version 2.0) has been summarized, and selected aspects of ASPEN have been discussed in several previous NASA Tech Briefs articles. Restated briefly, ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random-access memories. Domain-specific reasoning modules (e.g., modules for determining orbits for spacecraft) can easily be plugged into ASPEN 3.0. Improvements over other, similar software that have been incorporated into ASPEN 3.0 include a provision for more expressive time-line values, new parsing capabilities afforded by an ASPEN language based on Extensible Markup Language, improved search capabilities, and improved interfaces to other, utility-type software (notably including MATLAB).
ERIC Educational Resources Information Center
RESNA: Association for the Advancement of Rehabilitation Technology, Washington, DC.
This resource directory provides a selective listing of electronic networks, online databases, and bulletin boards that highlight technology-related services and products. For each resource, the following information is provided: name, address, and telephone number; description; target audience; hardware/software needs to access the system;…
Delivering Savings with Open Architecture and Product Lines
2011-04-30
p.m. Chair: Christopher Deegan, Executive Director, Program Executive Office for Integrated Warfare Systems. Delivering Savings with Open... Architectures, Walt Scacchi and Thomas Alspaugh, Institute for Software Research. Christopher Deegan, Executive Director, Program Executive Officer... Integrated Warfare Systems (PEO IWS). Mr. Deegan directs the development, acquisition, and fleet support of 150 combat weapon system programs managed by 350
Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang
2012-03-01
Self-designed software for identifying LIBS spectral lines is introduced. Integrated with LabVIEW, the software can smooth spectral lines and pick peaks; the second-difference and threshold methods are employed. Characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral line identification and qualitative analysis of the basic composition of the sample. This software can analyze spectra handily and rapidly. It will be a useful tool for LIBS.
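The smoothing and second-difference peak picking described above can be sketched as follows, operating on wavelength and intensity arrays; the window size and threshold are placeholders, and this is not the authors' LabVIEW implementation.

```python
# Sketch of smoothing and second-difference peak picking for a spectrum;
# window size and threshold are placeholders, not the authors' values.
import numpy as np

def pick_peaks(wavelengths, intensities, window=5, threshold=10.0):
    # Moving-average smoothing.
    kernel = np.ones(window) / window
    smooth = np.convolve(intensities, kernel, mode="same")

    # Second difference: d2[i] = y[i+1] - 2*y[i] + y[i-1].
    d2 = np.zeros_like(smooth)
    d2[1:-1] = smooth[2:] - 2.0 * smooth[1:-1] + smooth[:-2]

    peaks = []
    for i in range(1, len(smooth) - 1):
        is_local_max = smooth[i] >= smooth[i - 1] and smooth[i] >= smooth[i + 1]
        # A strongly negative second difference marks a sharp emission line.
        if is_local_max and -d2[i] > threshold:
            peaks.append((wavelengths[i], smooth[i]))
    return peaks
```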
Development of on line automatic separation device for apple and sleeve
NASA Astrophysics Data System (ADS)
Xin, Dengke; Ning, Duo; Wang, Kangle; Han, Yuhang
2018-04-01
Based on an STM32F407 single-chip microcomputer as the control core, an automatic separation device for fruit sleeves is designed. The design consists of hardware and software. The hardware includes a mechanical tooth separator and a three-degree-of-freedom manipulator, as well as an industrial control computer, an image data acquisition card, an end effector, and other structures. The software system is based on the Visual C++ development environment and achieves localization and recognition of the fruit sleeve with image processing and machine vision techniques, driving the manipulator to capture the foam net sleeve, transfer it, and place it at the designated position. Tests show that the automatic separation device for fruit sleeves has a quick response speed and a high separation success rate, can realize separation of the apple and the plastic foam sleeve, and lays the foundation for further study and realization of applications on enterprise production lines.
Model-Based Development of Automotive Electronic Climate Control Software
NASA Astrophysics Data System (ADS)
Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan
With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code is a tedious task. Instead, an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on the domain of their expertise rather than on writing huge amounts of code. In this paper, we discuss the application of model-based development to the electronic climate control software of vehicles. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The Simulink Report Generator, used to create design documents from the models, is presented along with its use to run the simulation model and capture the results in the test report. Test automation using a model-based development tool that supports the use of a single set of test cases across several testing levels, with a test procedure that is independent of the software and hardware platform, is also presented.
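Back-to-back testing can be pictured as running the same stimulus vectors through the legacy implementation and the code generated from the model, then comparing outputs within a tolerance. The sketch below is a generic illustration with hypothetical stand-in functions; it does not represent the authors' tool chain.

```python
# Generic back-to-back test sketch; `legacy` and `generated` are
# hypothetical stand-ins for the hand-written and autogenerated
# climate-control step functions.

def back_to_back(legacy_controller, model_generated_controller,
                 test_vectors, tolerance=1e-3):
    failures = []
    for k, inputs in enumerate(test_vectors):
        ref = legacy_controller(*inputs)
        new = model_generated_controller(*inputs)
        if abs(ref - new) > tolerance:
            failures.append((k, inputs, ref, new))
    return failures

# Example with two trivial stand-in implementations.
legacy = lambda setpoint, cabin_temp: max(min((setpoint - cabin_temp) * 0.5, 1.0), 0.0)
generated = lambda setpoint, cabin_temp: max(min((setpoint - cabin_temp) * 0.5, 1.0), 0.0)
vectors = [(22.0, t) for t in range(15, 30)]
print("mismatches:", back_to_back(legacy, generated, vectors))
```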
U.S. Participation in the GOME and SCIAMACHY Projects
NASA Technical Reports Server (NTRS)
Chance, K. V.
1996-01-01
This report summarizes research done under NASA Grant NAGW-2541 from April 1, 1996 through March 31, 1997. The research performed during this reporting period includes development and maintenance of scientific software for the GOME retrieval algorithms, consultation on operational software development for GOME, consultation and development for SCIAMACHY near-real-time (NRT) and off-line (OL) data products, and development of infrared line-by-line atmospheric modeling and retrieval capability for SCIAMACHY. SAO also continues to participate in GOME validation studies, to the limit that can be accomplished at the present level of funding. The Global Ozone Monitoring Experiment was successfully launched on the ERS-2 satellite on April 20, 1995, and remains working in normal fashion. SCIAMACHY is currently in instrument characterization. The first two European ozone monitoring instruments (OMI), to fly on the Metop series of operational meteorological satellites being planned by Eumetsat, have been selected to be GOME-type instruments (the first, in fact, will be the refurbished GOME flight spare). K. Chance is the U.S. member of the OMI Users Advisory Group.
Core Flight System (cFS) a Low Cost Solution for SmallSats
NASA Technical Reports Server (NTRS)
McComas, David; Strege, Susanne; Wilmot, Jonathan
2015-01-01
The cFS is an FSW product line that uses a layered architecture and compile-time configuration parameters, which make it portable and scalable for a wide range of platforms. The software layers that define the application run-time environment are now under a NASA-wide configuration control board with the goal of sustaining an open-source application ecosystem.
ERIC Educational Resources Information Center
Martínez-Hernández, Cesar; Ulloa-Azpeitia, Ricardo
2017-01-01
Based on the theoretical elements of the instrumental approach to tool use known as Task-Technique-Theory (Artigue, 2002), this paper analyses and discusses the performance of graduate students enrolled in a Teacher Training program. The latter performance relates to tracing tangent lines to the curve of a quadratic function in Dynamic Geometry…
SHARP pre-release v1.0 - Current Status and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.; Rahaman, Ronald O.
The NEAMS Reactor Product Line effort aims to develop an integrated multiphysics simulation capability for the design and analysis of future generations of nuclear power plants. The Reactor Product Line code suite's multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as scale from desktop to petaflop computing platforms. In this report, building on a previous report issued in September 2014, we describe our continued efforts to integrate thermal/hydraulics, neutronics, and structural mechanics modeling codes to perform coupled analysis of a representative fast sodium-cooled reactor core in preparation for a unified release of the toolkit. The work reported in the current document covers the software engineering aspects of managing the entire stack of components in the SHARP toolkit and the continuous integration efforts ongoing to prepare a release candidate for interested reactor analysis users. Here we report on the continued integration effort of PROTEUS/Nek5000 and Diablo into the NEAMS framework and the software processes that enable users to utilize the capabilities without losing scientific productivity. Due to the complexity of the individual modules and their necessary/optional dependency library chain, we focus on the configuration and build aspects for the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion. Such complexity is untenable without strong software engineering processes such as source management, source control, change reviews, unit tests, integration tests and continuous test suites. Details on these processes are provided in the report as a building step for a SHARP user guide that will accompany the first release, expected by March 2016.
NASA Astrophysics Data System (ADS)
Brouwer, Albert; Brown, David; Tomuta, Elena
2017-04-01
To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
Software for Improved Extraction of Data From Tape Storage
NASA Technical Reports Server (NTRS)
Cheng, Chiu-Fu
2003-01-01
A computer program has been written to replace the original software of Racal Storeplex Delta tape recorders, which are used at Stennis Space Center. The original software could be activated by a command-line interface only; the present software offers the option of a command-line or graphical user interface. The present software also offers the option of batch-file operation (activation by a file that contains command lines for operations performed consecutively). The present software is also more reliable than was the original software: The original software was plagued by several deficiencies that made it difficult to execute, modify, and test. In addition, when using the original software to extract data that had been recorded within specified intervals of time, the resolution with which one could control starting and stopping times was no finer than about a second (or, in some cases, several seconds). In contrast, the present software is capable of controlling playback times to within 1/100 second of times specified by the user, assuming that the tape-recorder clock is accurate to within 1/100 second.
Software for Improved Extraction of Data From Tape Storage
NASA Technical Reports Server (NTRS)
Cheng, Chiu-Fu
2002-01-01
A computer program has been written to replace the original software of Racal Storeplex Delta tape recorders, which are still used at Stennis Space Center but have been discontinued by the manufacturer. Whereas the original software could be activated by a command-line interface only, the present software offers the option of a command-line or graphical user interface. The present software also offers the option of batch-file operation (activation by a file that contains command lines for operations performed consecutively). The present software is also more reliable than was the original software: The original software was plagued by several deficiencies that made it difficult to execute, modify, and test. In addition, when using the original software to extract data that had been recorded within specified intervals of time, the resolution with which one could control starting and stopping times was no finer than about a second (or, in some cases, several seconds). In contrast, the present software is capable of controlling playback times to within 1/100 second of times specified by the user, assuming that the tape-recorder clock is accurate to within 1/100 second.
Design of on-line system for measuring and tracking time of assembly
NASA Astrophysics Data System (ADS)
Senderská, Katarína; Mareš, Albert; Evin, Emil
2016-04-01
Manual assembly performed at assembly workstations nowadays still has a unique place in different kinds of production. To increase the productivity and quality of manual assembly it is necessary to analyse the existing workplaces and find ways to improve and streamline work done at these workplaces. The article deals with the design of a model for on-line analysis of a manual assembly process. The proposed model is based on the use of sensors or the so-called button-box and the use of software for recording and evaluating data. Based on the obtained data it is then possible to evaluate the time characteristics of the assembly process, as well as to find sources of delays and mistakes and then take appropriate action to correct them.
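The evaluation step can be sketched as computing per-station cycle times from button-box timestamps and flagging cycles that run long. The event format, station names, and delay threshold below are invented for illustration, not the proposed system's actual data model.

```python
# Sketch of evaluating assembly cycle times from button-box events;
# the event format and delay threshold are invented for illustration.
from statistics import mean

def cycle_times(events):
    """`events` is a chronologically ordered list of (timestamp_s, station_id)
    records, one per completed assembly step (button press)."""
    per_station = {}
    last_seen = {}
    for t, station in events:
        if station in last_seen:
            per_station.setdefault(station, []).append(round(t - last_seen[station], 3))
        last_seen[station] = t
    return per_station

def flag_delays(per_station, factor=1.5):
    """Flag cycles that exceed the station's mean cycle time by `factor`."""
    flagged = {}
    for station, times in per_station.items():
        avg = mean(times)
        flagged[station] = [t for t in times if t > factor * avg]
    return flagged

events = [(0.0, "A"), (12.1, "A"), (24.0, "A"), (55.7, "A"), (67.9, "A")]
times = cycle_times(events)
print(times)                 # {'A': [12.1, 11.9, 31.7, 12.2]}
print(flag_delays(times))    # {'A': [31.7]}  -- the delayed cycle
```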
Changing the Lines in the Coloring Book
2004-05-18
multitude of SCADA programmers creates a multitude of idiosyncratic programs, making it very difficult to know how to hack into large numbers of them...an observed pattern of hacking that would alert authorities seems not to have been discussed.) However, some SCADAs are not physically connected to the...normal hacking . Simultaneously, it would identify software vulnerabilities in other products and design viruses and worms for attacking them. (The
NASA Technical Reports Server (NTRS)
1996-01-01
Open Sesame! is the first commercial software product that learns the user's behavior, and offers automation and coaching suggestions to the user. The neural learning module looks for repetitive patterns that have not been automated; when it finds one, it creates an observation and, upon approval, automates the task. The manufacturer, Charles River Analytics, credits Langley Research Center and Johnson Space Center Small Business Innovation Research grants and the time the president and vice president spent at the two centers in the 1970s as being essential to the development of their product line.
Software Sharing Enables Smarter Content Management
NASA Technical Reports Server (NTRS)
2007-01-01
In 2004, NASA established a technology partnership with Xerox Corporation to develop high-tech knowledge management systems while providing new tools and applications that support the Vision for Space Exploration. In return, NASA provides research and development assistance to Xerox to progress its product line. The first result of the technology partnership was a new system called the NX Knowledge Network (based on Xerox DocuShare CPX). Created specifically for NASA's purposes, this system combines Netmark-practical database content management software created by the Intelligent Systems Division of NASA's Ames Research Center-with complementary software from Xerox's global research centers and DocuShare. NX Knowledge Network was tested at the NASA Astrobiology Institute, and is widely used for document management at Ames, Langley Research Center, within the Mission Operations Directorate at Johnson Space Center, and at the Jet Propulsion Laboratory, for mission-related tasks.
NASA Astrophysics Data System (ADS)
Seha, S.; Zamberi, J.; Fairu, A. J.
2017-10-01
The material handling system (MHS) is an important contributor to plant productivity and has been recognized as an integral part of today's manufacturing systems. MHS technology and equipment types have grown tremendously in recent years. Based on the case-study observation, issues involving the material handling system contribute to a reduction in production efficiency. This paper proposes a new design integrating the material handling system with the manufacturing layout by investigating the influence of both. An approach using Delmia Quest software is introduced, and the simulation results are used to assess the influence of the integration between the material handling system and the manufacturing layout on the performance of an automotive assembly line. The results show that the output of the assembly line increases by more than 31% over the current system. The average source throughput rate rises to 252 units per working hour in model 3, showing the effectiveness of the pick-to-light system as efficient storage equipment. Thus, the overall results show that the application of the AGV and the pick-to-light system has a large, significant effect on the automotive assembly line; moreover, the change of layout also yields a large, significant improvement in performance.
Khoshgoftaar, T M; Allen, E B; Hudepohl, J P; Aud, S J
1997-01-01
Society relies on telecommunications to such an extent that telecommunications software must have high reliability. Enhanced measurement for early risk assessment of latent defects (EMERALD) is a joint project of Nortel and Bell Canada for improving the reliability of telecommunications software products. This paper reports a case study of neural-network modeling techniques developed for the EMERALD system. The resulting neural network is currently in the prototype testing phase at Nortel. Neural-network models can be used to identify fault-prone modules for extra attention early in development, and thus reduce the risk of operational problems with those modules. We modeled a subset of modules representing over seven million lines of code from a very large telecommunications software system. The set consisted of those modules reused with changes from the previous release. The dependent variable was membership in the class of fault-prone modules. The independent variables were principal components of nine measures of software design attributes. We compared the neural-network model with a nonparametric discriminant model and found the neural-network model had better predictive accuracy.
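The modeling step, principal components of the design measures feeding a small neural network with a nonparametric classifier as a baseline, can be sketched with off-the-shelf tools. The example below uses scikit-learn as a stand-in for the EMERALD tooling; the data are synthetic and the hyperparameters are invented, and a k-nearest-neighbors classifier stands in for the nonparametric discriminant model.

```python
# Conceptual sketch of the EMERALD-style modeling step using scikit-learn
# as a stand-in; data, features, and hyperparameters are hypothetical.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 9))            # nine design measures per module
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=400) > 1).astype(int)
# y = 1 marks a (synthetic) fault-prone module.

neural_net = make_pipeline(StandardScaler(), PCA(n_components=5),
                           MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                         random_state=0))
baseline = make_pipeline(StandardScaler(), PCA(n_components=5),
                         KNeighborsClassifier())   # nonparametric baseline

for name, model in [("neural net", neural_net), ("k-NN baseline", baseline)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```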
The Robust Software Feedback Model: An Effective Waterfall Model Tailoring for Space SW
NASA Astrophysics Data System (ADS)
Tipaldi, Massimo; Gotz, Christoph; Ferraguto, Massimo; Troiano, Luigi; Bruenjes, Bernhard
2013-08-01
The selection of the most suitable software life cycle process is of paramount importance in any space SW project. Despite being the preferred choice, the waterfall model is often exposed to some criticism. As a matter of fact, its main assumption of moving to a phase only when the preceding one is completed and perfected is not easily attainable under demanding SW schedule constraints. In this paper, a tailoring of the software waterfall model (named the “Robust Software Feedback Model”) is presented. The proposed methodology sorts out these issues by combining a SW waterfall model with a SW prototyping approach. The former is aligned with the SW main production line and is based on the full ECSS-E-ST-40C life-cycle reviews, whereas the latter is carried out ahead of the main SW streamline (so as to inject its lessons learnt into the main streamline) and is based on a lightweight approach.
Lozano-Fuentes, Saul; Elizondo-Quiroga, Darwin; Farfan-Ale, Jose Arturo; Loroño-Pino, Maria Alba; Garcia-Rejon, Julian; Gomez-Carro, Salvador; Lira-Zumbardo, Victor; Najera-Vazquez, Rosario; Fernandez-Salas, Ildefonso; Calderon-Martinez, Joaquin; Dominguez-Galera, Marco; Mis-Avila, Pedro; Morris, Natashia; Coleman, Michael; Moore, Chester G; Beaty, Barry J; Eisen, Lars
2008-09-01
Novel, inexpensive solutions are needed for improved management of vector-borne and other diseases in resource-poor environments. Emerging free software providing access to satellite imagery and simple editing tools (e.g. Google Earth) complements existing geographic information system (GIS) software and provides new opportunities for: (i) strengthening overall public health capacity through development of information for city infrastructures; and (ii) display of public health data directly on an image of the physical environment. We used freely accessible satellite imagery and a set of feature-making tools included in the software (allowing for production of polygons, lines and points) to generate information for city infrastructure and to display disease data in a dengue decision support system (DDSS) framework. Two cities in Mexico (Chetumal and Merida) were used to demonstrate that a basic representation of city infrastructure useful as a spatial backbone in a DDSS can be rapidly developed at minimal cost. Data layers generated included labelled polygons representing city blocks, lines representing streets, and points showing the locations of schools and health clinics. City blocks were colour-coded to show the presence of dengue cases. The data layers were successfully imported into GIS software in the shapefile format. The combination of Google Earth and free GIS software (e.g. HealthMapper, developed by WHO, and SIGEpi, developed by PAHO) has tremendous potential to strengthen overall public health capacity and facilitate decision support system approaches to prevention and control of vector-borne diseases in resource-poor environments.
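The kind of data layer described above can also be produced programmatically as KML, which Google Earth reads directly and which GIS packages can typically convert to shapefiles. The minimal sketch below hand-builds a Placemark for one colour-coded city block and one clinic point; the coordinates, names, and colour values are invented placeholders, not the study's data.

```python
# Minimal sketch: write a KML file with one colour-coded city-block polygon
# and one clinic point. Coordinates and colours are invented placeholders.

def block_placemark(name, lonlat_ring, dengue_present):
    colour = "7f0000ff" if dengue_present else "7f00ff00"  # aabbggrr: red / green
    coords = " ".join(f"{lon},{lat},0" for lon, lat in lonlat_ring)
    return f"""<Placemark>
  <name>{name}</name>
  <Style><PolyStyle><color>{colour}</color></PolyStyle></Style>
  <Polygon><outerBoundaryIs><LinearRing>
    <coordinates>{coords}</coordinates>
  </LinearRing></outerBoundaryIs></Polygon>
</Placemark>"""

def point_placemark(name, lon, lat):
    return (f"<Placemark><name>{name}</name>"
            f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>")

ring = [(-88.301, 18.505), (-88.300, 18.505), (-88.300, 18.504),
        (-88.301, 18.504), (-88.301, 18.505)]        # closed ring
kml = ("<?xml version='1.0' encoding='UTF-8'?>\n"
       "<kml xmlns='http://www.opengis.net/kml/2.2'><Document>\n"
       + block_placemark("Block 12", ring, dengue_present=True) + "\n"
       + point_placemark("Clinic 3", -88.3005, 18.5045) + "\n"
       "</Document></kml>")

with open("city_blocks.kml", "w") as f:
    f.write(kml)
```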
The 1984 NASA/ASEE summer faculty fellowship program
NASA Technical Reports Server (NTRS)
1984-01-01
The assessment of forest productivity and associated nitrogen flux in a number of conifer ecosystems is described. A baseline study of acid precipitation in the Sierra Nevada involves the extraction and integration of a number of data planes describing the terrain, soils, lithology, vegetation cover and structure, and microclimate of the region. The development of automated techniques to extract topographic networks (stream canyons and ridge lines) for use as a landscape skeleton to organize and integrate data sets into an efficient geographical information system is examined. The software is written in both FORTRAN and C, and is portable to a number of different computer environments with minimal modification.
Solid State Lighting Program (Falcon)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meeks, Steven
2012-06-30
Over the past two years, KLA-Tencor and partners successfully developed and deployed software and hardware tools that increase product yield for High Brightness LED (HBLED) manufacturing and reduce product development and factory ramp times. This report summarizes our development effort and details how the results of the Solid State Lighting Program (Falcon) have started to help HBLED manufacturers optimize process control by enabling them to flag and correct identified killer defect conditions at any point of origin in the process manufacturing flow. This constitutes a quantum leap in yield management over current practice. Current practice consists of die dispositioning, which is just rejection of bad die at end of process based upon probe tests, loosely assisted by optical in-line monitoring for gross process deficiencies. For the first time, and as a result of our Solid State Lighting Program, our LED manufacturing partners have obtained the software and hardware tools that optimize individual process steps to control killer defects at the point in the process where they originate. Products developed during our two-year program enable optimized inspection strategies for many product lines to minimize cost and maximize yield. The Solid State Lighting Program was structured in three phases: i) the development of advanced imaging modes that achieve clear separation between LED defect types, improve signal-to-noise and scan rates, and minimize nuisance defects for both front-end and back-end inspection tools; ii) the creation of defect source analysis (DSA) software that connects the defect maps from back-end and front-end HBLED manufacturing tools to permit the automatic overlay and traceability of defects between tools and process steps, suppress nuisance defects, and identify the origin of killer defects with process step and conditions; and iii) working with partners (Philips Lumileds) on product wafers, obtaining a detailed statistical correlation of automated defect and DSA map overlay to failed die identified using end-product probe test results. Results from our two-year effort have led to “automated end-to-end defect detection” with full defect traceability and the ability to unambiguously correlate device killer defects to optically detected features and their point of origin within the process. Success of the program can be measured by yield improvements at our partners' facilities and new product orders.
Warthog: A MOOSE-Based Application for the Direct Code Coupling of BISON and PROTEUS
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.; Slattery, Stuart; Billings, Jay Jay
The Nuclear Energy Advanced Modeling and Simulation (NEAMS) program from the Department of Energy's Office of Nuclear Energy provides a robust toolkit for the modeling and simulation of current and future advanced nuclear reactor designs. This toolkit provides these technologies organized across product lines: two divisions targeted at fuels and end-to-end reactor modeling, and a third for integration, coupling, and high-level workflow management. The Fuels Product Line and the Reactor Product Line provide advanced computational technologies that serve each respective field well; however, their current lack of integration presents a major impediment to future improvements of simulation solution fidelity. There is a desire for the capability to mix and match tools across Product Lines in an effort to utilize the best from both to improve NEAMS modeling and simulation technologies. This report details a new effort to provide this Product Line interoperability through the development of a new application called Warthog. This application couples the BISON Fuel Performance application from the Fuels Product Line and the PROTEUS Core Neutronics application from the Reactors Product Line in an effort to utilize the best from all parts of the NEAMS toolkit and improve the overall solution fidelity of nuclear fuel simulations. To achieve this, Warthog leverages as much prior work from the NEAMS program as possible, and in doing so, enables interoperability between the disparate MOOSE and SHARP frameworks, and the libMesh and MOAB mesh data formats. This report describes this work in full. We begin with a detailed look at the individual NEAMS framework technologies used and developed in the various Product Lines, and the current status of their interoperability. We then introduce the Warthog application: its overall architecture and the ways it leverages the best existing tools from across the NEAMS toolkit to enable BISON-PROTEUS integration. Furthermore, we show how Warthog leverages a tool known as DataTransferKit to seamlessly enable the transfer of solution data between disparate frameworks and mesh formats. Finally, we demonstrate tests for the direct software coupling of BISON and PROTEUS using Warthog, and discuss current impediments and solutions to the construction of physically realistic input models for this coupled BISON-PROTEUS system.
Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.
Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald
2017-07-01
The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Interactive graphics for the Macintosh: software review of FlexiGraphs.
Antonak, R F
1990-01-01
While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs if only to obtain the easy-to-use curve fitting and line smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.
Evolutionary Telemetry and Command Processor (TCP) architecture
NASA Technical Reports Server (NTRS)
Schneider, John R.
1992-01-01
A low cost, modular, high performance, and compact Telemetry and Command Processor (TCP) is being built as the foundation of command and data handling subsystems for the next generation of satellites. The TCP product line will support command and telemetry requirements for small to large spacecraft and from low to high rate data transmission. It is compatible with the latest TDRSS, STDN and SGLS transponders and provides CCSDS protocol communications in addition to standard TDM formats. Its high performance computer provides computing resources for hosted flight software. Layered and modular software provides common services using standardized interfaces to applications thereby enhancing software re-use, transportability, and interoperability. The TCP architecture is based on existing standards, distributed networking, distributed and open system computing, and packet technology. The first TCP application is planned for the 94 SDIO SPAS 3 mission. The architecture enhances rapid tailoring of functions thereby reducing costs and schedules developed for individual spacecraft missions.
Reuse and Interoperability of Avionics for Space Systems
NASA Technical Reports Server (NTRS)
Hodson, Robert F.
2007-01-01
The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product line approach to avionics, the space industry still lacks similar standards and common practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA s need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards is outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.
Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Vice, Jason
2011-01-01
NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.
Autonomous Real Time Requirements Tracing
NASA Technical Reports Server (NTRS)
Plattsmier, George I.; Stetson, Howard K.
2014-01-01
One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
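The tracing idea, a companion monitor that maps reported execution line numbers back to SRS requirements and logs them as they are met, can be pictured with the toy sketch below. The requirement IDs, line map, and reporting interface are hypothetical and do not represent the Timeliner-TLX implementation.

```python
# Toy sketch of real-time requirements tracing driven by reported line
# numbers; the line-to-requirement map and IDs are hypothetical.
import time

LINE_TO_REQUIREMENT = {
    42: "SRS-3.1.4  Initiate fluid transfer",
    57: "SRS-3.2.1  Monitor tank pressure limit",
    88: "SRS-3.4.2  Terminate transfer on target quantity",
}

def trace(executed_lines, log_file="requirements_trace.log"):
    """Log each requirement the first time its associated line executes."""
    satisfied = set()
    with open(log_file, "w") as log:
        for line_no in executed_lines:
            req = LINE_TO_REQUIREMENT.get(line_no)
            if req and req not in satisfied:
                satisfied.add(req)
                log.write(f"{time.strftime('%H:%M:%S')}  line {line_no}: {req}\n")
    return satisfied

# Line numbers as they might be reported by an executing sequence.
print(trace([10, 42, 43, 57, 57, 90, 88]))
```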
Autonomous Real Time Requirements Tracing
NASA Technical Reports Server (NTRS)
Plattsmier, George; Stetson, Howard
2014-01-01
One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. The feature of execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
NASA Technical Reports Server (NTRS)
2002-01-01
MarketMiner(R) Products, a line of automated marketing analysis tools manufactured by MarketMiner, Inc., can benefit organizations that perform significant amounts of direct marketing. MarketMiner received a Small Business Innovation Research (SBIR) contract from NASA's Johnson Space Center to develop the software as a data modeling tool for space mission applications. The technology was then built into the company's current products to provide decision support for business and marketing applications. With the tool, users gain valuable information about customers and prospects from existing data in order to increase sales and profitability. MarketMiner(R) is a registered trademark of MarketMiner, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peck, T; Sparkman, D; Storch, N
''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices V1.1'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1'' document to the LLNL ASCI software management and development staff. Additionally, the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1''.
Mapping modern software process engineering techniques onto an HEP development environment
NASA Astrophysics Data System (ADS)
Wellisch, J. P.
2003-04-01
One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made, demonstrate the benefits gained, and report the current status of the software processes established in CMS off-line software.
Wang, Yan-Bin; Hu, Yu-Zhong; Li, Wen-Le; Zhang, Wei-Song; Zhou, Feng; Luo, Zhi
2014-10-01
In the present paper, based on the fast near-infrared evaluation technique and combined with H/CAMS software, a method to predict the yields of the atmospheric and vacuum distillation line was developed. Firstly, a near-infrared (NIR) spectroscopy method for rapidly determining the true boiling point of crude oil was developed. With a commercially available crude oil spectroscopy database and experimental tests from Guangxi Petrochemical Company, a calibration model was established, using a topological method for calibration. The model can be employed to predict the true boiling point of crude oil. Secondly, the true boiling point from the NIR rapid assay was converted to the side-cut product yield of the atmospheric/vacuum distillation unit with H/CAMS software. The predicted and actual yields of the distillation products naphtha, diesel, wax and residual oil were compared over a 7-month period. The results showed that the NIR rapid crude assay can predict the side-cut product yield accurately. The near-infrared analytic method for predicting yield has the advantages of fast analysis, reliable results, and ease of on-line operation, and it can provide elementary data for refinery planning optimization and crude oil blending.
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255
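As an illustration of fitting an S-shaped SRGM, the sketch below fits the delayed S-shaped NHPP mean value function m(t) = a(1 - (1 + bt)e^{-bt}) to synthetic cumulative defect counts with scipy. The counts and starting values are invented for the example, not the mission's defect reports.

```python
# Illustrative fit of a delayed S-shaped NHPP SRGM to synthetic defect data;
# the counts and initial guesses are invented, not mission data.
import numpy as np
from scipy.optimize import curve_fit

def s_shaped_mean(t, a, b):
    """Delayed S-shaped NHPP mean value function: expected cumulative defects."""
    return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

weeks = np.arange(1, 21)
cumulative_defects = np.array([1, 3, 7, 13, 20, 28, 36, 43, 50, 55,
                               60, 63, 66, 68, 70, 71, 72, 73, 73, 74])

(a_hat, b_hat), _ = curve_fit(s_shaped_mean, weeks, cumulative_defects,
                              p0=[80.0, 0.2])
print(f"estimated total defects a = {a_hat:.1f}, shape b = {b_hat:.2f}")

# Predicted failure intensity (defects/week) at the end of testing:
# dm/dt = a * b^2 * t * exp(-b * t).
t_end = weeks[-1]
intensity = a_hat * b_hat**2 * t_end * np.exp(-b_hat * t_end)
print(f"failure intensity at week {t_end}: {intensity:.2f} defects/week")
```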
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac
2016-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
Realizing the Living Paper using the ProvONE Model for Reproducible Research
NASA Astrophysics Data System (ADS)
Jones, M. B.; Jones, C. S.; Ludäscher, B.; Missier, P.; Walker, L.; Slaughter, P.; Schildhauer, M.; Cuevas-Vicenttín, V.
2015-12-01
Science has advanced through traditional publications that codify research results as a permanent part of the scientific record. But because publications are static and atomic, researchers can only cite and reference a whole work when building on the prior work of colleagues. The open source software model has demonstrated a new approach in which strong version control in an open environment can nurture an open ecosystem of software. Developers now commonly fork and extend software while giving proper credit, with less repetition, and with confidence in the relationship to the original software. Through initiatives like 'Beyond the PDF', an analogous model has been imagined for open science, in which software, data, analyses, and derived products become first-class objects within a publishing ecosystem that has evolved to be finer-grained and is realized through a web of linked open data. We have prototyped a Living Paper concept by developing the ProvONE provenance model for scientific workflows, with prototype deployments in DataONE. ProvONE promotes transparency and openness by describing the authenticity, origin, structure, and processing history of research artifacts and by detailing the steps in computational workflows that produce derived products. To realize the Living Paper, we decompose scientific papers into their constituent products and publish these as compound objects in the DataONE federation of archival repositories. Each individual finding and sub-product of a research project (such as a derived data table, a workflow or script, a figure, an image, or a finding) can be independently stored, versioned, and cited. ProvONE provenance traces link these fine-grained products within and across versions of a paper, and across related papers that extend an original analysis. This allows for open scientific publishing in which researchers extend and modify findings, creating a dynamic, evolving web of results that collectively represent the scientific enterprise. The Living Paper provides detailed metadata for properly interpreting and verifying individual research findings, for tracing the origin of ideas, for launching new lines of inquiry, and for implementing transitive credit for research and engineering.
US Participation in the GOME and SCIAMACHY Projects
NASA Technical Reports Server (NTRS)
Chance, K. V.; Geary, J. C.; Spurr, R. J. D.
1998-01-01
This report summarizes research done under NASA Grant NAGW-2541 through September 30, 1997. The research performed under this grant includes development and maintenance of scientific software for the GOME retrieval algorithms, consultation on operational software development for GOME, sensitivity and instrument studies to define the GOME and SCIAMACHY instruments, consultation on optical and detector issues for both GOME and SCIAMACHY, consultation and development for SCIAMACHY near-real-time (NRT) and off-line (OL) data products, and development of infrared line-by-line atmospheric modeling and retrieval capability for SCIAMACHY. The European Space Agency selected the SAO to participate in GOME validation and science studies, part of the overall ERS AO; this provided access to all GOME data. The SAO activities carried out as a result of selection by ESA were funded by the present grant. The Global Ozone Monitoring Experiment was successfully launched on the ERS-2 satellite on April 20, 1995, and continues to operate normally. SCIAMACHY is currently scheduled for launch in early 2000. The first two European ozone monitoring instruments (OMI), to fly on the Metop series of operational meteorological satellites being planned by Eumetsat, have been selected to be GOME-type instruments (the first, in fact, will be the refurbished GOME flight spare). K. Chance is the U.S. member of the OMI Users Advisory Group.
NASA Technical Reports Server (NTRS)
Chance, K. V.
2001-01-01
This report summarizes research done under NASA Grant NAG5-3461 from November 1, 1996 through December 31, 2000. The research performed during this reporting period includes development and maintenance of scientific software for the GOME retrieval algorithms, consultation on operational software development for GOME, sensitivity and instrument studies to help finalize the definition of the SCIAMACHY instrument, leading the development of the SCIAMACHY Scientific Requirements Document for Data and Algorithm Development, consultation and development for SCIAMACHY near-real-time (NRT) and off-line (OL) data products, radiative transfer model development for utilization in GOME, SCIAMACHY and other programs, development of infrared line-by-line atmospheric modeling and retrieval capability for SCIAMACHY, and participation in GOME and SCIAMACHY validation studies. The Global Ozone Monitoring Experiment was successfully launched on the ERS-2 satellite on April 20, 1995, and remains working in normal fashion. SCIAMACHY is currently planned for launch in late 2001 on the ESA Envisat satellite. Three GOME-2 instruments are now scheduled to fly on the Metop series of operational meteorological satellites (Eumetsat). K. Chance is a member of the reconstituted GOME Scientific Advisory Group, which will guide the GOME-2 program as well as the continuing ERS-2 GOME program.
NASA Astrophysics Data System (ADS)
Al-Jader, M. A.; Cullen, J. D.; Shaw, Andy; Al-Shamma'a, A. I.
2011-08-01
Currently there are about 4300 weld points on the average steel vehicle. Errors and problems due to tip damage and wear can cause great losses due to production line downtime. Current industrial monitoring systems check the quality of the nugget after processing 15 cars on average, once every two weeks. The nuggets are examined off line using a destructive process, which takes approximately 10 days to complete, causing a long delay in the production process. In this paper, simulation results obtained with the software package SORPAS are presented to determine the sustainability factors in the spot welding process, including voltage, current, force, water cooling rates, and material thicknesses and usage. The experimental results of various spot welding processes are investigated and reported. The correlation of experimental results shows that SORPAS simulations can be used as an off-line measurement to reduce factory energy usage. This paper also provides an overview of electrode current selection and its variance over the lifetime of the electrode tip, and describes the proposed analysis system for the selection of welding parameters for the spot welding process as the electrode tip wears.
LSDCat: Detection and cataloguing of emission-line sources in integral-field spectroscopy datacubes
NASA Astrophysics Data System (ADS)
Herenz, Edmund Christian; Wisotzki, Lutz
2017-06-01
We present a robust, efficient, and user-friendly algorithm for detecting faint emission-line sources in large integral-field spectroscopic datacubes together with the public release of the software package Line Source Detection and Cataloguing (LSDCat). LSDCat uses a three-dimensional matched filter approach, combined with thresholding in signal-to-noise, to build a catalogue of individual line detections. In a second pass, the detected lines are grouped into distinct objects, and positions, spatial extents, and fluxes of the detected lines are determined. LSDCat requires only a small number of input parameters, and we provide guidelines for choosing appropriate values. The software is coded in Python and capable of processing very large datacubes in a short time. We verify the implementation with a source insertion and recovery experiment utilising a real datacube taken with the MUSE instrument at the ESO Very Large Telescope. The LSDCat software is available for download at http://muse-vlt.eu/science/tools and via the Astrophysics Source Code Library at http://ascl.net/1612.002
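The core detection step described above, a 3D matched filter followed by a signal-to-noise threshold, can be sketched generically as follows. This is not LSDCat's API; the toy datacube, kernel widths, and threshold are placeholder values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(42)

# Toy datacube (spectral, y, x) of unit-variance noise with one faint emission-line source.
cube = rng.normal(0.0, 1.0, size=(200, 64, 64))
cube[100:104, 30:34, 30:34] += 1.5

# Matched filtering: cross-correlate with a 3D Gaussian template whose widths
# approximate the expected line width (spectral) and seeing PSF (spatial).
template_sigma = (1.5, 1.2, 1.2)   # (spectral, y, x) in pixels -- placeholder values
filtered = gaussian_filter(cube, sigma=template_sigma)

# Propagate the noise through the same filter to form a signal-to-noise cube.
# For uncorrelated unit-variance noise, filtering scales sigma by sqrt(sum(kernel**2)).
impulse = np.zeros_like(cube)
impulse[100, 32, 32] = 1.0
kernel = gaussian_filter(impulse, sigma=template_sigma)
noise_scale = np.sqrt((kernel ** 2).sum())
snr_cube = filtered / noise_scale

# Threshold in S/N and report voxel detections (grouping into objects would follow).
detections = np.argwhere(snr_cube > 5.0)
print(f"{len(detections)} voxels above S/N = 5")
```

In a real pipeline the second pass would group adjacent detected voxels into objects and measure their positions, extents, and fluxes, as the abstract describes.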
Upgrading Custom Simulink Library Components for Use in Newer Versions of Matlab
NASA Technical Reports Server (NTRS)
Stewart, Camiren L.
2014-01-01
The Spaceport Command and Control System (SCCS) at Kennedy Space Center (KSC) is a control system for monitoring and launching manned launch vehicles. Simulations of ground support equipment (GSE) and the launch vehicle systems are required throughout the life cycle of SCCS to test software, hardware, and procedures and to train the launch team. The simulations of the GSE at the launch site, in conjunction with off-line processing locations, are developed using Simulink, a piece of Commercial Off-The-Shelf (COTS) software. The simulations that are built are then converted into code and run in a simulation engine called Trick, a Government Off-The-Shelf (GOTS) piece of software developed by NASA. In the world of hardware and software, it is not uncommon for the products that are utilized to be upgraded and patched, or to eventually fade into obsolescence. In the case of SCCS simulation software, MathWorks has released a number of stable versions of Matlab and Simulink since the deployment of the software on the Development Work Stations in the Linux environment (DWLs). The upgraded versions of Simulink have introduced a number of new tools and resources that, if utilized fully and correctly, will save time and resources during the overall development of the GSE simulation and its correlating documentation. Unfortunately, simply importing the already built simulations into the new Matlab environment will not suffice, as it may produce results that differ from those expected in the version currently being utilized. Thus, an upgrade execution plan was developed and executed to fully upgrade the simulation environment to one of the latest versions of Matlab.
Combining analysis with optimization at Langley Research Center. An evolutionary process
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1982-01-01
The evolutionary process of combining analysis and optimization codes is traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. The process is traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization in order to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program facilitates this effort. The resulting software system is controlled with a special-purpose language, communicates with a data management system, and is easily modified for adding new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.
NASA Technical Reports Server (NTRS)
Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam
2011-01-01
Lines of code (LOC) analysis is one of the methods used to measure programmer productivity and estimate schedules of programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and to plan development efforts. The disadvantage of using LOC as a measure of effort is that coding accounts for only 30% to 35% of the total effort of software projects [8]. In this application, function points are used instead of LOC to obtain a better estimate of the hours required to develop each piece of software. Because of these disadvantages, Jamie Szafran of the System Software Branch of Control And Data Systems (NE-C3) at Kennedy Space Center developed a web application called Function Point Analysis (FPA) Depot. The objective of this web application is that the LCS software architecture team can use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 4, July/August 2011
2011-07-01
Project Management Tool (SSPMT), JASMINE, and ALADDIN, respectively [11, 12]. SSPMT is a web-based Six Sigma project management supporting tool... PSP/TSP data gathered from JASMINE and ALADDIN, SSPMT performs each step of DMAIC and provides analytic results. JASMINE and ALADDIN are web-based... done by using JASMINE. JASMINE collects an individual developer's work product information such as Source Lines of Code (SLOC), fault counts, and
A Comparison of Air Force Data Systems
1993-08-01
a software cost model, SPQR. This model was chosen because it provides a straightforward means of modeling the enhancements as they would... estimated by SPQR (23,917) by $69 per hour for a total of $1,650,273. An additional 10 percent was added for generating or modifying the Middleware... equipment; SLOC source lines of code; SPO System Program Office; SPQR System Product Quality Reporting; SSC Standard Systems Center; SSI system-to-system
Focus on Resiliency: A Process-Oriented Approach to Security
2005-11-01
Agenda: About the SEI; Characterizing the problem; Security, resiliency, and risk... SEI Technical Programs: Product Line Systems; Dynamic Systems; Software Engineering Process Management... What is the problem? Is your organization's security capability sufficient to identify and manage risks that result from failed
A flexible tool for diagnosing water, energy, and entropy budgets in climate models
NASA Astrophysics Data System (ADS)
Lembo, Valerio; Lucarini, Valerio
2017-04-01
We have developed a new, flexible software tool for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent, and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections, and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and oceans. Depending on the user's choice, the program also calls the MATLAB software to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program so that it can be included in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
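As a rough illustration of this kind of diagnostic, a global, area-weighted top-of-atmosphere energy budget can be computed from a CF-compliant NetCDF file with xarray. The file name, the CMIP-style variable names (rsdt, rsut, rlut), and the coordinate names below are assumptions made for the sketch, not the tool's actual interface.

```python
import numpy as np
import xarray as xr

# Hypothetical CF-compliant file with TOA radiative fluxes; the path and the
# CMIP-style variable names (rsdt, rsut, rlut) are placeholders.
ds = xr.open_dataset("toa_fluxes.nc")

# Net downward energy flux at the top of the atmosphere.
net_toa = ds["rsdt"] - ds["rsut"] - ds["rlut"]

# Area weighting by the cosine of latitude (regular lat-lon grid assumed).
weights = np.cos(np.deg2rad(ds["lat"]))
global_mean = net_toa.weighted(weights).mean(dim=("lat", "lon"))

# Annual-mean time series of the global energy imbalance (W m-2).
annual = global_mean.groupby("time.year").mean("time")
print(annual.values)
```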
Elemental misinterpretation in automated analysis of LIBS spectra.
Hübert, Waldemar; Ankerhold, Georg
2011-07-01
In this work, the Stark effect is shown to be mainly responsible for wrong elemental allocation by automated laser-induced breakdown spectroscopy (LIBS) software solutions. Due to broadening and shift of an elemental emission line affected by the Stark effect, its measured spectral position might interfere with the line positions of several other elements. The micro-plasma is generated by focusing a frequency-doubled 200 mJ pulsed Nd:YAG laser on an aluminum target, and furthermore on a brass sample, in air at atmospheric pressure. After laser pulse excitation, we have measured the temporal evolution of the Al(II) ion line at 281.6 nm (4s(1)S-3p(1)P) during the decay of the laser-induced plasma. Depending on laser pulse power, the center of the measured line is red-shifted by 130 pm (490 GHz) with respect to the exact line position. In this case, the well-known spectral positions of two moderate and strong lines of other elements coincide with the actual shifted position of the Al(II) line. Consequently, a time-resolving software analysis can lead to elemental misinterpretation. To avoid a wrong interpretation of LIBS spectra in automated analysis software for a given LIBS system, we recommend using larger gate delays, incorporating Stark broadening parameters, and using a tolerance range that is non-symmetric around the measured line center. These suggestions may help to improve time-resolving LIBS software, promising a smaller probability of wrong elemental identification and making LIBS more attractive for industrial applications.
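The recommended non-symmetric tolerance range around the measured line centre can be illustrated with a small matching routine. The reference wavelengths and tolerance values below are placeholders chosen for the example, not measured data.

```python
# Minimal sketch of elemental line identification with an asymmetric tolerance
# window: because the Stark effect red-shifts the measured centre, more tolerance
# is allowed on the blue (short-wavelength) side of the measured position.
REFERENCE_LINES_NM = {          # illustrative values only
    "Al II 281.6": 281.62,
    "Mg I 285.2": 285.21,
    "Mn II 281.5": 281.58,
}

def identify(measured_nm, tol_blue_nm=0.15, tol_red_nm=0.02):
    """Return reference lines whose tabulated position lies within
    [measured - tol_blue, measured + tol_red]; the window is wider towards
    shorter wavelengths because the true (unshifted) position sits blue-ward
    of a Stark red-shifted measurement."""
    lo, hi = measured_nm - tol_blue_nm, measured_nm + tol_red_nm
    return [name for name, ref in REFERENCE_LINES_NM.items() if lo <= ref <= hi]

# A measured centre red-shifted by ~0.13 nm from the Al II 281.6 nm line:
print(identify(281.75))   # still matches Al II, but not lines further to the red
```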
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
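A minimal sketch of logical source statement counting for C-like code is shown below; it is illustrative only and does not reproduce SLiC's per-language rules.

```python
import re

def count_logical_statements(source: str) -> int:
    """Rough logical-statement count for C-like source: strip comments and
    string literals, then count statement-terminating semicolons and block
    openers. Illustrative only -- real counters use per-language rules."""
    # Remove block comments, line comments, and string literals.
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)
    source = re.sub(r"//[^\n]*", "", source)
    source = re.sub(r'"(\\.|[^"\\])*"', '""', source)
    return source.count(";") + source.count("{")

example = """
/* toy example */
int main(void) {
    int n = 3;            // declaration
    printf("hi; there");  // the ';' inside the string is not counted
    return n;
}
"""
print(count_logical_statements(example))  # -> 4 (three ';' statements + one '{')
```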
Adaptive cyber-attack modeling system
NASA Astrophysics Data System (ADS)
Gonsalves, Paul G.; Dougherty, Edward T.
2006-05-01
The pervasiveness of software and networked information systems is evident across a broad spectrum of business and government sectors. Such reliance provides an ample opportunity not only for the nefarious exploits of lone-wolf computer hackers, but also for more systematic software attacks from organized entities. Much effort and focus has been placed on preventing and ameliorating network and OS attacks; a concomitant emphasis is required to address protection of mission-critical software. Typical software protection technique and methodology evaluation and verification and validation (V&V) involves the use of a team of subject matter experts (SMEs) to mimic potential attackers or hackers. This manpower-intensive, time-consuming, and potentially cost-prohibitive approach is not amenable to performing the multiple non-subjective analyses required to support quantifying software protection levels. To facilitate the evaluation and V&V of software protection solutions, we have designed and developed a prototype adaptive cyber-attack modeling system. Our approach integrates an off-line mechanism for rapid construction of Bayesian belief network (BN) attack models with an on-line model instantiation, adaptation, and knowledge acquisition scheme. Off-line model construction is supported via a knowledge elicitation approach for identifying key domain requirements and a process for translating these requirements into a library of BN-based cyber-attack models. On-line attack modeling and knowledge acquisition is supported via BN evidence propagation and model parameter learning.
Neural network to diagnose lining condition
NASA Astrophysics Data System (ADS)
Yemelyanov, V. A.; Yemelyanova, N. Y.; Nedelkin, A. A.; Zarudnaya, M. V.
2018-03-01
The paper presents data on the problem of diagnosing the lining condition at iron and steel works. The authors describe the neural network structure and software that were designed and developed to determine the lining burnout zones. The simulation results of the proposed neural networks are presented. The authors note the low learning and classification errors of the proposed neural networks. To realize the proposed neural network, specialized software has been developed.
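A minimal sketch of such a diagnostic classifier is shown below, using a small multilayer perceptron on synthetic features and labels; the data and network are invented stand-ins, not the authors' network or plant data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for lining diagnostics: a few thermal/thickness features
# per zone and a binary "burnout zone" label (invented data).
n = 600
features = rng.normal(size=(n, 4))
labels = (features[:, 0] + 0.8 * features[:, 2] + rng.normal(0, 0.3, n) > 1.0).astype(int)

x_tr, x_te, y_tr, y_te = train_test_split(features, labels, test_size=0.25, random_state=0)

# Small feed-forward network; learning and hold-out errors can then be inspected.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(x_tr, y_tr)
print(f"hold-out accuracy: {clf.score(x_te, y_te):.2f}")
```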
New method of noncontact temperature measurement in on-line textile production
NASA Astrophysics Data System (ADS)
Cheng, Xianping; Song, Xing-Li; Deng, Xing-Zhong
1993-09-01
Based on the conditions of textile production, the method of infrared non-contact temperature measurement is adopted in the heat-setting and drying heat-treatment process. This method is used to monitor the moving cloth, and the temperature of the cloth is displayed rapidly and exactly. The principle of the temperature measurement is analysed theoretically in this paper. Mathematical analysis and calculation are used to introduce the signal transmitting method. By adopting a method that combines software with hardware, the temperature is corrected and compensated with the aid of a single-chip microcomputer. The test results indicate that the application of the temperature measurement instrument provides reliable parameters for quality control, and it is an important measure for improving product quality.
Software Quality Assurance Metrics
NASA Technical Reports Server (NTRS)
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects and that are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
Introduction to the Security Engineering Risk Analysis (SERA) Framework
2014-11-01
military aircraft has increased from 8% to 80%. At the same time, the size of software in military aircraft has grown from 1,000 lines of code in the F-4A to 1.7 million lines of code in the F-22. This growth trend is expected to continue over time [NASA 2009]. As software exerts more control of... their root causes can be traced to the software's requirements, architecture, design, or code. Studies have shown that the cost of addressing a software
Ada software productivity prototypes: A case study
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Habib-Agahi, Hamid; Malhotra, Shan
1988-01-01
A case study of the impact of Ada on a Command and Control project completed at the Jet Propulsion Laboratory (JPL) is given. The data for this study were collected as part of a general survey of software costs and productivity at JPL and other NASA sites. The task analyzed is a successful example of the use of rapid prototyping as applied to command and control for the U.S. Air Force, and provides the U.S. Air Force Military Airlift Command with the ability to track aircraft, air crews, and payloads worldwide. The task consists of a replicated database at several globally distributed sites. The local databases at each site can be updated within seconds after changes are entered at any one site. The system must be able to handle up to 400,000 activities per day. There are currently seven sites, each with a local area network of computers and a variety of user displays; the local area networks are tied together into a single wide area network. Using data obtained for eight modules, totaling approximately 500,000 source lines of code, researchers analyze the differences in productivity between subtasks. Factors considered are the percentage of Ada used in coding, years of programmer experience, and the use of Ada tools and modern programming practices. The principal findings are the following. Productivity is very sensitive to programmer experience. The use of Ada software tools and the use of modern programming practices are important; without such use, Ada is just a large, complex language which can cause productivity to decrease. The impact of Ada on development effort phases is consistent with earlier reports at the project level but not at the module level.
NASA Technical Reports Server (NTRS)
Rabideau, Gregg; Chien, Steve; Knight, Russell; Schaffer, Steven; Tran, Daniel; Cichy, Benjamin; Sherwood, Robert
2006-01-01
The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random access memories.
2003-09-01
resolution M&S concept for integrating heterogeneous M&S into the hierarchy has existed since the early 1980s [DH92a, DH92b]. ...groups [PAD78]. The need for credible M&S grew in the Nation's private and public sectors. By 1980, information from computer-based simulations... formal) identified in [GMS+96 and RPG00]. We noted that systemic issues identified by reports, studies, and assessments in the early 1980s
openECA Platform and Analytics Alpha Test Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
An Automated System for the Maintenance of Multiform Documentation
NASA Astrophysics Data System (ADS)
Rousseau, Bertrand; Ruggier, Mario; Smith, Matthiew
Software documentation for the user often exists in several forms, including paper, electronic, on-line help, etc. We have built a system to help with the writing and maintenance of such documentation which relies on the FrameMaker product. As an example, we show how it is used to maintain the ADAMO documentation, delivered in four incarnations: on paper, WWW hypertext, KUIP, and running examples. The use of the system results in both time savings and quality improvements.
Hospital cost accounting: implementing the system successfully.
Burik, D; Duvall, T J
1985-05-01
To successfully implement a cost accounting system, certain key steps should be undertaken. These steps include developing and installing software; developing cost center budgets and inter-cost center allocations; developing service item standard costs; generating cost center level and patient level standard cost reports and reconciling these costs to actual costs; generating product line profitability reports and reconciling these reports to the financial statements; and providing ad hoc reporting capabilities. By following these steps, potential problems in the implementation process can be anticipated and avoided.
NASA Technical Reports Server (NTRS)
1990-01-01
In 1981 Wayne Erickson founded Microrim, Inc, a company originally focused on marketing a microcomputer version of RIM (Relational Information Manager). Dennis Comfort joined the firm and is now vice president, development. The team developed an advanced spinoff from the NASA system they had originally created, a microcomputer database management system known as R:BASE 4000. Microrim added many enhancements and developed a series of R:BASE products for various environments. R:BASE is now the second largest selling line of microcomputer database management software in the world.
openECA Platform and Analytics Beta Demonstration Results
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Russell
The objective of the Open and Extensible Control and Analytics (openECA) Platform for Phasor Data project is to develop an open source software platform that significantly accelerates the production, use, and ongoing development of real-time decision support tools, automated control systems, and off-line planning systems that (1) incorporate high-fidelity synchrophasor data and (2) enhance system reliability while enabling the North American Electric Reliability Corporation (NERC) operating functions of reliability coordinator, transmission operator, and/or balancing authority to be executed more effectively.
The winding road to being a code monkey
NASA Astrophysics Data System (ADS)
Sarahan, Michael
2017-09-01
I am now a software engineer at a company that provides data analytics services, and helps support the open source data science community. I have been a computer nerd for a very long time, but it was my CEU experience at Texas A&M with Sherry Yennello (2003-2005) that helped me put my nerd skills to productive use. My project then was simulation of pulse shape discrimination electronics, and it was an excellent introduction to core computational concerns, such as digitization: when you see a line on the screen, that's not really how the computer sees it. I wandered in graduate school through a chemistry program into using electron microscopes. My programming interest got me into image and signal processing, which led naturally to jobs in analyzing data, and also in acquiring data. Throughout, it was always difficult just to make software work. I got pretty good at making it work. That's what I do for a living now - package software so that it is easy for other people to do great science with.
Online catalog access and distribution of remotely sensed information
NASA Astrophysics Data System (ADS)
Lutton, Stephen M.
1997-09-01
Remote sensing is providing voluminous data and value added information products. Electronic sensors, communication electronics, computer software, hardware, and network communications technology have matured to the point where a distributed infrastructure for remotely sensed information is a reality. The amount of remotely sensed data and information is making distributed infrastructure almost a necessity. This infrastructure provides data collection, archiving, cataloging, browsing, processing, and viewing for applications from scientific research to economic, legal, and national security decision making. The remote sensing field is entering a new exciting stage of commercial growth and expansion into the mainstream of government and business decision making. This paper overviews this new distributed infrastructure and then focuses on describing a software system for on-line catalog access and distribution of remotely sensed information.
Stereo Navi 2.0: software for stereotaxic surgery of the common marmoset (Callithrix jacchus).
Tokuno, Hironobu; Tanaka, Ikuko; Umitsu, Yoshitomo; Nakamura, Yasuhisa
2009-11-01
Recently, we reported our web-accessible digital brain atlas of the common marmoset (Callithrix jacchus) at http://marmoset-brain.org:2008. Using digital images obtained during construction of this website, we developed stand-alone software for navigation of electrodes or injection needles for stereotaxic electrophysiological or anatomical experiments in vivo. This software enables us to draw lines on exchangeable section images, measure the length and angle of lines, superimpose a stereotaxic reference grid on the image, and send the image to the system clipboard. The software, Stereo Navi 2.0, is freely available at our brain atlas website.
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, which was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
Kukushkin, V I; Satusheva, E V; Aleksandrov, M T; Morozova, O A; Pashkov, E P; Ambartsumyan, O A; Amosova, V A
2015-01-01
The aim was to determine the effect of microorganisms on the spoilage of meat products under various temperature regimes of storage, using integral indexes of luminescent lines in their spectra, and to develop an algorithm for microorganism indication by an express method using laser Raman-luminescent spectroscopy. Minced meat from beef and pork was used. Determination of the quantity of mesophilic aerobic and opportunistic-anaerobic microorganisms was carried out by serial 10-fold dilutions with subsequent parallel seeding into Rida count total 24 plates and Petri dishes with 5% blood agar. Samples were studied with the luminescent software-hardware complex Enspectr L405 (a variant of the Enspectr M software-hardware complexes). Meat spoilage was established to be caused to a large degree by growth of bacteria of the Pseudomonas genus (P. fluorescens, P. putida, P. fragi, et al.). Raman-luminescent spectra of the bacteria that compose the microflora characterizing and accompanying beef and pork spoilage were measured and recorded into a database. The results obtained will allow this technique to be used in the future both for express indication and differentiation of microorganisms and for express evaluation of the quality of meat products at all stages of their manufacturing, storage, transport and sale.
SOFIA: a flexible source finder for 3D spectral line data
NASA Astrophysics Data System (ADS)
Serra, Paolo; Westmeier, Tobias; Giese, Nadine; Jurek, Russell; Flöer, Lars; Popping, Attila; Winkel, Benjamin; van der Hulst, Thijs; Meyer, Martin; Koribalski, Bärbel S.; Staveley-Smith, Lister; Courtois, Hélène
2015-04-01
We introduce SOFIA, a flexible software application for the detection and parametrization of sources in 3D spectral line data sets. SOFIA combines for the first time in a single piece of software a set of new source-finding and parametrization algorithms developed on the way to future H I surveys with ASKAP (WALLABY, DINGO) and APERTIF. It is designed to enable the general use of these new algorithms by the community on a broad range of data sets. The key advantages of SOFIA are the ability to: search for line emission on multiple scales to detect 3D sources in a complete and reliable way, taking into account noise level variations and the presence of artefacts in a data cube; estimate the reliability of individual detections; look for signal in arbitrarily large data cubes using a catalogue of 3D coordinates as a prior; provide a wide range of source parameters and output products which facilitate further analysis by the user. We highlight the modularity of SOFIA, which makes it a flexible package allowing users to select and apply only the algorithms useful for their data and science questions. This modularity makes it also possible to easily expand SOFIA in order to include additional methods as they become available. The full SOFIA distribution, including a dedicated graphical user interface, is publicly available for download.
NASA Astrophysics Data System (ADS)
Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina
2004-12-01
TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes, and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach, and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.
NASA Technical Reports Server (NTRS)
Bindschadler, R.; Choi, H.; Wichlacz, A.; Bingham, R.; Bohlander, J.; Brunt, K.; Corr, H.; Drews, R.; Fricker, H.; Hall, M.;
2011-01-01
Two ice-dynamic transitions of the Antarctic ice sheet - the boundary of grounded ice features and the freely-floating boundary - are mapped at 15-m resolution by participants of the International Polar Year project ASAID using customized software combining Landsat-7 imagery and ICESat/GLAS laser altimetry. The grounded ice boundary is 53 610 km long; 74% abuts to floating ice shelves or outlet glaciers, 19% is adjacent to open or sea-ice covered ocean, and 7% of the boundary ice terminates on land. The freely-floating boundary, called here the hydrostatic line, is the most landward position on ice shelves that expresses the full amplitude of oscillating ocean tides. It extends 27 521 km and is discontinuous. Positional (one-sigma) accuracies of the grounded ice boundary vary an order of magnitude ranging from +/- 52m for the land and open-ocean terminating segments to +/- 502m for the outlet glaciers. The hydrostatic line is less well positioned with errors over 2 km. Elevations along each line are selected from 6 candidate digital elevation models based on their agreement with ICESat elevation values and surface shape inferred from the Landsat imagery. Elevations along the hydrostatic line are converted to ice thicknesses by applying a firn-correction factor and a flotation criterion. BEDMAP-compiled data and other airborne data are compared to the ASAID elevations and ice thicknesses to arrive at quantitative (one-sigma) uncertainties of surface elevations of +/-3.6, +/-9.6, +/-11.4, +/-30 and +/-100m for five ASAID-assigned confidence levels. Over one-half of the surface elevations along the grounded ice boundary and over one-third of the hydrostatic line elevations are ranked in the highest two confidence categories. A comparison between ASAID-calculated ice shelf thicknesses and BEDMAP-compiled data indicate a thin-ice bias of 41.2+/-71.3m for the ASAID ice thicknesses. The relationship between the seaward offset of the hydrostatic line from the grounded ice boundary only weakly matches a prediction based on beam theory. The mapped products along with the customized software to generate them and a variety of intermediate products are available from the National Snow and Ice Data Center.
Monitoring of the secondary drying in freeze-drying of pharmaceuticals.
Fissore, Davide; Pisano, Roberto; Barresi, Antonello A
2011-02-01
This paper is focused on the in-line monitoring of the secondary drying phase of a lyophilization process. An innovative software sensor is presented to estimate reliably the residual moisture in the product and the time required to complete secondary drying, that is, to reach the target value of the residual moisture or of the desorption rate. Such results are obtained by coupling a mathematical model of the process with the in-line measurement of the solvent desorption rate, by means of the pressure rise test or other sensors (e.g., windmills, laser sensors) that can measure the vapor flux in the drying chamber. The proposed method does not require extracting any vial during the operation or using expensive sensors to measure the residual moisture off-line. Moreover, it does not require any preliminary experiment to determine the relationship between the desorption rate and the residual moisture in the product. The effectiveness of the proposed approach is demonstrated by means of experiments carried out in a pilot-scale apparatus: in this case, some vials were extracted from the drying chamber and the moisture content was measured to validate the estimations provided by the soft sensor. Copyright © 2010 Wiley-Liss, Inc.
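The soft-sensor idea, tracking residual moisture by integrating the measured desorption rate and predicting when the target is reached, can be sketched as follows; all numbers are invented for the example and do not reflect the authors' model or data.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# Minimal sketch: integrate the measured desorption rate to track residual
# moisture during secondary drying (all values are illustrative placeholders).
t_hours = np.linspace(0.0, 8.0, 33)                  # sampling times
desorption_rate = 1.2 * np.exp(-0.25 * t_hours)      # % moisture removed per hour ("measured")
initial_moisture = 5.0                                # % w/w at start of secondary drying
target_moisture = 1.0                                 # % w/w specification

# Residual moisture = initial moisture minus the running integral of the desorption rate.
removed = cumulative_trapezoid(desorption_rate, t_hours, initial=0.0)
moisture = initial_moisture - removed

# Predicted end point: first time the estimated moisture drops below the target.
below = np.where(moisture <= target_moisture)[0]
end_time = t_hours[below[0]] if below.size else None

print(f"estimated residual moisture at the last sample: {moisture[-1]:.2f}% w/w")
print(f"predicted time to reach the {target_moisture}% target: {end_time} h")
```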
The determination of measures of software reliability
NASA Technical Reports Server (NTRS)
Maxwell, F. D.; Corn, B. C.
1978-01-01
Measurement of software reliability was carried out during the development of data base software for a multi-sensor tracking system. The failure ratio and failure rate were found to be consistent measures. Trend lines could be established from these measurements that provide good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.
Simulating Terrestrial Gamma Ray Flashes due to cosmic ray shower electrons and positrons
NASA Astrophysics Data System (ADS)
Connell, Paul
2017-04-01
The University of Valencia has developed a software simulator, LEPTRACK, to simulate the relativistic runaway electron avalanches (RREA) that are presumed to be the cause of Terrestrial Gamma Ray Flashes and their powerful accompanying ionization/excitation flashes. We show here results of LEPTRACK simulations of RREA produced by the interaction of MeV-energy electrons/positrons and photons in cosmic ray showers traversing plausible electric field geometries expected in storm clouds. The input beams of MeV shower products were created using the CORSIKA software package from the Karlsruhe Institute of Technology. We present images, videos and plots showing the different ionization, excitation and gamma-ray photon density fields produced, along with their time and spatial profile evolution, which depend critically on where the line of shower particles intercepts the electric field geometry. We also show a new effect of incoming positrons in the shower, which make up a significant fraction of shower products, in particular their apparent "orbiting" within a high-altitude negative induced shielding charge layer, which has been conjectured to produce a signature microwave emission as well as a short-range 511 keV annihilation line. The interesting question posed is whether this conjectured positron emission can be observed and correlated with TGF orbital observations to show whether a TGF originates in the macro E-fields of storm clouds or the micro E-fields of lightning leaders, where this positron "orbiting" is not likely to occur.
Getting started on metrics - Jet Propulsion Laboratory productivity and quality
NASA Technical Reports Server (NTRS)
Bush, M. W.
1990-01-01
A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-07
... Products, Components Thereof, and Related Software; Notice of Institution of Investigation; Institution of... importation of certain GPS navigation products, components thereof, and related software by reason of... importation of certain GPS navigation products, components thereof, and related software that infringe one or...
Ergonomics and workplace design: application of Ergo-UAS System in Fiat Group Automobiles.
Vitello, M; Galante, L G; Capoccia, M; Caragnano, G
2012-01-01
In 2008, Fiat Group Automobiles introduced the Ergo-UAS system for the balancing of production lines and to detect ergonomic issues. The Ergo-UAS system integrates 2 specific methods: MTM-UAS for time measurement and EAWS as the ergonomic method to evaluate biomechanical effort for each workstation. Fiat is using a software system to manage time evaluation and ergonomic characterization of the production cycle (UAS), to perform line balancing, and to obtain allowance factors in all Italian car manufacturing plants. For new car models, starting from the New Panda, FGA has applied Ergo-UAS to workplace design from the earliest phase of product development. This means that workplace design is based on information about the new product, new layout, and new work organization, and is performed by a multidisciplinary team (Work Place Integration Team) focusing on several aspects of product and process: safety, quality and productivity. This allows ergonomic threats to be found and solved before the start of production, by means of a strict cooperation between product development, engineering and design, and manufacturing. Three examples of workstation design are presented in which the application of Ergo-UAS was determinant in finding initial excessive levels of biomechanical load and helped the process designers to improve the workstations and define limits of acceptability. Technical activities (on product or on process), or organizational changes, that have been implemented in order to solve the problems are presented. A comparison between the "before" and "new" ergonomic scores necessary to bring workstations into acceptable conditions was made.
The Software Line-up: What Reviewers Look for When Evaluating Software.
ERIC Educational Resources Information Center
ELECTRONIC Learning, 1982
1982-01-01
Contains a check list to aid teachers in evaluating software used in computer-assisted instruction on microcomputers. The evaluation form contains three sections: program description, program evaluation, and overall evaluation. A brief description of a software evaluation program in use at the Granite School District in Utah is included. (JJD)
Automated Antibody De Novo Sequencing and Its Utility in Biopharmaceutical Discovery
NASA Astrophysics Data System (ADS)
Sen, K. Ilker; Tang, Wilfred H.; Nayak, Shruti; Kil, Yong J.; Bern, Marshall; Ozoglu, Berk; Ueberheide, Beatrix; Davis, Darryl; Becker, Christopher
2017-05-01
Applications of antibody de novo sequencing in the biopharmaceutical industry range from the discovery of new antibody drug candidates to identifying reagents for research and determining the primary structure of innovator products for biosimilar development. When murine, phage display, or patient-derived monoclonal antibodies against a target of interest are available, but the cDNA or the original cell line is not, de novo protein sequencing is required to humanize and recombinantly express these antibodies, followed by in vitro and in vivo testing for functional validation. Availability of fully automated software tools for monoclonal antibody de novo sequencing enables efficient and routine analysis. Here, we present a novel method to automatically de novo sequence antibodies using mass spectrometry and the Supernovo software. The robustness of the algorithm is demonstrated through a series of stress tests.
Wireless Sensor Node for Autonomous Monitoring and Alerts in Remote Environments
NASA Technical Reports Server (NTRS)
Panangadan, Anand V. (Inventor); Monacos, Steve P. (Inventor)
2015-01-01
A method, apparatus, system, and computer program product provide personal alert and tracking capabilities using one or more nodes. Each node includes radio transceiver chips operating at different frequency ranges, a power amplifier, sensors, a display, and embedded software. The chips enable the node to operate as either a mobile sensor node or a relay base station node while providing a long-distance relay link between nodes. The power amplifier enables line-of-sight communication between the one or more nodes. The sensors provide a GPS signal, temperature, and accelerometer information (used to trigger an alert condition). The embedded software captures and processes the sensor information, provides a multi-hop packet routing protocol to relay the sensor information to, and receive alert information from, a command center, and displays the alert information on the display.
Community Tools for Cartographic and Photogrammetric Processing of Mars Express HRSC Images
NASA Astrophysics Data System (ADS)
Kirk, R. L.; Howington-Kraus, E.; Edmundson, K.; Redding, B.; Galuszka, D.; Hare, T.; Gwinner, K.
2017-07-01
The High Resolution Stereo Camera (HRSC) on the Mars Express orbiter (Neukum et al. 2004) is a multi-line pushbroom scanner that can obtain stereo and color coverage of targets in a single overpass, with pixel scales as small as 10 m at periapsis. Since commencing operations in 2004 it has imaged 77 % of Mars at 20 m/pixel or better. The instrument team uses the Video Image Communication And Retrieval (VICAR) software to produce and archive a range of data products from uncalibrated and radiometrically calibrated images to controlled digital topographic models (DTMs) and orthoimages and regional mosaics of DTM and orthophoto data (Gwinner et al. 2009; 2010b; 2016). Alternatives to this highly effective standard processing pipeline are nevertheless of interest to researchers who do not have access to the full VICAR suite and may wish to make topographic products or perform other (e. g., spectrophotometric) analyses prior to the release of the highest level products. We have therefore developed software to ingest HRSC images and model their geometry in the USGS Integrated Software for Imagers and Spectrometers (ISIS3), which can be used for data preparation, geodetic control, and analysis, and the commercial photogrammetric software SOCET SET (® BAE Systems; Miller and Walker 1993; 1995) which can be used for independent production of DTMs and orthoimages. The initial implementation of this capability utilized the then-current ISIS2 system and the generic pushbroom sensor model of SOCET SET, and was described in the DTM comparison of independent photogrammetric processing by different elements of the HRSC team (Heipke et al. 2007). A major drawback of this prototype was that neither software system then allowed for pushbroom images in which the exposure time changes from line to line. Except at periapsis, HRSC makes such timing changes every few hundred lines to accommodate changes of altitude and velocity in its elliptical orbit. As a result, it was necessary to split observations into blocks of constant exposure time, greatly increasing the effort needed to control the images and collect DTMs. Here, we describe a substantially improved HRSC processing capability that incorporates sensor models with varying line timing in the current ISIS3 system (Sides 2017) and SOCET SET. This enormously reduces the work effort for processing most images and eliminates the artifacts that arose from segmenting them. In addition, the software takes advantage of the continuously evolving capabilities of ISIS3 and the improved image matching module NGATE (Next Generation Automatic Terrain Extraction, incorporating area and feature based algorithms, multi-image and multi-direction matching) of SOCET SET, thus greatly reducing the need for manual editing of DTM errors. We have also developed a procedure for geodetically controlling the images to Mars Orbiter Laser Altimeter (MOLA) data by registering a preliminary stereo topographic model to MOLA by using the point cloud alignment (pc_align) function of the NASA Ames Stereo Pipeline (ASP; Moratto et al. 2010). This effectively converts inter-image tiepoints into ground control points in the MOLA coordinate system. The result is improved absolute accuracy and a significant reduction in work effort relative to manual measurement of ground control. The ISIS and ASP software used are freely available; SOCET SET, is a commercial product. 
By the end of 2017 we expect to have ported our SOCET SET HRSC sensor model to the Community Sensor Model (CSM; Community Sensor Model Working Group 2010; Hare and Kirk 2017) standard utilized by the successor photogrammetric system SOCET GXP that is currently offered by BAE. In early 2018, we also plan to work with BAE to release the CSM source code under a BSD or MIT open source license. We illustrate current HRSC processing capabilities with three examples, of which the first two come from the DTM comparison of 2007. Candor Chasma (h1235_0001) was a near-periapse observation with constant exposure time that could be processed relatively easily at that time. We show qualitative and quantitative improvements in DTM resolution and precision as well as greatly reduced need for manual editing, and illustrate some of the photometric applications possible in ISIS. At the Nanedi Valles site we are now able to process all 3 long-arc orbits (h0894_0000, h0905_0000 and h0927_0000) without segmenting the images. Finally, processing image set h4235_0001, which covers the landing site of the Mars Science Laboratory (MSL) rover and its rugged science target of Aeolus Mons in Gale crater, provides a rare opportunity to evaluate DTM resolution and precision because extensive High Resolution Imaging Science Experiment (HiRISE) DTMs are available (Golombek et al. 2012). The HiRISE products have 50x smaller pixel scale so that discrepancies can mostly be attributed to HRSC. We use the HiRISE DTMs to compare the resolution and precision of our HRSC DTMs with the (evolving) standard products. We find that the vertical precision of HRSC DTMs is comparable to the pixel scale but the horizontal resolution may be 15-30 image pixels, depending on processing. This is significantly coarser than the lower limit of 3-5 pixels based on the minimum size for image patches to be matched. Stereo DTMs registered to MOLA altimetry by surface fitting typically deviate by 10 m or less in mean elevation. Estimates of the RMS deviation are strongly influenced by the sparse sampling of the altimetry, but range from
Methodology for Software Reliability Prediction. Volume 1.
1987-11-01
...software reliability. A Software Reliability Measurement Framework was established which spans the life cycle of a software system and includes the...specification, prediction, estimation, and assessment of software reliability. Data from 59 systems, representing over 5 million lines of code, were
Reflections of Computing Experiences in a Steel Factory in the Early 1960s
NASA Astrophysics Data System (ADS)
Järvinen, Pertti
We can best see many things from a historical perspective. What were the first pioneers doing in the information technology departments of Finnish manufacturing companies? In the early 1960s, I had a special chance to work in a steel company that had a long tradition of using rather advanced tools and methods to improve its productivity. The first computer in our company had such novel properties as removable disk packs, making direct access to stored data possible. In this paper, we describe the following issues and innovations in some depth. These include (a) transitioning from punched card machines to a new computer era, (b) using an advanced programming language to speed up the production of new computer software, (c) drawing pictures by using a line printer, (d) supporting steel making with mathematical software, (e) storing executable programs in disk memory and calling and moving them from there to core memory for execution, and (f) building a simple report generator. I will also pay attention to the breakthroughs in these innovations and in this way demonstrate how some computing solutions were growing at that time.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
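To make the cause-effect modeling idea above concrete, the following is a minimal sketch of discrete Bayesian-network inference by enumeration. The node names, probability values, and scoring rule are illustrative placeholders, not the network published in the paper.

```python
# Minimal discrete Bayesian-network sketch (illustrative, not the authors' model).
# Nodes: team skill (S), process maturity (M), problem complexity (C) -> suitability (Q).
# All probabilities below are made up for demonstration.
from itertools import product

P_S = {"high": 0.6, "low": 0.4}   # P(skill)
P_M = {"high": 0.5, "low": 0.5}   # P(maturity)
P_C = {"high": 0.3, "low": 0.7}   # P(complexity)

def p_q_given(s, m, c):
    """P(suitability = 'good' | skill, maturity, complexity); illustrative numbers."""
    base = 0.5
    base += 0.2 if s == "high" else -0.2
    base += 0.15 if m == "high" else -0.15
    base -= 0.2 if c == "high" else -0.05
    return min(max(base, 0.05), 0.95)

def posterior_quality(evidence):
    """P(Q = 'good' | evidence) by brute-force enumeration over the hidden variables."""
    num = den = 0.0
    for s, m, c in product(P_S, P_M, P_C):
        assign = {"S": s, "M": m, "C": c}
        if any(assign[k] != v for k, v in evidence.items()):
            continue
        joint = P_S[s] * P_M[m] * P_C[c]
        num += joint * p_q_given(s, m, c)
        den += joint
    return num / den

# Example query: skilled team, immature process.
print(posterior_quality({"S": "high", "M": "low"}))
```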
How the NWC handles software as product
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vinson, D.
1997-11-01
This tutorial provides a hands-on view of how the Nuclear Weapons Complex project should be handling (or planning to handle) software as a product in response to Engineering Procedure 401099. The SQAS has published the document SQAS96-002, Guidelines for NWC Processes for Handling Software Product, that will be the basis for the tutorial. The primary scope of the tutorial is on software products that result from weapons and weapons-related projects, although the information presented is applicable to many software projects. Processes that involve the exchange, review, or evaluation of software product between or among NWC sites, DOE, and external customers will be described.
Cost Estimation of Software Development and the Implications for the Program Manager
1992-06-01
Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome
Complexity Measure for the Prototype System Description Language (PSDL)
2002-06-01
Albrecht, A. and Gaffney, J., Software Function, Source Lines of Code and Development Effort Prediction, IEEE Transactions on Software Engineering...Through Measurement”; Proceedings of the IEEE, Vol. 77, No. 4, April 1989. Schach, Stephen R., Software Engineering, Second Edition, IRWIN, Burr Ridge
NASA Technical Reports Server (NTRS)
1983-01-01
Kennedy Space Center's primary institutional computer is a 4-megabyte IBM 4341 with 3.175 billion characters of IBM 3350 disc storage. This system utilizes the Software AG product known as ADABAS with the on-line user-oriented features of NATURAL and COMPLETE as a Data Base Management System (DBMS). It is operational under OS/VS1 and is currently supporting batch/on-line applications such as Personnel, Training, Physical Space Management, Procurement, Office Equipment Maintenance, and Equipment Visibility. A third and by far the largest DBMS application is known as the Shuttle Inventory Management System (SIMS), which is operational on a Honeywell 6660 (dedicated) computer system utilizing Honeywell Integrated Data Storage I (IDSI) as the DBMS. The SIMS application is designed to provide central supply system acquisition, inventory control, receipt, storage, and issue of spares, supplies, and materials.
Development of the TFTR neutral beam injection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prichard, Jr., B. A.
1977-01-01
The TFTR Neutral Beam Lines are designed to inject 20 MW of 120 keV neutral deuterium atoms into the plasma. This is accomplished using 12 sources, 65 amperes each, mounted in 4 beam lines. The 120 kV sources and a prototype beam line are being developed. The implementation of these beam lines has required the development of several associated pieces of hardware. 200 kV switch tubes for the power supplies are being developed for modulation and regulation of the accelerating supplies. A 90 cm metallic seal gate valve capable of sealing against atmosphere in either direction is being developed for separating the torus and beam line vacuum systems. A 70 x 80 cm fast shutter valve is also being developed to limit tritium migration from the torus into the beam line. Internal to the beam line a calorimeter, ion dump and deflection magnet have been designed to handle three beams, and optical diagnostics utilizing the doppler broadening and doppler shift of light emitted from the accelerated beam are being developed. The control and monitoring of the 12 sources will be done via the TFTR computer control system (CICADA) as will other parts of the machine, and software is being developed to condition and operate the sources automatically. The prototype beam line is scheduled to begin operation in the fall of 1978 and all four production beam lines on TFTR in 1982.
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
Automated, Certified Program-rewriting for Software Security Enforcement
2012-03-05
(VLC), pages 257-260, Oak Brook, Illinois, October 2010. [14] Aditi A. Patwardhan. Security-aware program visualization for analyzing in-lined...January 2010. [17] Meera Sridhar and Kevin W. Hamlen. Flexible in-lined reference monitor certification: Challenges and future directions. In...pages 55-60, Austin, Texas, January 2011. [18] Bhavani Thuraisingham and Kevin W. Hamlen. Challenges and future directions of software technology
Leak detection by mass balance effective for Norman Wells line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liou, J.C.P.
Mass-balance calculations for leak detection have been shown to be as effective as a leading software system, in a comparison based on a major Canadian crude-oil pipeline. The calculations and NovaCorp's Leakstop software each detected leaks of approximately 4% or greater on Interprovincial Pipe Line (IPL) Inc.'s Norman Wells pipeline. Insufficient data exist to assess the performance of the two methods for leaks smaller than 4%. Pipeline leak detection using such software-based systems is common. Their effectiveness is measured by how small and how quickly a leak can be detected. Algorithms used and measurement uncertainties determine leak detectability.
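As an illustration of the mass-balance principle described above (not the Leakstop algorithm or IPL's actual thresholds), a minimal sketch might flag a leak whenever the flow imbalance, corrected for line-pack change, exceeds a fraction of nominal flow tied to measurement uncertainty.

```python
# Illustrative mass-balance leak check; the threshold fraction echoes the roughly 4%
# detectability reported above, and all numbers in the example are hypothetical.
def leak_alarm(flow_in, flow_out, linepack_change, nominal_flow,
               threshold_fraction=0.04):
    """flow_in / flow_out: averaged volumetric flows over the balance window (m^3/h);
    linepack_change: change in stored line volume over the window, expressed as a
    rate (m^3/h); threshold_fraction: smallest detectable leak as a fraction of
    nominal flow."""
    imbalance = flow_in - flow_out - linepack_change
    return imbalance > threshold_fraction * nominal_flow

# Example: 2000 m^3/h in, 1900 m^3/h out, 10 m^3/h of line-pack accumulation.
print(leak_alarm(2000.0, 1900.0, 10.0, nominal_flow=2000.0))  # True: ~4.5% imbalance
```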
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-17
..., Components Thereof, Associated Software, and Products Containing the Same; Notice of Investigation AGENCY: U... scanning devices, components thereof, associated software, and products containing the same by reason of... after importation of certain biometric scanning devices, components thereof, associated software, or...
Information models of software productivity - Limits on productivity growth
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1992-01-01
Research into generalized information-metric models of software process productivity establishes quantifiable behavior and theoretical bounds. The models establish a fundamental mathematical relationship between software productivity and the human capacity for information traffic, the software product yield (system size), information efficiency, and tool and process efficiencies. An upper bound is derived that quantifies average software productivity and the maximum rate at which it may grow. This bound reveals that ultimately, when tools, methodologies, and automated assistants have reached their maximum effective state, further improvement in productivity can only be achieved through increasing software reuse. The reuse advantage is shown not to increase faster than logarithmically in the number of reusable features available. The reuse bound is further shown to be somewhat dependent on the reuse policy: a general 'reuse everything' policy can lead to a somewhat slower productivity growth than a specialized reuse policy.
Hu, E; Liao, T. W.; Tiersch, T. R.
2013-01-01
Emerging commercial-level technology for aquatic sperm cryopreservation has not been modeled by computer simulation. Commercially available software (ARENA, Rockwell Automation, Inc. Milwaukee, WI) was applied to simulate high-throughput sperm cryopreservation of blue catfish (Ictalurus furcatus) based on existing processing capabilities. The goal was to develop a simulation model suitable for production planning and decision making. The objectives were to: 1) predict the maximum output for 8-hr workday; 2) analyze the bottlenecks within the process, and 3) estimate operational costs when run for daily maximum output. High-throughput cryopreservation was divided into six major steps modeled with time, resources and logic structures. The modeled production processed 18 fish and produced 1164 ± 33 (mean ± SD) 0.5-ml straws containing one billion cryopreserved sperm. Two such production lines could support all hybrid catfish production in the US and 15 such lines could support the entire channel catfish industry if it were to adopt artificial spawning techniques. Evaluations were made to improve efficiency, such as increasing scale, optimizing resources, and eliminating underutilized equipment. This model can serve as a template for other aquatic species and assist decision making in industrial application of aquatic germplasm in aquaculture, stock enhancement, conservation, and biomedical model fishes. PMID:25580079
Crowell, Kevin L; Slysz, Gordon W; Baker, Erin S; LaMarche, Brian L; Monroe, Matthew E; Ibrahim, Yehia M; Payne, Samuel H; Anderson, Gordon A; Smith, Richard D
2013-11-01
The addition of ion mobility spectrometry to liquid chromatography-mass spectrometry experiments requires new, or updated, software tools to facilitate data processing. We introduce a command-line software application, LC-IMS-MS Feature Finder, that searches for molecular ion signatures in multidimensional liquid chromatography-ion mobility spectrometry-mass spectrometry (LC-IMS-MS) data by clustering deisotoped peaks with similar monoisotopic mass, charge state, LC elution time and ion mobility drift time values. The software application includes an algorithm for detecting and quantifying co-eluting chemical species, including species that exist in multiple conformations that may have been separated in the IMS dimension. LC-IMS-MS Feature Finder is available as a command-line tool for download at http://omics.pnl.gov/software/LC-IMS-MS_Feature_Finder.php. The Microsoft .NET Framework 4.0 is required to run the software. All other dependencies are included with the software package. Usage of this software is limited to non-profit research use (see README). rds@pnnl.gov. Supplementary data are available at Bioinformatics online.
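A minimal sketch of the grouping idea just described, not the actual Feature Finder implementation, is shown below; tolerance values and the peak record layout are illustrative.

```python
# Cluster deisotoped peaks that agree in charge state and fall within tolerances on
# monoisotopic mass (ppm), LC elution time and IMS drift time (arbitrary units here).
def cluster_peaks(peaks, ppm_tol=20.0, lc_tol=0.5, drift_tol=0.3):
    """peaks: list of dicts with keys 'mass', 'charge', 'lc_time', 'drift_time'.
    Returns a list of clusters (each cluster is a list of peak dicts)."""
    clusters = []
    for p in sorted(peaks, key=lambda x: x["mass"]):
        for cluster in clusters:
            ref = cluster[0]
            same_feature = (
                p["charge"] == ref["charge"]
                and abs(p["mass"] - ref["mass"]) / ref["mass"] * 1e6 <= ppm_tol
                and abs(p["lc_time"] - ref["lc_time"]) <= lc_tol
                and abs(p["drift_time"] - ref["drift_time"]) <= drift_tol
            )
            if same_feature:
                cluster.append(p)
                break
        else:
            clusters.append([p])
    return clusters

example = [{"mass": 1000.00, "charge": 2, "lc_time": 12.1, "drift_time": 5.0},
           {"mass": 1000.01, "charge": 2, "lc_time": 12.2, "drift_time": 5.1},
           {"mass": 1500.00, "charge": 3, "lc_time": 20.0, "drift_time": 7.4}]
print(len(cluster_peaks(example)))  # 2 clusters
```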
NASA Astrophysics Data System (ADS)
Zemánek, Ivan; Havlíček, Václav
2006-09-01
A new universal control and measuring system for single/on-line strip testing of classic and amorphous soft magnetic materials has been developed at the Czech Technical University in Prague. The measuring system allows measurement of the magnetization characteristic and specific power losses of different tested materials (strips) under AC magnetization with an arbitrary magnetic flux density waveform over a wide frequency range of 20 Hz-20 kHz. The measuring system can be used both for single strip testing in laboratories and for on-line strip testing during the production process. The measuring system is controlled by a two-stage master-slave control system consisting of an external PC (master), completed by three special A/D measuring plug-in boards, and a local executing control unit (slave) with a single-chip 8051 microprocessor, connected to the PC by an RS-232 serial line. The user-friendly, powerful control software implemented on the PC and the effective program code for the microprocessor enable fully automatic measurement with high measuring capability and high measuring accuracy.
Strategies for a Creative Future with Computer Science, Quality Design and Communicability
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Villarreal, Maria
The current work presents the importance of the two-way relationship among the triad of computer science, design and communicability. It is demonstrated that the quality principles of software engineering are not universal, since they are disappearing from university training. Besides, a short analysis of the term "creativity" makes apparent the existence of plagiarism as a human factor that damages the future of communicability applied to the on-line and off-line contents of open software. A set of measures and guidelines is presented so that the triad works correctly again in the coming years, to foster the qualitative design of interactive systems on-line and/or off-line.
Technical design and system implementation of region-line primitive association framework
NASA Astrophysics Data System (ADS)
Wang, Min; Xing, Jinjin; Wang, Jie; Lv, Guonian
2017-08-01
Apart from regions, image edge lines are an important information source, and they deserve more attention in object-based image analysis (OBIA) than they currently receive. In the region-line primitive association framework (RLPAF), we promote straight-edge lines to line primitives to achieve powerful OBIA. Along with regions, straight lines become basic units for subsequent extraction and analysis of OBIA features. This study develops a new software system called remote-sensing knowledge finder (RSFinder) to implement RLPAF for engineering application purposes. This paper introduces the extended technical framework, a comprehensively designed feature set, key technology, and software implementation. To our knowledge, RSFinder is the world's first OBIA system based on two types of primitives, namely, regions and lines. It is fundamentally different from other well-known region-only-based OBIA systems, such as eCognition and the ENVI feature extraction module. This paper provides an important reference for the development of similarly structured OBIA systems and line-involved remote sensing information extraction algorithms.
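The abstract does not detail RSFinder's own line-extraction algorithm; as an assumed stand-in for illustration only, straight-edge line primitives can be pulled from an image with OpenCV's Canny edge detector and probabilistic Hough transform, as in this sketch.

```python
# Illustrative extraction of straight-line primitives (an assumption, not RSFinder's
# algorithm) using OpenCV. Requires the opencv-python package.
import cv2
import numpy as np

def extract_line_primitives(gray_image):
    """gray_image: 8-bit single-channel image (numpy uint8 array).
    Returns an (N, 4) array of line segments as (x1, y1, x2, y2), or None."""
    edges = cv2.Canny(gray_image, 50, 150)              # binary edge map
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=60,
                               minLineLength=30, maxLineGap=5)
    return None if segments is None else segments.reshape(-1, 4)

# Example on a synthetic image containing one bright diagonal line.
img = np.zeros((100, 100), dtype=np.uint8)
cv2.line(img, (10, 10), (90, 90), 255, 2)
print(extract_line_primitives(img))
```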
MER Surface Phase; Blurring the Line Between Fault Protection and What is Supposed to Happen
NASA Technical Reports Server (NTRS)
Reeves, Glenn E.
2008-01-01
An assessment of the limitations of communication with the MER rovers and how such constraints drove the system design, flight software and fault protection architecture, blurring the line between traditional fault protection and expected nominal behavior, and requiring the most novel autonomous and semi-autonomous elements of the vehicle software, including communication, surface mobility, attitude knowledge acquisition, fault protection, and the activity arbitration service.
NASA Astrophysics Data System (ADS)
Gibson, Justus L.; Stencel, Robert E.; Ketzeback, William; Barentine, John; Coughlin, Jeffrey; Leadbeater, Robin; Saurage, Gabrelle
2018-06-01
Worldwide interest in the recent eclipse of epsilon Aurigae resulted in the generation of several extensive data sets, including high resolution spectroscopic monitoring. This led to the discovery, among other things, of the existence of a mass transfer stream, seen notably during third contact. We explored spectroscopic facets of the mass transfer stream during third contact, using high resolution spectra obtained with the ARCES and TripleSpec instruments at Apache Point Observatory. One hundred and sixteen epochs of data were obtained between 2009 and 2012, and equivalent widths and line velocities measured for high versus low eccentricity accretion disk lines. These datasets also enable greater detail to be measured of the mid-eclipse enhancement of the He I 10830 Å line, and the discovery of the P Cygni shape of the Pa-β line at third contact. We found evidence of higher speed material, associated with the mass transfer stream, persisting between third and fourth eclipse contacts. We visualized the disk and stream interaction using SHAPE software, and used CLOUDY software to estimate that the source of the enhanced He I 10830 Å absorption arises from a region with nH = 10^11 cm^-3 and a temperature of 20,000 K, consistent with a mid-B type central star. Van Rensbergen binary star evolutionary models are somewhat consistent with the current binary parameters for their case of a 9 plus 8 solar mass initial binary, evolving into a 2.3 and 14.11 solar mass end product after 35 Myr. With these results, it is possible to make predictions which suggest that continued monitoring prior to the next eclipse (2036) will help resolve standing questions about the mass and age of this binary.
NASA Astrophysics Data System (ADS)
Shen, Tzu-Chiang; Ovando, Nicolás.; Bartsch, Marcelo; Simmond, Max; Vélez, Gastón; Robles, Manuel; Soto, Rubén.; Ibsen, Jorge; Saldias, Christian
2012-09-01
ALMA is the first astronomical project being constructed and operated under an industrial approach, due to the huge number of elements involved. In order to achieve the maximum throughput during the engineering and scientific commissioning phase, several production lines have been established to work in parallel. This decision required modification of the original system architecture, in which all the elements are controlled and operated within a unique Standard Test Environment (STE). Advances in the network industry, together with the maturity of the virtualization paradigm, allow us to provide a solution which can replicate the STE infrastructure without changing its network address definition. This is only possible with Virtual Routing and Forwarding (VRF) and Virtual LAN (VLAN) concepts. The solution allows dynamic reconfiguration of antennas and other hardware across the production lines with minimum time and zero human intervention in the cabling. We also push the virtualization even further: classical rack-mount servers are being replaced and consolidated by blade servers. On top of them, virtualized servers are centrally administered with VMware ESX. Hardware costs and system administration effort will be reduced considerably. This mechanism has been established and operated successfully during the last two years. This experience gave us confidence to propose a solution to divide the main operation array into subarrays using the same concept, which will introduce huge flexibility and efficiency for ALMA operation and may eventually simplify the complexity of the ALMA core observing software, since there will be no need to deal with subarray complexity at the software level.
NASA Astrophysics Data System (ADS)
Yang, Yunlei; Hou, Muzhou; Luo, Jianshu; Liu, Taohua
2018-06-01
With the increasing demands for vast amounts of data and high-speed signal transmission, the use of multi-conductor transmission lines is becoming more common. The impact of transmission lines on signal transmission is thus a key issue affecting the performance of high-speed digital systems. To solve the lossless two-conductor transmission line equations (LTTLEs), a neural network model and algorithm are explored in this paper. By selecting the product of two triangular basis functions as the activation function of the hidden layer neurons, we can guarantee the separation of time and space as well as phase orthogonality. By adding the initial condition to the neural network, an improved extreme learning machine (IELM) algorithm for solving the network weights is obtained. This differs from the traditional method of converting the initial condition into an iterative constraint condition. Calculation software for solving the LTTLEs based on the IELM algorithm is developed. Numerical experiments show that the results are consistent with those of the traditional method. The proposed neural network algorithm can find the terminal voltage of the transmission line and also the voltage at any observation point. It is possible to calculate the value at any given point by using the neural network model to solve the transmission line equation.
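The core extreme-learning-machine step can be sketched as follows: build a hidden layer from products of triangular (hat) basis functions in space and time, then solve for the output weights by linear least squares. This is only an illustration of the general scheme under assumed parameters, not the paper's IELM with its initial-condition treatment.

```python
# Sketch of an ELM with "product of two triangular basis functions" hidden units.
import numpy as np

def tri(u):
    """Triangular (hat) basis: 1 - |u| on [-1, 1], zero outside."""
    return np.maximum(0.0, 1.0 - np.abs(u))

def hidden_matrix(x, t, centers_x, centers_t, width_x, width_t):
    """Each hidden unit is tri((x - cx)/wx) * tri((t - ct)/wt)."""
    cols = [tri((x - cx) / width_x) * tri((t - ct) / width_t)
            for cx, ct in zip(centers_x, centers_t)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 200)                                # positions along the line
t = rng.uniform(0, 1, 200)                                # observation times
target = np.sin(np.pi * x) * np.cos(2 * np.pi * t)        # stand-in for the voltage field
cx, ct = rng.uniform(0, 1, 40), rng.uniform(0, 1, 40)     # random hidden-unit centers
H = hidden_matrix(x, t, cx, ct, width_x=0.2, width_t=0.2)
beta, *_ = np.linalg.lstsq(H, target, rcond=None)         # the ELM least-squares step
print("fit residual norm:", np.linalg.norm(H @ beta - target))
```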
Benazzi, F; Gernaey, K V; Jeppsson, U; Katebi, R
2007-08-01
In this paper, a new approach for on-line monitoring and detection of abnormal readily biodegradable substrate (S(s)) and slowly biodegradable substrate (X(s)) concentrations, for example due to input of toxic loads from the sewer, or due to influent substrate shock load, is proposed. Considering that measurements of S(s) and X(s) concentrations are not available in real wastewater treatment plants, the S(s) / X(s) software sensor can activate an alarm with a response time of about 60 and 90 minutes, respectively, based on the dissolved oxygen measurement. The software sensor implementation is based on an extended Kalman filter observer and disturbances are modelled using fast Fourier transform and spectrum analyses. Three case studies are described. The first one illustrates the fast and accurate convergence of the extended Kalman filter algorithm, which is achieved in less than 2 hours. Furthermore, the difficulties of estimating X(s) when off-line analysis is not available are depicted, and the S(s) / X(s) software sensor performances when no measurements of S(s) and X(s) are available are illustrated. Estimation problems related to the death-regeneration concept of the activated sludge model no.1 and possible application of the software sensor in wastewater monitoring are discussed.
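For reference, the skeleton of a discrete-time extended Kalman filter of the kind such a software sensor is built on is sketched below; the actual plant model (ASM1 dynamics for S(s) and X(s) driven by dissolved-oxygen measurements) and the disturbance modelling are not reproduced here.

```python
# Generic discrete EKF predict/update cycle (standard equations, illustrative only).
import numpy as np

def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
    """x, P: state estimate and covariance; u: known input; z: measurement;
    f, h: process and measurement models; F_jac, H_jac: their Jacobians at the
    current estimate; Q, R: process and measurement noise covariances."""
    # Predict
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q
    # Update
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```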
The Elements of an Effective Software Development Plan - Software Development Process Guidebook
2011-11-11
standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored...Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection...final selection and submit to change board for approval MAINTENANCE Monitor current products for obsolescence or end of support Track new
Solution In-Line Alpha Counter (SILAC) Instruction Manual-Version 4.00
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steven M. Alferink; Joel E. Farnham; Malcolm M. Fowler
2002-06-01
The Solution In-Line Alpha Counter (SILAC) provides near real-time alpha activity measurements of aqueous solutions in gloveboxes located in the Plutonium Facility (TA-55) at Los Alamos National Laboratory (LANL). The SILAC detector and its interface software were first developed by Joel Farnham at LANL [1]. This instruction manual describes the features of the SILAC interface software and contains the schematic and fabrication instructions for the detector.
Development of online NIR urine analyzing system based on AOTF
NASA Astrophysics Data System (ADS)
Wan, Feng; Sun, Zhendong; Li, Xiaoxia
2006-09-01
In this paper, some key techniques in the development of an on-line NIR urine analyzing system based on AOTF (Acousto-Optic Tunable Filter) are introduced. Problems in designing the optical system, including collimation of the incident light and the working distance (the shortest distance for separating the incident light and diffracted light), are analyzed and researched. A DDS (Direct Digital Synthesizer) controlled by a microprocessor is used to realize the wavelength scan. The experimental results show that this NIR urine analyzing system based on AOTF has a 10000-4000 cm-1 wavelength range and a 0.3 ms wavelength switching time. Compared with the conventional Fourier Transform NIR spectrophotometer for analyzing multiple components in urine, this system features low cost, small volume and an on-line measurement function. Unscrambler software (multivariate statistical software by CAMO Inc., Norway) is selected as the software for processing the data. This system can realize on-line quantitative analysis of protein, urea and creatinine in urine.
The GRIDView Visualization Package
NASA Astrophysics Data System (ADS)
Kent, B. R.
2011-07-01
Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.
Data-Driven Decision Making as a Tool to Improve Software Development Productivity
ERIC Educational Resources Information Center
Brown, Mary Erin
2013-01-01
The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
TreeRipper web application: towards a fully automated optical tree recognition software.
Hughes, Joseph
2011-05-20
Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners and detect line edges. The edge contour is then determined to detect the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Despite the diversity of ways in which phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.
Space and Missile Systems Center Standard: Software Development
2015-01-16
maintenance, or any other activity or combination of activities resulting in products. Within this standard, requirements to “develop,” “define...integration, reuse, reengineering, maintenance, or any other activity that results in products). The term “developer” encompasses all software team...activities that result in software products. Software development includes new development, modification, reuse, reengineering, maintenance, and any other
2014-08-01
technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example...that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had
Software package for performing experiments about the convolutionally encoded Voyager 1 link
NASA Technical Reports Server (NTRS)
Cheng, U.
1989-01-01
A software package enabling engineers to conduct experiments to determine the actual performance of long constraint-length convolutional codes over the Voyager 1 communication link directly from the Jet Propulsion Laboratory (JPL) has been developed. Using this software, engineers are able to enter test data from the Laboratory in Pasadena, California. The software encodes the data and then sends the encoded data to a personal computer (PC) at the Goldstone Deep Space Complex (GDSC) over telephone lines. The encoded data are sent to the transmitter by the PC at GDSC. The received data, after being echoed back by Voyager 1, are first sent to the PC at GDSC, and then are sent back to the PC at the Laboratory over telephone lines for decoding and further analysis. All of these operations are fully integrated and are completely automatic. Engineers can control the entire software system from the Laboratory. The software encoder and the hardware decoder interface were developed for other applications, and have been modified appropriately for integration into the system so that their existence is transparent to the users. This software provides: (1) data entry facilities, (2) communication protocol for telephone links, (3) data displaying facilities, (4) integration with the software encoder and the hardware decoder, and (5) control functions.
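For orientation, convolutional encoding of the kind exercised by the package can be sketched as below. The generator taps shown (171/133 octal, constraint length 7) are a common NASA-standard choice given purely as an example; whether they match the long-constraint-length experimental code used here is not stated in the abstract.

```python
# Minimal rate-1/n convolutional encoder sketch (illustrative parameters).
def conv_encode(bits, generators, constraint_len):
    """bits: iterable of 0/1 message bits; generators: tap masks, one per output
    stream, given as integers of `constraint_len` bits; returns the encoded bits."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << constraint_len) - 1)
        for g in generators:
            # Each output bit is the parity of the tapped shift-register positions.
            out.append(bin(state & g).count("1") % 2)
    return out

# Rate-1/2, constraint length 7 example.
print(conv_encode([1, 0, 1, 1], generators=[0o171, 0o133], constraint_len=7))
```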
2011-01-01
normalized to parallel controls. Flow Cytometry and Confocal Microscopy Upon exposure to 10-ns EP, aliquots of the cellular suspension were added to a tube...Survival data was processed and plotted using GrapherH software (Golden Software, Golden, Colorado). Flow cytometry results were processed in C6 software...Accuri Cytometers, Inc., Ann Arbor, MI) and FCSExpress software (DeNovo Software, Los Angeles, CA). Final analysis and presentation of flow cytometry
Fourteen Years of R/qtl: Just Barely Sustainable
Broman, Karl W.
2014-01-01
R/qtl is an R package for mapping quantitative trait loci (genetic loci that contribute to variation in quantitative traits) in experimental crosses. Its development began in 2000. There have been 38 software releases since 2001. The latest release contains 35k lines of R code and 24k lines of C code, plus 15k lines of code for the documentation. Challenges in the development and maintenance of the software are discussed. A key to the success of R/qtl is that it remains a central tool for the chief developer's own research work, and so its maintenance is of selfish importance. PMID:25364504
Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry.
Villarrubia, J S; Tondare, V N; Vladár, A E
2016-01-01
The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples: mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced results for line height and width that were within close to 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
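One ingredient of such a virtual sample, a random rough profile with a prescribed power spectral density, can be generated with an inverse FFT and random phases, as in the sketch below. The PSD form, normalization convention, and parameters are illustrative assumptions, not the paper's.

```python
# Generate a zero-mean rough height profile whose spectral content follows psd(f).
import numpy as np

def rough_profile(n, dx, psd, seed=0):
    """n: number of samples; dx: sample spacing; psd(f): one-sided PSD function."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=dx)
    amplitude = np.sqrt(psd(freqs) * n / dx)        # amplitude spectrum from the PSD
    phases = rng.uniform(0.0, 2.0 * np.pi, len(freqs))
    spectrum = amplitude * np.exp(1j * phases)
    spectrum[0] = 0.0                               # remove the mean component
    profile = np.fft.irfft(spectrum, n=n)
    return profile - profile.mean()

# Example: a simple Lorentzian-like PSD (illustrative parameters).
psd = lambda f: 1.0 / (1.0 + (f / 0.05) ** 2)
z = rough_profile(n=1024, dx=1.0, psd=psd)
print("rms roughness:", z.std())
```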
Sustaining Software-Intensive Systems
2006-05-01
2.2 Multi-Service Operational Test and Evaluation 2.3 Stable Software Baseline...or equivalent document • completed Multi-Service Operational Test and Evaluation (MOT&E) for the potential production software package (or OT&E if...not multi-service) • stable software production baseline • complete and current software documentation • Authority to Operate (ATO) for an
Implementation of RS-485 Communication between PLC and PC of Distributed Control System Based on VB
NASA Astrophysics Data System (ADS)
Lian Zhang, Chuan; Da Huang, Zhi; Qing Zhou, Gui; Chong, Kil To
2015-05-01
This paper focuses on achieving RS-485 communication between programmable logic controllers (PLCs) and a PC based on Visual Basic 6.0 (VB6.0) on an experimental automatic production line. Mitsubishi FX2N PLCs and a PC are chosen as the slave stations and the master station, respectively. Monitoring software is developed using VB6.0 for data input/output, flow control and on-line parameter setting. As a result, all functions are fulfilled with robust performance. It is concluded from the results that one PC can monitor several PLCs using RS-485 communication.
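As a rough illustration of the master-station polling pattern (the paper's software is written in VB6; the sketch below uses Python with the pyserial package as an assumed stand-in), the PC opens the RS-485 adapter as a serial port and queries each slave in turn. The frame layout, station addresses, and serial settings shown are hypothetical placeholders, not the Mitsubishi FX2N protocol.

```python
import serial  # pip install pyserial

def poll_station(port, station_id, request_frame, timeout=1.0):
    """Send one request frame to a slave PLC over an RS-485/serial adapter and
    return whatever bytes arrive before the timeout (illustrative framing)."""
    with serial.Serial(port, baudrate=9600, bytesize=8,
                       parity=serial.PARITY_EVEN, stopbits=1,
                       timeout=timeout) as link:
        link.write(bytes([station_id]) + request_frame)
        return link.read(64)

# Example (hypothetical port and frame):
# reply = poll_station("COM3", 1, b"\x05RD_STATUS")
```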
Automatic Inspection In Car Industry : User Point-Of-View
NASA Astrophysics Data System (ADS)
Salesse, Robert
1986-11-01
Much equipment for automatic inspection with vision has been incorporated into the production lines of nearly all car manufacturers. RENAULT now has three years of experience with automated vision, and some rules have been established. Our most important contributions have been: - Examples of applications, some now operating, some waiting for integration into complete systems. - How to establish a good "request to quote"? - How to examine and compare suppliers' offers? What selection criteria should be used, and which important questions should be asked? - What can be expected from the new vision equipment, and what are the needs in hardware and software?
orthAgogue: an agile tool for the rapid prediction of orthology relations.
Ekseth, Ole Kristian; Kuiper, Martin; Mironov, Vladimir
2014-03-01
The comparison of genes and gene products across species depends on high-quality tools to determine the relationships between gene or protein sequences from various species. Although some excellent applications are available and widely used, their performance leaves room for improvement. We developed orthAgogue: a multithreaded C application for high-speed estimation of homology relations in massive datasets, operated via a flexible and easy command-line interface. The orthAgogue software is distributed under the GNU license. The source code and binaries compiled for Linux are available at https://code.google.com/p/orthagogue/.
NASA Astrophysics Data System (ADS)
Yang, Qi; Zou, Dehua; Zhang, Jianjun; Li, Hui; Chen, Jianping; Li, Jinliang
2017-05-01
Four transmission lines on the same tower are widely used because of their obvious economic and social benefits. However, such lines also place high demands on power supply reliability, so the choice of a reasonable maintenance mode is particularly important. In this paper, we derived the maintenance influence of the energized lines on the non-energized line, calculated and analyzed protection measures for a single non-energized circuit of a 500 kV double-circuit transmission line on the same tower with ATP software, and calculated the field intensity distribution at typical operating positions of the energized double-circuit transmission line with finite element software. The calculation shows that when using the outage maintenance method, hanging both grounding wires and a personal safety line can effectively reduce the current flowing through the operator's body. When using the live maintenance method, the field intensity at the operator's body reaches up to 383.69 kV/m, so the operator needs to wear shielding clothing with at least 43.08 dB shielding efficiency in order to meet the safety requirements.
The HERSCHEL/PACS early Data Products
NASA Astrophysics Data System (ADS)
Wieprecht, E.; Wetzstein, M.; Huygen, R.; Vandenbussche, B.; De Meester, W.
2006-07-01
ESA's Herschel Space Observatory, to be launched in 2007, is the first space observatory covering the full far-infrared and submillimeter wavelength range (60-670 microns). The Photodetector Array Camera & Spectrometer (PACS) is one of the three science instruments. It contains two Ge:Ga photoconductor arrays and two bolometer arrays to perform imaging line spectroscopy and imaging photometry in the 60-210 micron wavelength band. The Herschel ground segment (Herschel Common Science System - HCSS) is implemented using Java technology and written in a common effort by the Herschel Science Center and the three instrument teams. The PACS Common Software System (PCSS) is based on the HCSS and used for the online and offline analysis of PACS data. For telemetry bandwidth reasons, PACS science data are partially processed on board, compressed, cut into telemetry packets and transmitted to the ground. These steps are instrument mode dependent. We will present the software model which allows the discrete on-board processing steps to be reversed and the data to be evaluated. After decompression and reconstruction, the detector data and instrument status information are organized in two main PACS Products. The design of these Java classes considers the individual sampling rates, data formats, memory and performance optimization aspects, and comfortable user interfaces.
Reliability measurement during software development. [for a multisensor tracking system
NASA Technical Reports Server (NTRS)
Hecht, H.; Sturm, W. A.; Trattner, S.
1977-01-01
During the development of data base software for a multi-sensor tracking system, reliability was measured. The failure ratio and failure rate were found to be consistent measures. Trend lines were established from these measurements that provided good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
Code of Federal Regulations, 2012 CFR
2012-10-01
... cohesion. Component means an electronic element, device, or appliance (including hardware or software) that... and software version, is documented and maintained through the life-cycle of the products in use. Executive software means software common to all installations of a given electronic product. It generally is...
Code of Federal Regulations, 2013 CFR
2013-10-01
... cohesion. Component means an electronic element, device, or appliance (including hardware or software) that... and software version, is documented and maintained through the life-cycle of the products in use. Executive software means software common to all installations of a given electronic product. It generally is...
Code of Federal Regulations, 2014 CFR
2014-10-01
... cohesion. Component means an electronic element, device, or appliance (including hardware or software) that... and software version, is documented and maintained through the life-cycle of the products in use. Executive software means software common to all installations of a given electronic product. It generally is...
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development environment which automates the software engineering process. Progress under SAGA is described.
Malone, James; Brown, Andy; Lister, Allyson L; Ison, Jon; Hull, Duncan; Parkinson, Helen; Stevens, Robert
2014-01-01
Biomedical ontologists to date have concentrated on ontological descriptions of biomedical entities such as gene products and their attributes, phenotypes and so on. Recently, effort has diversified to descriptions of the laboratory investigations by which these entities were produced. However, much biological insight is gained from the analysis of the data produced from these investigations, and there is a lack of adequate descriptions of the wide range of software that are central to bioinformatics. We need to describe how data are analyzed for discovery, audit trails, provenance and reproducibility. The Software Ontology (SWO) is a description of software used to store, manage and analyze data. Input to the SWO has come from beyond the life sciences, but its main focus is the life sciences. We used agile techniques to gather input for the SWO and keep engagement with our users. The result is an ontology that meets the needs of a broad range of users by describing software, its information processing tasks, data inputs and outputs, data formats versions and so on. Recently, the SWO has incorporated EDAM, a vocabulary for describing data and related concepts in bioinformatics. The SWO is currently being used to describe software used in multiple biomedical applications. The SWO is another element of the biomedical ontology landscape that is necessary for the description of biomedical entities and how they were discovered. An ontology of software used to analyze data produced by investigations in the life sciences can be made in such a way that it covers the important features requested and prioritized by its users. The SWO thus fits into the landscape of biomedical ontologies and is produced using techniques designed to keep it in line with user's needs. The Software Ontology is available under an Apache 2.0 license at http://theswo.sourceforge.net/; the Software Ontology blog can be read at http://softwareontology.wordpress.com.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-810] Certain Navigation Products, Components Thereof, and Related Software; Determination Not To Review an Initial Determination Granting a... United States after importation of certain navigation products, components thereof, and related software...
UTM TCL2 Software Requirements
NASA Technical Reports Server (NTRS)
Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo
2017-01-01
The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher level UTM TCL 2 System Requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond visual line-of-sight, tracking operations, and operations flying over sparsely populated areas.
On-line content creation for photo products: understanding what the user wants
NASA Astrophysics Data System (ADS)
Fageth, Reiner
2015-03-01
This paper describes how videos can be implemented into printed photo books and greeting cards. We will show that, surprisingly or not, pictures from videos are used much like classical images to tell compelling stories. Videos can be taken with nearly every camera: digital point-and-shoot cameras, DSLRs as well as smartphones, and more and more with so-called action cameras mounted on sports devices. The implementation of videos, by generating QR codes and relevant pictures out of the video stream via a software implementation, was the content of last year's paper. This year we present first data about what content is displayed and how users represent their videos in printed products, e.g. CEWE PHOTOBOOKS and greeting cards. We report the share of the different video formats used.
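The QR-code step mentioned above can be illustrated with the Python qrcode package (an assumption for illustration; the production implementation is not described in the abstract, and the URL below is a placeholder).

```python
# Generate a QR code that links a printed page to an uploaded video (illustrative).
import qrcode  # pip install qrcode[pil]

video_url = "https://example.com/video/12345"   # hypothetical hosted-video URL
img = qrcode.make(video_url)                    # returns a PIL-compatible image
img.save("video_link_qr.png")                   # place this image in the page layout
```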
NASA Astrophysics Data System (ADS)
Zemek, Peter G.; Plowman, Steven V.
2010-04-01
Advances in hardware have miniaturized the emissions spectrometer and associated optics, rendering them easily deployed in the field. Such systems are also suitable for vehicle mounting, and can provide high quality data and concentration information in minutes. Advances in software have accompanied this hardware evolution, enabling the development of portable point-and-click OP-FTIR systems that weigh less than 16 lbs. These systems are ideal for first-responders, military, law enforcement, forensics, and screening applications using optical remote sensing (ORS) methodologies. With canned methods and interchangeable detectors, the new generation of OP-FTIR technology is coupled to the latest forward reference-type model software to provide point-and-click technology. These software models have been established for some time. However, refined user-friendly models that use active, passive, and solar occultation methodologies now allow the user to quickly field-screen and quantify plumes, fence-lines, and combustion incident scenarios in high-temporal-resolution. Synthetic background generation is now redundant as the models use highly accurate instrument line shape (ILS) convolutions and several other parameters, in conjunction with radiative transfer model databases to model a single calibration spectrum to collected sample spectra. Data retrievals are performed directly on single beam spectra using non-linear classical least squares (NLCLS). Typically, the Hitran line database is used to generate the initial calibration spectrum contained within the software.
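The linear classical-least-squares step that such retrievals build on can be sketched as below: the measured spectrum is fit as a weighted sum of reference spectra plus a baseline, and gas amounts are read off from the weights. The non-linear refinements (ILS convolution, temperature and path-length dependence) used by the actual software are not reproduced, and the reference bands in the example are synthetic.

```python
# Linear CLS retrieval sketch with numpy (illustrative only).
import numpy as np

def cls_retrieve(measured, references):
    """measured: absorbance spectrum (n_points,);
    references: matrix of unit-concentration reference spectra (n_points, n_gases).
    Returns fitted concentrations, with a constant-baseline term appended last."""
    design = np.column_stack([references, np.ones(len(measured))])
    coeffs, *_ = np.linalg.lstsq(design, measured, rcond=None)
    return coeffs

# Synthetic example with two made-up reference bands.
x = np.linspace(0.0, 1.0, 200)
ref = np.column_stack([np.exp(-((x - 0.3) / 0.05) ** 2),
                       np.exp(-((x - 0.7) / 0.08) ** 2)])
measured = 2.0 * ref[:, 0] + 0.5 * ref[:, 1] + 0.1
print(cls_retrieve(measured, ref))   # approximately [2.0, 0.5, 0.1]
```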
An Imaging And Graphics Workstation For Image Sequence Analysis
NASA Astrophysics Data System (ADS)
Mostafavi, Hassan
1990-01-01
This paper describes an application-specific engineering workstation designed and developed to analyze imagery sequences from a variety of sources. The system combines the software and hardware environment of the modern graphic-oriented workstations with the digital image acquisition, processing and display techniques. The objective is to achieve automation and high throughput for many data reduction tasks involving metric studies of image sequences. The applications of such an automated data reduction tool include analysis of the trajectory and attitude of aircraft, missile, stores and other flying objects in various flight regimes including launch and separation as well as regular flight maneuvers. The workstation can also be used in an on-line or off-line mode to study three-dimensional motion of aircraft models in simulated flight conditions such as wind tunnels. The system's key features are: 1) Acquisition and storage of image sequences by digitizing real-time video or frames from a film strip; 2) computer-controlled movie loop playback, slow motion and freeze frame display combined with digital image sharpening, noise reduction, contrast enhancement and interactive image magnification; 3) multiple leading edge tracking in addition to object centroids at up to 60 fields per second from both live input video or a stored image sequence; 4) automatic and manual field-of-view and spatial calibration; 5) image sequence data base generation and management, including the measurement data products; 6) off-line analysis software for trajectory plotting and statistical analysis; 7) model-based estimation and tracking of object attitude angles; and 8) interface to a variety of video players and film transport sub-systems.
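The centroid-measurement step mentioned above can be illustrated generically: threshold one frame and compute the intensity-weighted centroid of the bright pixels. This is a minimal sketch, not the workstation's implementation, and it handles a single object per frame.

```python
# Intensity-weighted centroid of thresholded pixels in one video frame (illustrative).
import numpy as np

def weighted_centroid(frame, threshold):
    """frame: 2-D array of pixel intensities. Returns the (row, col) centroid of all
    pixels above threshold, weighted by intensity, or None if none qualify."""
    mask = frame > threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    weights = frame[rows, cols].astype(float)
    return (np.average(rows, weights=weights), np.average(cols, weights=weights))

frame = np.zeros((8, 8))
frame[2:4, 5:7] = 10.0
print(weighted_centroid(frame, threshold=5.0))   # approximately (2.5, 5.5)
```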
Integrated Control System Engineering Support.
1984-12-01
interference susceptibility. Study multiplex bus loading requirements. Flight Control Software: Demonstrate efficiencies of modular software and...Major technical thrusts include the development of: (a) task-tailored multimode control laws incorporating direct force and weapon line pointing
CADDIS Volume 4. Data Analysis: Download Software
Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.
Manipulation and handling processes off-line programming and optimization with use of K-Roset
NASA Astrophysics Data System (ADS)
Gołda, G.; Kampa, A.
2017-08-01
Contemporary trends in the development of efficient, flexible manufacturing systems require practical implementation of modern "Lean production" concepts for maximizing customer value by minimizing all wastes in manufacturing and logistics processes. Every FMS is built on the basis of automated and robotized production cells. Besides flexible CNC machine tools and other equipment, industrial robots are primary elements of the system. In this study, the authors look for wastes of time and cost in real robot tasks during manipulation processes. For the optimization of handling and manipulation processes performed by robots, the application of modern off-line programming methods and computer simulation is the best solution, and it is the only way to minimize unnecessary movements and other instructions. The modelling process of a robotized production cell and off-line programming of Kawasaki robots in AS-Language will be described. The simulation of the robotized workstation will be realized with the virtual reality software K-Roset. The authors show the process of improving and optimizing industrial robot programs in terms of minimizing the number of useless manipulator movements and unnecessary instructions. This is done in order to shorten production cycle times, and it will also reduce the costs of handling, manipulation and the technological process.
Calibration and testing of a Raman hyperspectral imaging system to reveal powdered food adulteration
Lohumi, Santosh; Lee, Hoonsoo; Kim, Moon S; Qin, Jianwei; Kandpal, Lalit Mohan; Bae, Hyungjin; Rahman, Anisur; Cho, Byoung-Kwan
2018-01-01
The potential adulteration of foodstuffs has led to increasing concern regarding food safety and security, in particular for powdered food products, where cheap ground materials or hazardous chemicals can be added to increase the quantity of powder or to obtain the desired aesthetic quality. Due to the resulting potential health threat to consumers, the development of a fast, label-free, and non-invasive technique for the detection of adulteration over a wide range of food products is necessary. We therefore report the development of a rapid Raman hyperspectral imaging technique for the detection of food adulteration and for authenticity analysis. The Raman hyperspectral imaging system comprises a custom-designed laser illumination system, a sensing module, and a software interface. The laser illumination system generates a high-power 785 nm laser line, and the Gaussian-like intensity distribution of the laser beam is shaped by an engineered diffuser. The sensing module uses Rayleigh filters, an imaging spectrometer, and a detector to collect the Raman scattering signals along the laser line. Custom-built software acquires the Raman hyperspectral images and also provides real-time visualization of Raman chemical images of the scanned samples. The developed system was employed for the simultaneous detection of Sudan dye and Congo red dye adulteration in paprika powder, and benzoyl peroxide and alloxan monohydrate adulteration in wheat flour, at six different concentrations (w/w) from 0.05 to 1%. The collected Raman imaging data of the adulterated samples were analyzed to visualize and detect the adulterant concentrations by generating a binary image for each individual adulterant material. The results obtained from the Raman chemical images of the adulterants showed a strong correlation (R>0.98) between the added and the pixel-based calculated concentrations of the adulterant materials. The developed Raman imaging system can thus be considered a powerful analytical technique for the quality and authenticity analysis of food products.
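A minimal sketch of the per-adulterant image-analysis step described above might look as follows, assuming the data are held as a NumPy hyperspectral cube; the single-band selection and fixed-threshold rule are simplifications, and the function names are hypothetical rather than the authors' published procedure.

```python
import numpy as np

def adulterant_binary_map(raman_cube, band_index, threshold):
    """Binary image for one adulterant from a Raman hyperspectral cube.

    raman_cube has shape (rows, cols, bands); band_index selects the band
    nearest the adulterant's characteristic Raman peak. The single-band,
    fixed-threshold rule is a simplification of the published analysis.
    """
    return raman_cube[:, :, band_index] > threshold

def pixel_based_concentration(binary_map):
    """Fraction of pixels classified as adulterant, to be compared with
    the known added concentration (w/w)."""
    return float(np.asarray(binary_map).mean())
```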
Reporting Differences Between Spacecraft Sequence Files
NASA Technical Reports Server (NTRS)
Khanampompan, Teerapat; Gladden, Roy E.; Fisher, Forest W.
2010-01-01
A suite of computer programs, called seq diff suite, reports differences between the products of other computer programs involved in the generation of sequences of commands for spacecraft. These products consist of files of several types: replacement sequence of events (RSOE), DSN keyword file [DKF (wherein DSN signifies Deep Space Network)], spacecraft activities sequence file (SASF), spacecraft sequence file (SSF), and station allocation file (SAF). These products can include line numbers, request identifications, and other pieces of information that are not relevant when generating command sequence products, though these fields can result in the appearance of many changes to the files, particularly when using the UNIX diff command to inspect file differences. The outputs of prior software tools for reporting differences between such products include differences in these non-relevant pieces of information. In contrast, seq diff suite removes the fields containing the irrelevant pieces of information before processing to extract differences, so that only relevant differences are reported. Thus, seq diff suite is especially useful for reporting changes between successive versions of the various products and, in particular, for flagging differences in fields relevant to the sequence command generation and review process.
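A minimal sketch of the field-stripping idea behind seq diff suite is shown below, using Python's difflib; the regular expressions standing in for line-number and request-identification fields are hypothetical, since the real product formats are not given here.

```python
import difflib
import re

# Hypothetical patterns for sequence-irrelevant fields (leading line numbers,
# request identifications); the real product formats are not shown here.
IRRELEVANT = [re.compile(r"^\s*\d+\s+"),
              re.compile(r"\bREQ_ID=\S+\s*")]

def normalize(lines):
    """Strip irrelevant fields so they cannot show up as differences."""
    cleaned = []
    for line in lines:
        for pattern in IRRELEVANT:
            line = pattern.sub("", line)
        cleaned.append(line)
    return cleaned

def relevant_diff(old_lines, new_lines):
    """Report only the differences that survive field removal."""
    return list(difflib.unified_diff(normalize(old_lines),
                                     normalize(new_lines),
                                     fromfile="v1", tofile="v2", lineterm=""))
```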
A software architecture for automating operations processes
NASA Technical Reports Server (NTRS)
Miller, Kevin J.
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a software architecture based on an integrated toolkit approach for simplifying and automating mission operations tasks. The toolkit approach is based on building adaptable, reusable graphical tools that are integrated through a combination of libraries, scripts, and system-level user interface shells. The graphical interface shells are designed to integrate and visually guide a user through the complex steps in an operations process. They provide a user with an integrated system-level picture of an overall process, defining the required inputs and possible output through interactive on-screen graphics. The OEL has developed the software for building these process-oriented graphical user interface (GUI) shells. The OEL Shell development system (OEL Shell) is an extension of JPL's Widget Creation Library (WCL). The OEL Shell system can be used to easily build user interfaces for running complex processes, applications with extensive command-line interfaces, and tool-integration tasks. The interface shells display a logical process flow using arrows and box graphics. They also allow a user to select which output products are desired and which input sources are needed, eliminating the need to know which program and its associated command-line parameters must be executed in each case. The shells have also proved valuable for use as operations training tools because of the OEL Shell hypertext help environment. The OEL toolkit approach is guided by several principles, including the use of ASCII text file interfaces with a multimission format, Perl scripts for mission-specific adaptation code, and programs that include a simple command-line interface for batch mode processing. Projects can adapt the interface shells by simple changes to the resources configuration file. This approach has allowed the development of sophisticated, automated software systems that are easy, cheap, and fast to build. This paper will discuss our toolkit approach and the OEL Shell interface builder in the context of a real operations process example. The paper will discuss the design and implementation of a Ulysses toolkit for generating the mission sequence of events. The Sequence of Events Generation (SEG) system provides an adaptable multimission toolkit for producing a time-ordered listing and timeline display of spacecraft commands, state changes, and required ground activities.
NASA Astrophysics Data System (ADS)
Tasinato, Nicola; Pietropolli Charmet, Andrea; Stoppa, Paolo; Giorgianni, Santi
2010-03-01
In this work, the self-broadening coefficients and integrated line intensities for a number of ro-vibrational transitions of vinyl fluoride have been determined for the first time by means of TDL spectroscopy. The spectra recorded in the atmospheric window around 8.7 µm are very crowded, with a density of about 90 lines per cm-1. In order to fit these spectral features, new fitting software has been implemented. The program, which is designed for laser spectroscopy, can fit many lines simultaneously on the basis of different theoretical profiles (Doppler, Lorentz, Voigt, Galatry and Nelkin-Ghatak). Details of the object-oriented implementation of the application are given. The reliability of the program is demonstrated by determining the line parameters of some ro-vibrational lines of sulphur dioxide in the ν1 band region around 9 µm. The software is then used for the line-profile analysis of vinyl fluoride. The experimental line shapes show deviations from the Voigt profile, which can be well modelled using a Dicke-narrowed line-shape function. This leads to the determination of the self-narrowing coefficient within the framework of the strong-collision model.
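As a hedged illustration of the simplest case handled by such a program, the sketch below fits a single absorption line with a Voigt profile using SciPy; the file name, initial guesses and single-line model are assumptions, whereas the actual software fits many overlapping lines and further profiles simultaneously.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import voigt_profile

def voigt_line(nu, area, center, sigma, gamma, baseline):
    """Single absorption line with a Voigt shape on a flat baseline."""
    return baseline - area * voigt_profile(nu - center, sigma, gamma)

# Hypothetical single-line fit; the actual program fits many overlapping
# lines at once and also offers Galatry and Nelkin-Ghatak profiles.
# nu, signal = np.loadtxt("tdl_scan.txt", unpack=True)   # hypothetical file
# p0 = [1e-3, nu.mean(), 0.01, 0.01, signal.max()]
# popt, pcov = curve_fit(voigt_line, nu, signal, p0=p0)
```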
Optical sensor for real-time weld defect detection
NASA Astrophysics Data System (ADS)
Ancona, Antonio; Maggipinto, Tommaso; Spagnolo, Vincenzo; Ferrara, Michele; Lugara, Pietro M.
2002-04-01
In this work we present an innovative optical sensor for on-line, non-intrusive welding process monitoring. It is based on spectroscopic analysis of the optical VIS emission of the welding plasma plume generated in the laser-metal interaction zone. The plasma electron temperature has been measured for different chemical species composing the plume. The evolution of the temperature signal has been recorded and analyzed during several CO2-laser welding processes under variable operating conditions. We have developed software able to detect, in real time, a wide range of weld defects such as crater formation, lack of fusion, excessive penetration and seam oxidation. The same spectroscopic approach has been applied to electric arc welding process monitoring. We assembled our optical sensor in a torch for manual Gas Tungsten Arc Welding procedures and tested the prototype in a manufacturing industry production line. Also in this case, we found a clear correlation between the signal behavior and the quality of the welded joint.
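Plasma electron temperature is commonly estimated from the intensity ratio of two emission lines of the same species (the two-line Boltzmann method); the sketch below shows that textbook calculation under LTE and optically thin assumptions, and only stands in for whatever specific procedure the sensor software implements.

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def electron_temperature_two_line(i1, i2, line1, line2):
    """Electron temperature (K) from the intensity ratio of two lines of the
    same species, assuming LTE and an optically thin plasma.

    Each line dict needs: wavelength_nm, A (transition probability, 1/s),
    g (upper-level degeneracy) and E_upper_eV (upper-level energy).
    """
    numerator = line1["E_upper_eV"] - line2["E_upper_eV"]
    denominator = K_B_EV * math.log(
        (i2 * line1["A"] * line1["g"] * line2["wavelength_nm"])
        / (i1 * line2["A"] * line2["g"] * line1["wavelength_nm"]))
    return numerator / denominator
```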
ERIC Educational Resources Information Center
Watkins, Beverly T.
1992-01-01
Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-22
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-783] Certain GPS Navigation Products, Components Thereof, and Related Software; Termination of Investigation on the Basis of Settlement AGENCY: U.S... GPS navigation products, components thereof, and related software, by reason of the infringement of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
..., Associated Software, and Products Containing the Same AGENCY: U.S. International Trade Commission. ACTION..., components thereof, associated software, and products containing the same by reason of infringement of..., components thereof, associated software, and products containing the same that infringe one or more of claims...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-27
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-677] In the Matter of: Certain Course Management System Software Products; Notice of Commission Determination Not To Review an Initial... course management system software products that infringe certain claims of United States Patent No. 6,988...
Comelli, M; Benes, M; Bampo, A; Villalta, R
2007-01-01
The Regional Environment Protection Agency of Friuli Venezia Giulia (ARPA FVG, Italy) has performed an analysis of existing software designed to calculate the magnetic induction field generated by power lines. As far as the agency's requirements are concerned, the tested programs show some difficulties in the immediate processing of the electrical and geometrical data supplied by plant owners, and in certain cases turn out to be inadequate for representing complex configurations of power lines. Phidel, an innovative software package, tackles and works out all the above-mentioned problems. The obtained results, when compared with those of other programs, are therefore the closest to experimental measurements. The output data can be used in both GIS and Excel environments, allowing immediate overlaying on digital cartography and determination of the 3 and 10 µT bands, in compliance with the Italian Decree of the President of the Council of Ministers of 8 July 2003.
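For context, the field of a single long straight conductor follows B = μ0 I / (2πr), which already shows how the 3 and 10 µT band distances scale with current; the sketch below uses only this single-wire formula and is not the multi-conductor computation performed by Phidel.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def field_from_long_wire(current_a, distance_m):
    """Flux density (tesla) at distance r from a long straight conductor:
    B = mu0 * I / (2 * pi * r)."""
    return MU0 * current_a / (2 * math.pi * distance_m)

def band_distance(current_a, limit_tesla):
    """Distance at which the single-wire field drops to a given limit."""
    return MU0 * current_a / (2 * math.pi * limit_tesla)

print(band_distance(1000.0, 3e-6))  # about 66.7 m for a 1 kA single wire
```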
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... Software and Firmware, and Components Thereof and Products Containing the Same; Institution of..., related software and firmware, and components thereof and products containing the same by reason of... after importation of certain cameras and mobile devices, related software and firmware, and components...
MOPEX: a software package for astronomical image processing and visualization
NASA Astrophysics Data System (ADS)
Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley
2006-06-01
We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over the image processing and the display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback of the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, many of its functions are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.
Features of commercial computer software systems for medical examiners and coroners.
Hanzlick, R L; Parrish, R G; Ing, R
1993-12-01
There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.
The development of mathematics courseware for learning line and angle
NASA Astrophysics Data System (ADS)
Halim, Noor Dayana Abd; Han, Ong Boon; Abdullah, Zaleha; Yusup, Junaidah
2015-05-01
Learning software is a teaching aid often used in schools to increase students' motivation, attract students' attention and improve the quality of the teaching and learning process. However, the development of learning software should follow the phases of an Instructional Design (ID) model so that the process can be carried out systematically and in an orderly way. Thus, this concept paper describes the application of the ADDIE model in the development of mathematics learning courseware, named CBL-Math, for learning Line and Angle. The ADDIE model consists of five consecutive phases: Analysis, Design, Development, Implementation and Evaluation. Each phase must be properly planned in order to achieve the stated objectives. In addition to describing the processes occurring in each phase, this paper also demonstrates how the principles of the cognitive theory of multimedia learning are integrated in the developed courseware. The principles applied in the courseware reduce the students' cognitive load while learning the topic of line and angle. With a well-prepared development process and the integration of appropriate principles, it is expected that the developed software can help students learn effectively and increase their achievement in the topic of Line and Angle.
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects, all related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia
2009-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.
NASA PC software evaluation project
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Kuan, Julie C.
1986-01-01
The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff in understanding the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The software categories covered include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.
Computer-based mechanical design of overhead lines
NASA Astrophysics Data System (ADS)
Rusinaru, D.; Bratu, C.; Dinu, R. C.; Manescu, L. G.
2016-02-01
Besides performance, a safety level in accordance with the current standards is a compulsory condition for the operation of distribution grids. Some of the measures leading to improvement of overhead line reliability call for modernization of the installations. The constraints imposed on new line components concern technical aspects such as thermal stress or voltage drop, while also seeking economic efficiency. The mechanical sizing of overhead lines is, after all, an optimization problem: the task in designing the overhead line profile is to size poles, cross-arms and stays and to locate poles along the line route so that the total cost of the line's structure is minimized while the technical and safety constraints are fulfilled. The authors present in this paper an application for the computer-based mechanical design of overhead lines and the features of the corresponding Visual Basic program, adjusted to distribution lines. The constraints of the optimization problem are adjusted to the existing weather and loading conditions of Romania. The outputs of the software application for the mechanical design of overhead lines are: the list of components chosen for the line (poles, cross-arms, stays); the list of conductor tensions and forces for each pole, cross-arm and stay under different weather conditions; and the line profile drawings. The main features of the mechanical overhead line design software are interactivity, a local optimization function and a high-level user interface.
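As a toy version of the cost minimization stated above, the sketch below picks, from spans already assumed admissible with respect to sag and tension constraints, the uniform span that minimizes pole cost along a route; the input values and function name are hypothetical, and the real program optimizes pole, cross-arm and stay selection as well.

```python
import math

def cheapest_uniform_span(route_length_m, pole_cost, admissible_spans_m):
    """Choose the uniform span that minimizes pole cost along a route.

    admissible_spans_m are spans assumed to be already verified against
    sag, clearance and mechanical tension constraints for the chosen
    conductor and loading zone; real line-profile design also optimizes
    pole, cross-arm and stay selection.
    """
    best = None
    for span in admissible_spans_m:
        n_poles = math.ceil(route_length_m / span) + 1
        cost = n_poles * pole_cost
        if best is None or cost < best[0]:
            best = (cost, span, n_poles)
    return best  # (total pole cost, chosen span, number of poles)

print(cheapest_uniform_span(1800.0, pole_cost=450.0,
                            admissible_spans_m=[60.0, 70.0, 80.0]))
```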
NASA Technical Reports Server (NTRS)
Sherman, Mark; Kodis, John; Bedet, Jean-Jacques; Wacker, Chris; Woytek, Joanne; Lynnes, Chris
1996-01-01
The Goddard Space Flight Center (GSFC) version 0 Distributed Active Archive Center (DAAC) has been developed to support existing and pre-Earth Observing System (EOS) Earth science datasets, facilitate scientific research, and test EOS data and information system (EOSDIS) concepts. To ensure that no data is ever lost, each product received at the GSFC DAAC is archived on two different media, VHS and digital linear tape (DLT). The first copy is made on VHS tape and is under the control of UniTree. The second and third copies are made to DLT and VHS media under a custom-built software package named 'Archer'. While Archer provides only a subset of the functions available with commercial software like UniTree, it supports migration between near-line and off-line media and offers much greater performance and flexibility to satisfy the specific needs of a data center. Archer is specifically designed to maximize total system throughput, rather than focusing on the turnaround time for individual files. The commercial off-the-shelf (COTS) hierarchical storage management (HSM) products evaluated were mainly concerned with transparent, interactive file access for the end user, rather than a batch-oriented, optimizable (based on known data file characteristics) data archive and retrieval system. This is critical to the distribution requirements of the GSFC DAAC, where orders for 5000 or more files at a time are received. Archer has the ability to queue many thousands of file requests and to sort these requests into internal processing schedules that optimize overall throughput. Specifically, mount and dismount cycles, tape load and unload cycles, and tape motion are minimized. This feature did not seem to be available in many COTS packages. Archer also uses a generic tar tape format that allows tapes to be read by many different systems, rather than the proprietary formats found in most COTS packages. This paper discusses some of the specific requirements at the GSFC DAAC and the motivations for implementing the Archer system, and presents a discussion of the Archer design that resulted.
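The mount-minimizing scheduling described above can be sketched as grouping queued requests by tape volume and sorting by position within each volume; the request fields and the one-pass policy below are assumptions standing in for Archer's internal scheduler.

```python
from collections import defaultdict

def schedule_retrievals(requests):
    """Order queued file requests so each tape is mounted once and read in a
    single pass (sorted by position). The 'tape'/'position' fields and the
    one-pass policy are assumptions, not Archer's actual implementation.

    requests: iterable of dicts with keys 'file', 'tape' and 'position'.
    """
    by_tape = defaultdict(list)
    for request in requests:
        by_tape[request["tape"]].append(request)
    schedule = []
    for tape in sorted(by_tape):                       # one mount per volume
        for request in sorted(by_tape[tape], key=lambda r: r["position"]):
            schedule.append((tape, request["position"], request["file"]))
    return schedule
```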
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
NASA Technical Reports Server (NTRS)
Hancock, David W., III
1999-01-01
This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.
Wang, X; Shu, X; Li, Z; Huo, W; Zou, L; Tang, Y; Li, L
2018-01-27
Skin imaging analysis, acting as a supplement to noninvasive bioengineering devices, has been widely used in medical cosmetology and cosmetic product evaluation. The main aim of this study is to assess the differences and correlations in measuring skin spots, wrinkles, vascular features, porphyrin, and pores between two commercially available image analysis software packages. Seventy healthy women were included in the study. Before taking pictures, the dermatologist evaluated the subjects' skin conditions. Test sites included the forehead, cheek, and periorbital skin. A 2 × 2 cm cardboard was used to make a mark on the skin surface. Pictures were taken using VISIA® under three kinds of lighting conditions and analyzed using VISIA® and IPP® respectively. (1) Skin pore, red area, ultraviolet spot, brown spot, porphyrin, and wrinkle measured with VISIA® were correlated with those measured with IPP® (P < .01). (2) Spot, wrinkle, fine line, brown spot, and red area analyzed with VISIA® were correlated with age on the forehead and periorbital skin (P < .05). L-value, Crow's feet, ultraviolet spot, brown spot, and red area analyzed with IPP® were correlated with age on the periorbital skin (P < .05). (3) L-value, spot, wrinkle, fine line, porphyrin, red area, and pore analyzed with VISIA® and IPP® showed correlations with the subjective evaluation scores (P < .05). VISIA® and IPP® showed acceptable correlation in measuring various skin conditions. VISIA® showed high sensitivity when measuring the forehead skin. IPP® is available as an alternative software program to evaluate skin features. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Efficiency improvements of offline metrology job creation
NASA Astrophysics Data System (ADS)
Zuniga, Victor J.; Carlson, Alan; Podlesny, John C.; Knutrud, Paul C.
1999-06-01
Progress of the first lot of a new design through the production line is watched very closely. All performance metrics, including cycle time, in-line measurement results and final electrical performance, are critical. Rapid movement of this lot through the line has serious time-to-market implications, and having this material waiting at a metrology operation for an engineer to create a measurement job plan wastes valuable turnaround time. Further, efficient use of a metrology system is compromised by the time required to create and maintain these measurement job plans. Thus, having a method to develop metrology job plans prior to actually running the material through the manufacturing area can significantly improve both cycle time and overall equipment efficiency. Motorola and Schlumberger have worked together to develop and test such a system. The Remote Job Generator (RJG) creates job plans for new devices in a manufacturing process from an NT host or workstation, off-line. This increases the system time available for making production measurements, decreases turnaround time for job plan creation and editing, and improves consistency across job plans. Most importantly, it allows job plans for new devices to be available before the first wafers of the device arrive at the tool for measurement. The software also includes a database manager which allows updates of existing job plans to incorporate measurement changes required by process changes or measurement optimization. This paper will review the productivity enhancements achieved through the increased metrology utilization and decreased cycle time associated with the use of RJG. Finally, improvements in process control through better control of job plans across different devices and layers will be discussed.
Metal mirror TMA, telescopes of the JSS product line: design and analysis
NASA Astrophysics Data System (ADS)
Kirschstein, Steffen; Koch, Amelia; Schöneich, Jürgen; Döngi, Frank
2005-09-01
For the increasing market of low-cost multispectral pushbroom scanners for spaceborne Earth remote sensing, Jena-Optronik GmbH has developed the JSS product line. These instruments are typically operated on micro-satellites with strong resource constraints, which leads to instrument designs optimised with respect to minimum size and mass, power consumption, and cost. From various customer requirements, Jena-Optronik has derived the JSS product line of low-cost optical spaceborne scanners in the visible wavelength range. Three-mirror anastigmat (TMA) telescope designs have become a widespread design solution for fields of view from 2 to 12 deg. The design solution chosen by Jena-Optronik is based on all-aluminium telescopes. Novel ultra-precision milling and polishing techniques now make it possible to achieve the necessary optical surface quality for applications in the visible range. The TMA telescope optics of the JSS-56 imager will be accommodated on board the RapidEye spacecraft. The JSS-56 TMA, with an F-number of 4.3, realises a swath width of 78 km with a ground pixel resolution of 6.5 m × 6.5 m. The aluminium mirrors are Ni coated to achieve a suitable surface polish quality. This paper discusses typical requirements for the thermal design and the bimetallic effects of the mirrors. To achieve nearly diffraction-limited imaging, the typical surface irregularities due to the turning process have to be addressed in the ray-tracing models. Analysis and integration of real mirror data in the ZEMAX design software are demonstrated here and compared with built-in standard tolerance concepts.
Software Development Standard for Mission Critical Systems
2014-03-17
new development, modification, reuse, reengineering, maintenance, or any other activity or combination of activities resulting in products. Within... "develops" includes new development, modification, integration, reuse, reengineering, maintenance, or any other activity that results in products... Maintenance organization: the organization that is responsible for modifying and otherwise sustaining the software and other software products and
Software technology insertion: A study of success factors
NASA Technical Reports Server (NTRS)
Lydon, Tom
1990-01-01
Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA-supported project we (1) derived a suite of metrics which quantify reuse attributes for object-oriented, object-based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object-oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
Data services providing by the Ukrainian NODC (MHI NASU)
NASA Astrophysics Data System (ADS)
Eremeev, V.; Godin, E.; Khaliulin, A.; Ingerov, A.; Zhuk, E.
2009-04-01
At the modern stage of World Ocean study, information support of investigations based on advanced computer technologies becomes of particular importance. These abstracts present several data services developed by the Ukrainian NODC on the basis of the Marine Environmental and Information Technologies Department of MHI NASU. The Data Quality Control Service. Using the experience of international collaboration in the field of data collection and quality checking, we have developed quality control (QC) software providing both preliminary (automatic) and expert (manual) data quality check procedures. The current version of the QC software works for the Mediterranean and Black seas and includes climatic arrays for hydrological and a few hydrochemical parameters based on such products as MEDAR/MEDATLAS II, Physical Oceanography of the Black Sea and the Climatic Atlas of Oxygen and Hydrogen Sulfide in the Black Sea. The data quality check procedure includes metadata control and hydrological and hydrochemical data control. Metadata control covers checking of duplicate cruises and profiles, date and chronology, ship velocity, station location, sea depth and observation depth. The data QC procedure includes climatic (or range, for parameters with a small number of observations) data QC, a density inversion check for hydrological data and a search for spikes. The use of climatic fields and profiles prepared by regional oceanography experts leads to more reliable results of the data quality check procedure. The Data Access Services. The Ukrainian NODC provides two products for data access: on-line software and a data access module for the MHI NASU local net. This software allows selecting data by rectangular area, date, month and cruise. The result of a query is metadata, presented in a table together with a visual presentation of the stations on a map. It is possible to see both metadata and data by selecting a station in the metadata table or on the map. There is also an option to export data in ODV format. The product is available at http://www.ocean.nodc.org.ua/DataAccess.php The local net version provides access to the oceanological database of MHI NASU. The current version allows selecting data by spatial and temporal limits, depth, parameter values and quality flags, and works for the Mediterranean and Black seas. It provides visualization of metadata and data, statistics of the data selection, and data export into several formats. The Operational Data Management Services. The collaborators of the MHI Experimental Branch developed a system for obtaining information on water pressure and temperature, as well as on atmospheric pressure; sea level observations are also conducted. The obtained data are transferred online, and an interface for operational data access was developed. It allows selecting parameters (sea level, water temperature, atmospheric pressure, wind and water pressure) and a time interval in order to view parameter graphs. The product is available at http://www.ocean.nodc.org.ua/Katsively.php . The Climatic Products. The current version of the Climatic Atlas includes maps of such parameters as temperature, salinity, density, heat storage, dynamic heights, the upper boundary of hydrogen sulfide and the lower boundary of oxygen for the Black Sea basin. Maps of temperature, salinity and density were calculated on 19 standard depths and averaged monthly for depths of 0 - 300 m and annually for greater depths.
The climatic maps of the upper boundary of hydrogen sulfide and the lower boundary of oxygen were averaged by decades, from the 1920s to the 1990s, and by seasons. Two versions of the climatic atlas viewer, on-line and desktop, were developed for presentation of the climatic maps. They provide similar functions for selecting and viewing maps by parameter, month and depth and for saving maps in various formats. The on-line version of the atlas is available at http://www.ocean.nodc.org.ua/Main_Atlas.php .
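A minimal sketch of an automatic profile QC pass of the kind described above (range check, spike search, observation depth check) is given below; the flag scheme and numeric limits are placeholders, whereas the NODC software relies on regional climatic fields and expert review.

```python
def qc_profile(depths, temps, t_range=(-2.0, 35.0), spike_limit=2.0):
    """Assign simple quality flags to a temperature profile.

    Flags: 1 = good, 3 = suspect (spike), 4 = bad (range or depth error).
    The numeric limits are placeholders; the NODC procedure uses regional
    climatic fields and expert review instead of fixed global bounds.
    """
    flags = [1] * len(temps)
    for i, t in enumerate(temps):
        if not t_range[0] <= t <= t_range[1]:          # gross range check
            flags[i] = 4
    for i in range(1, len(depths)):
        if depths[i] <= depths[i - 1]:                 # observation depth check
            flags[i] = 4
    for i in range(1, len(temps) - 1):                 # spike check
        if abs(temps[i] - 0.5 * (temps[i - 1] + temps[i + 1])) > spike_limit:
            flags[i] = max(flags[i], 3)
    return flags
```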
Scalable software architecture for on-line multi-camera video processing
NASA Astrophysics Data System (ADS)
Camplani, Massimo; Salgado, Luis
2011-03-01
In this paper we present a scalable software architecture for on-line multi-camera video processing that guarantees a good trade-off between computational power, scalability and flexibility. The software system is modular, and its main blocks are the Processing Units (PUs) and the Central Unit. The Central Unit works as a supervisor of the running PUs, and each PU manages the acquisition phase and the processing phase. Furthermore, an approach to easily parallelize the desired processing application is presented. As a case study, we apply the proposed software architecture to a multi-camera system in order to efficiently manage multiple 2D object detection modules in a real-time scenario. System performance has been evaluated under different load conditions such as the number of cameras and image sizes. The results show that the software architecture scales well with the number of cameras and can easily work with different image formats while respecting the real-time constraints. Moreover, the parallelization approach can be used to speed up the processing tasks with a low level of overhead.
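A minimal sketch of the PU/Central Unit split, using Python's multiprocessing with one acquisition queue per camera, is shown below; the queue-based wiring and the placeholder detector are assumptions for illustration, not the authors' implementation.

```python
import multiprocessing as mp

def processing_unit(camera_id, frames, results):
    """One Processing Unit: acquisition phase (read a frame from its camera
    queue) followed by the processing phase (placeholder detector)."""
    while True:
        frame = frames.get()
        if frame is None:                        # shutdown signal from the Central Unit
            break
        results.put({"camera": camera_id, "bytes": len(frame)})

if __name__ == "__main__":
    results = mp.Queue()
    camera_queues = [mp.Queue() for _ in range(4)]   # one queue per camera
    pus = [mp.Process(target=processing_unit, args=(cam, q, results))
           for cam, q in enumerate(camera_queues)]
    for p in pus:
        p.start()
    for q in camera_queues:                      # the Central Unit dispatches frames
        q.put(b"\x00" * (640 * 480))             # stand-in for a grabbed image
        q.put(None)
    for p in pus:
        p.join()
    while not results.empty():
        print(results.get())
```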
NASA Technical Reports Server (NTRS)
1989-01-01
001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultra-reliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
[Example of product development by industry and research solidarity].
Seki, Masayoshi
2014-01-01
When an industrial firm develops a product, using results from research institutions or reflecting users' ideas in the developed product is significant for improving the product. This paper takes a jointly developed software product as an example to describe the adopted development technique and its result, and to consider industry-research collaboration from the company's point of view. Software development methods each have merits and demerits, and it is necessary to choose the optimal development technique for the system being developed. We jointly developed dose distribution browsing software and adopted the prototype model as the development method. In order to display the dose distribution information, it is necessary to load four objects, CT-Image, Structure Set, RT-Plan, and RT-Dose, and display them in a composite manner. The prototype model adopted as the development technique for this joint development was particularly well suited to the dose distribution browsing software. In a prototype model, since the detailed design is written from the program source code after the program is completed, there is a benefit in shortening the documentation period and in keeping the design and the implementation consistent. This software was eventually released to the public as open source, and based on this prototype the release version of the dose distribution browsing software was developed. Developing this type of novel software normally takes two to three years, but since joint development was adopted, the development period was shortened to one year. Shortening the development period kept the company's development cost to a minimum, which will be reflected in the product price. Requests on the product made by specialists from the user's point of view are important, and an increase in specialists acting as professionals in product development will raise expectations of developing a product that meets users' demands.
Specialized computer system to diagnose critical lined equipment
NASA Astrophysics Data System (ADS)
Yemelyanov, V. A.; Yemelyanova, N. Y.; Morozova, O. A.; Nedelkin, A. A.
2018-05-01
The paper presents data on the problem of diagnosing the lining condition at iron and steel works. The authors propose and describe the structure of a specialized computer system to diagnose critical lined equipment. Comparative results of diagnosing the lining condition with the basic system and with the proposed specialized computer system are presented. To automate evaluation of the lining condition and to support decision-making regarding the operation mode of the lined equipment, specialized software has been developed.
NASA Astrophysics Data System (ADS)
Boling, M. E.
1989-09-01
Prototypes were assembled pursuant to recommendations made in report K/DSRD-96, Issues and Approaches for Electronic Document Approval and Transmittal Using Digital Signatures and Text Authentication, and to examine and discover the possibilities for integrating available hardware and software to provide cost-effective systems for digital signatures and text authentication. These prototypes show that, on a LAN, a multitasking, windowed, mouse/keyboard menu-driven interface can be assembled to provide easy and quick access to bit-mapped images of documents, electronic forms and electronic mail messages, with a means to sign, encrypt, deliver, receive or retrieve, and authenticate text and signatures. In addition they show that some of this same software may be used in a classified environment using host-to-terminal transactions to accomplish these same operations. Finally, a prototype was developed demonstrating that binary files may be signed electronically and sent by point-to-point communication and over ARPANET to remote locations where the authenticity of the code and signature may be verified. Related studies on the subject of electronic signatures and text authentication using public key encryption were done within the Department of Energy. These studies include timing studies of public key encryption software and hardware and testing of experimental, user-generated, host-resident software for public key encryption. This software used commercially available command-line source code. These studies are responsive to an initiative within the Office of the Secretary of Defense (OSD) for the protection of unclassified but sensitive data. It is notable that these related studies are all built around the same commercially available public key encryption products from the private sector and that the software selection was made independently by each study group.
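A modern equivalent of the sign-and-verify workflow explored in these prototypes can be sketched with the Python cryptography package and Ed25519 keys; the 1989 study used commercial public-key products of its time, so the library, key type and file name here are illustrative assumptions.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

with open("release.bin", "rb") as fh:            # hypothetical binary file
    document = fh.read()

signature = private_key.sign(document)           # detached digital signature

try:
    public_key.verify(signature, document)       # text and signature authentication
    print("signature valid")
except InvalidSignature:
    print("document or signature has been altered")
```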
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.
1984-01-01
The project to automate the management of software production systems is described. The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system are completed to prototype form. The construction methods are described.
Modernization of software quality assurance
NASA Technical Reports Server (NTRS)
Bhaumik, Gokul
1988-01-01
Customer satisfaction depends not only on functional performance but also on the quality characteristics of software products. An examination of this quality aspect of software products provides a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition of a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper explains how this definition was developed and how it is used.
Software Engineering Laboratory (SEL) data and information policy
NASA Technical Reports Server (NTRS)
Mcgarry, Frank
1991-01-01
The policies and overall procedures that are used in distributing and in making available products of the Software Engineering Laboratory (SEL) are discussed. The products include project data and measures, project source code, reports, and software tools.
Pricing Software and Information on CD-ROM.
ERIC Educational Resources Information Center
Gibbins, Patrick
1987-01-01
Examines the relationships between purchases of optical data disk products, publishers, and software suppliers. The discussion covers current pricing strategies for optical data disk software and information products, and possible future developments in marketing and pricing. (CLB)
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1988-01-01
Reported here is beginning research into a noisy communication channel analogy of software development process productivity, undertaken in order to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound to productivity which shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of the size and cost of reusable components may reduce this to a finite bound.
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1988-01-01
Beginning research into a noisy communication channel analogy of software development process productivity, undertaken in order to establish quantifiable behavior and theoretical bounds, is discussed. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. An upper bound to productivity is derived that shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of the size and cost of reusable components may reduce this to a finite bound.
Analytical modeling of helium turbomachinery using FORTRAN 77
NASA Astrophysics Data System (ADS)
Balaji, Purushotham
Advanced Generation IV modular reactors, including Very High Temperature Reactors (VHTRs), utilize helium as the working fluid, with a potential for high-efficiency power production utilizing helium turbomachinery. Helium is chemically inert and non-radioactive, which makes the gas ideal for a nuclear power-plant environment where radioactive leaks are a high concern. These properties of helium help to increase the safety features as well as to slow the aging of plant components. The lack of sufficient helium turbomachinery data has made it difficult to study the vital role played by the gas turbine components of these VHTR-powered cycles. Therefore, this research work focuses on predicting the performance of helium compressors. A FORTRAN 77 program is developed to simulate helium compressor operation, including surge line prediction. The resulting design-point and off-design performance data can be used to develop compressor map files readable by the Numerical Propulsion System Simulation (NPSS) software. This multi-physics simulation software, developed for propulsion system analysis, has found applications in simulating power-plant cycles.
NASA Technical Reports Server (NTRS)
1981-01-01
The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of the evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.
Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics
Scott, S. D.; Mumgaard, R. T.
2016-07-20
A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (MSE) diagnostics. The software supports multi-spectral line-polarization MSE diagnostics which simultaneously measure emission at the MSE σ and π lines as well as at two "background" wavelengths that are displaced from the MSE spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the MSE photo-elastic modulators (PEMs) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to PEM retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the MSE diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.
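One simple reading of the multi-spectral background scheme is a per-sample interpolation in wavelength between the two off-line background channels; the sketch below shows only that step and omits the polarized-background handling and numerical-beat processing the suite actually performs.

```python
import numpy as np

def interpolated_background(bg_lo, bg_hi, lam_lo, lam_hi, lam_target):
    """Estimate the background at a target wavelength (sigma or pi line) by
    linear interpolation between the two off-line background channels.

    This is only the simplest reading of the multi-spectral scheme; the
    actual suite also treats the polarization of the background light.
    """
    w = (lam_target - lam_lo) / (lam_hi - lam_lo)
    return (1.0 - w) * np.asarray(bg_lo) + w * np.asarray(bg_hi)
```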
The Live Access Server - A Web-Services Framework for Earth Science Data
NASA Astrophysics Data System (ADS)
Schweitzer, R.; Hankin, S. C.; Callahan, J. S.; O'Brien, K.; Manke, A.; Wang, X. Y.
2005-12-01
The Live Access Server (LAS) is a general purpose Web-server for delivering services related to geo-science data sets. Data providers can use the LAS architecture to build custom Web interfaces to their scientific data. Users and client programs can then access the LAS site to search the provider's on-line data holdings, make plots of data, create sub-sets in a variety of formats, compare data sets and perform analysis on the data. The Live Access Server software has continued to evolve by expanding the types of data (in-situ observations and curvilinear grids) it can serve and by taking advantage of advances in software infrastructure both in the earth sciences community (THREDDS, the GrADS Data Server, the Anagram framework and Java netCDF 2.2) and in the Web community (Java Servlet and the Apache Jakarta frameworks). This presentation will explore the continued evolution of the LAS architecture towards a complete Web-services-based framework. Additionally, we will discuss the redesign and modernization of some of the support tools available to LAS installers. Soon after the initial implementation, the LAS architecture was redesigned to separate the components that are responsible for the user interaction (the User Interface Server) from the components that are responsible for interacting with the data and producing the output requested by the user (the Product Server). During this redesign, we changed the implementation of the User Interface Server from CGI and JavaScript to the Java Servlet specification using Apache Jakarta Velocity backed by a database store for holding the user interface widget components. The User Interface Server is now quite flexible and highly configurable because we modernized the components used for the implementation. Meanwhile, the implementation of the Product Server has remained a Perl CGI-based system. Clearly, the time has come to modernize this part of the LAS architecture. Before undertaking such a modernization it is important to understand what we hope to gain. Specifically, we would like to make it even easier to add new output products into our core system based on the Ferret analysis and visualization package. By carefully factoring the tasks needed to create a product, we will be able to create new products simply by adding a description of the product into the configuration and by writing the Ferret script needed to create the product. No code will need to be added to the Product Server to bring the new product on-line. The new architecture should be faster at extracting and processing the configuration information needed to address each request. Finally, the new Product Server architecture should make it even easier to pass specialized configuration information to the Product Server to deal with unanticipated special data structures or processing requirements.
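The configuration-driven product idea can be pictured with a short Python sketch: a registry entry plus a Ferret script define a product, and the dispatcher needs no new code per product. The registry fields, script names, and argument passing below are illustrative assumptions, not the actual LAS configuration schema.

    import subprocess

    PRODUCT_CONFIG = {
        "timeseries_plot": {"script": "timeseries.jnl", "format": "png"},
        "xy_slice":        {"script": "xy_slice.jnl",   "format": "nc"},
    }

    def run_product(name, dataset, region):
        cfg = PRODUCT_CONFIG[name]                  # configuration lookup only
        args = ["ferret", "-script", cfg["script"], dataset,
                str(region["lon_min"]), str(region["lon_max"]),
                str(region["lat_min"]), str(region["lat_max"])]
        subprocess.run(args, check=True)            # the Ferret script builds the output
        return f"{name}.{cfg['format']}"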
Endnote Web tutorial for BJCVS/RBCCV
de Oliveira, Marcos Aurélio Barboza; dos Santos, Carlos Alberto; Brandi, Antônio Carlos; Botelho, Paulo Henrique Husseini; Sciarra, Adília Maria Pires; Braile, Domingo Marcolino
2015-01-01
At present, many useful reference management tools are available, either as off-line software or as websites accessible to any internet user. Their aim is to facilitate the production of scientific text; to accomplish that, the required bibliographic style must be readily available and the program must be free. In this tutorial we present EndNote Web®, a bibliographic reference management program that meets both requirements: it contains the Brazilian Journal of Cardiovascular Surgery reference format, and its use is free of charge after signing in from an IP-registered terminal in Web of Science®. PMID:26107457
Mapping forest types in Worcester County, Maryland, using LANDSAT data
NASA Technical Reports Server (NTRS)
Burtis, J., Jr.; Witt, R. G.
1981-01-01
The feasibility of mapping Level 2 forest cover types for a county-sized area on Maryland's Eastern Shore was demonstrated. A Level 1 land use/land cover classification was carried out for all of Worcester County as well. A June 1978 LANDSAT scene was utilized in a classification which employed two software packages on different computers (IDIMS on an HP 3000 and ASTEP-II on a Univac 1108). A twelve category classification scheme was devised for the study area. Resulting products include black and white line printer maps, final color coded classification maps, digitally enhanced color imagery and tabulated acreage statistics for all land use and land cover types.
Advancements for continuous miners
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiscor, S.
2007-06-15
Design changes and new technology make the modern continuous miner more user friendly. Two of the major manufacturers, Joy Mining Machinery and DBT, both based near Pittsburgh, PA, USA, have recently acquired other OEMs to offer a greater product line. Joy's biggest development in terms of improving cutting time is the FACEBOSS Control System, which has an operator assistance element and Joy Surface Reporting Software (JSRP). Joy's WetHead continuous miners have excellent performance. DBT is researching ways to make the machines more reliable with new drive systems. It has also been experimenting with water sprays to improve dust suppression. 4 photos.
Status report of the SRT radiotelescope control software: the DISCOS project
NASA Astrophysics Data System (ADS)
Orlati, A.; Bartolini, M.; Buttu, M.; Fara, A.; Migoni, C.; Poppi, S.; Righini, S.
2016-08-01
The Sardinia Radio Telescope (SRT) is a 64-m fully-steerable radio telescope. It is provided with an active surface to correct for gravitational deformations, allowing observations from 300 MHz to 100 GHz. At present, three receivers are available: a coaxial LP-band receiver (305-410 MHz and 1.5-1.8 GHz), a C-band receiver (5.7-7.7 GHz) and a 7-feed K-band receiver (18-26.5 GHz). Several back-ends are also available in order to perform the different data acquisition and analysis procedures requested by scientific projects. The design and development of the SRT control software started in 2004, and now belongs to a wider project called DISCOS (Development of the Italian Single-dish COntrol System), which provides a common infrastructure to the three Italian radio telescopes (Medicina, Noto and SRT dishes). DISCOS is based on the ALMA Common Software (ACS) framework, and currently consists of more than 500k lines of code. It is organized in a common core and three specific product lines, one for each telescope. Recent developments, carried out after the conclusion of the technical commissioning of the instrument (October 2013), consisted of the addition of several new features in many parts of the observing pipeline, spanning from the motion control to the digital back-ends for data acquisition and data formatting; we briefly describe such improvements. More importantly, in the last two years we have supported the astronomical validation of the SRT radio telescope, leading to the opening of the first public call for proposals in late 2015. During this period, while assisting both the engineering and the scientific staff, we used the control software extensively and were able to test all of its features: in this process we received our first feedback from the users and we could verify how the system performed in a real-life scenario, drawing the first conclusions about the overall system stability and performance. We examine how the system behaves in terms of network load and system load, how it reacts to failures and errors, and which components and services seem to be the most critical parts of our architecture, showing how the ACS framework impacts on these aspects. Moreover, the exposure to public utilization has highlighted the major flaws in our development and software management process, which had to be tuned and improved in order to achieve faster release cycles in response to user feedback, and safer deployment operations. In this regard we show how the introduction of testing practices, along with continuous integration, helped us to meet higher quality standards. Having identified the most critical aspects of our software, we conclude by showing our intentions for the future development of DISCOS, both in terms of software features and software infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert
2005-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.
Line length dependencies in interconnect optimization
NASA Astrophysics Data System (ADS)
Kadoch, Daniel; Duane, Michael; Lee, Yohan
1997-09-01
Metal line delay has become increasingly important for ULSI devices. Numerous expressions and software tools have been developed to describe interconnect delay as a function of geometry and layout. Although many of these formulas include line length effects, this dependence has not been explored in depth. Most software tools are either geared towards circuit designers or involve more complex and CPU-intensive 3D modeling. In this work, PISCES (a 2D device simulator) was used to extract metal capacitance per unit length. We extend this approach to various lengths by creating a ladder network of the RC components and simulating it in SPICE, or by using simple closed-form Elmore delay equations. A key new result is that there are optimum metal line width/space values, for a fixed pitch, and height/space ratios that are metal-length dependent. For metal lines shorter than about 1500 micrometers, it is better to have narrower metal lines, and for lengths less than 500 micrometers, shrinking metal height is desirable because the penalty in resistance is more than compensated by the decrease in capacitance. For longer lines, the time delay is dominated by resistance, and wider, taller lines are better. Increasing metal spacing or reducing the dielectric constant was beneficial for both long and short metal lines.
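The closed-form check mentioned above can be written down directly: the Elmore delay of an RC ladder is the sum, over each segment capacitance, of the resistance between it and the driver. A minimal Python sketch, with per-unit-length r and c and optional driver resistance and load capacitance, is:

    def elmore_delay(r_per_um, c_per_um, length_um, n_segments=100,
                     r_driver=0.0, c_load=0.0):
        """Elmore delay of a uniform RC line split into n equal segments."""
        dr = r_per_um * length_um / n_segments   # segment resistance
        dc = c_per_um * length_um / n_segments   # segment capacitance
        delay = 0.0
        r_upstream = r_driver
        for i in range(1, n_segments + 1):
            r_upstream += dr
            cap = dc + (c_load if i == n_segments else 0.0)
            delay += r_upstream * cap            # each cap sees all upstream resistance
        return delay

For a uniform, unloaded line this tends to r*c*L^2/2 as the segment count grows, which is why delay scales quadratically with length once resistance dominates.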
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-03
..., Components Thereof, and Related Software; Institution of Investigation AGENCY: U.S. International Trade... navigation products, components thereof, and related software by reason of infringement of certain claims of... related software that infringe one or more of claims 1, 2, 11, and 16 of the '565 patent; claim 1 of the...
Automatic Generation of Just-in-Time Online Assessments from Software Design Models
ERIC Educational Resources Information Center
Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.
2009-01-01
Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…
On-line measurement of diameter of hot-rolled steel tube
NASA Astrophysics Data System (ADS)
Zhu, Xueliang; Zhao, Huiying; Tian, Ailing; Li, Bin
2015-02-01
This paper presents the design of an on-line diameter measurement system for a hot-rolled seamless steel tube production line. On the one hand, such a system can stimulate domestic pipe-measuring technology; on the other hand, it can give domestic hot-rolled seamless steel tube enterprises strong product competitiveness at low cost. After analyzing and comparing various detection methods and techniques, a CCD camera-based on-line caliper design was chosen. The system comprises a hardware measurement portion and an image processing section, combining software control technology and image processing technology to complete on-line measurement of hot tube diameter. Taking into account the complexity of the actual job site, a relatively simple and reasonable layout was chosen. The image processing section mainly addresses camera calibration and the application of functions in Matlab, so that the diameter is displayed directly from the image via the calculation algorithm. A simulation platform was built in the final design phase; images were successfully collected and processed, proving the feasibility and rationality of the design with an error of less than 2%. The design successfully uses photoelectric detection technology to solve real work problems.
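A minimal sketch of the measurement idea, assuming a calibrated mm-per-pixel factor from the camera calibration step; this is illustrative Python, not the authors' Matlab code.

    import numpy as np

    def tube_diameter(profile, mm_per_pixel, threshold=None):
        """Estimate diameter from a 1-D intensity profile across the hot tube."""
        profile = np.asarray(profile, dtype=float)
        if threshold is None:
            threshold = 0.5 * (profile.min() + profile.max())
        bright = np.flatnonzero(profile > threshold)   # pixels of the glowing tube
        if bright.size < 2:
            raise ValueError("no tube found in profile")
        return (bright[-1] - bright[0]) * mm_per_pixel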
Carbon and Nitrogen Enrichment Patterns in Planetary Nebulae
NASA Astrophysics Data System (ADS)
Dufour, Reginald
2011-10-01
The goal of this project is to assess the role played in carbon production by low and intermediate mass stars (LIMS), i.e. the progenitors of planetary nebulae (PNe). One of the most pressing problems in galactic chemical evolution today is understanding the relative roles of LIMS (1-8 M_sun) versus massive stars (8-120 M_sun) in affecting the cosmic level of the element C. We are launching a fresh, ambitious project whose purpose is to employ STIS to obtain UV spectra of unprecedented quality of 10 carefully chosen, bright solar-metallicity PNe spanning a broad range in progenitor mass. Line strength measurements of important emission lines of C, N, and O such as OIII] 1660-6, NIII] 1747-54, CIII] 1907-9, and (when He++ is strong) CIV] 1550 and OIV] 1400 in each object will be used along with our own in-house abundance software to determine ion and element abundances for these three species. In turn, these results will be used to assess stellar yields (productivity rates) available in the literature. Favored yield sets will be used to calculate our own chemical evolution models in order to assess directly the importance of intermediate-mass stars in the cosmic evolution of C.
AlQahtani, Nabeeh A; Haralur, Satheesh B; AlMaqbol, Mohammad; AlMufarrij, Ali Jubran; Al Dera, Ahmed Ali; Al-Qarni, Mohammed
2016-04-01
To determine the occurrence of smile line and maxillary tooth shape in the Saudi Arabian subpopulation, and to estimate the association between these parameters and gingival biotype. A total of 315 patients belonging to the Saudi Arabian ethnic group who fulfilled the selection criteria were randomly selected. Two frontal photographs of each patient were acquired. Tooth morphology, gingival angle, and smile line classification were determined with the ImageJ image analysis software. Gingival biotype was assessed by the probe transparency method. The obtained data were analyzed with SPSS 19 (IBM Corporation, New York, USA) software to determine the frequency of, and association between, the other parameters and gingival biotype. Among the clinical parameters evaluated, tapering tooth morphology (56.8%), thick gingival biotype (53%), and an average smile line (57.5%) were the most prevalent. A statistically significant association was found between thick gingival biotype and square tooth shape and a high smile line. A high gingival angle was associated with thin gingival biotype. The study results indicate the existence of an association between tooth shape, smile line, and gingival angle and gingival biotype.
Application of External Axis in Robot-Assisted Thermal Spraying
NASA Astrophysics Data System (ADS)
Deng, Sihao; Fang, Dandan; Cai, Zhenhua; Liao, Hanlin; Montavon, Ghislain
2012-12-01
Currently, industrial robots are widely used in thermal spraying because of their high efficiency, security, and repeatability. Although robots are well suited to industrial production, they have some inherent disadvantages because of their six-axis mechanical linkages. When a robot performs a series of production stages, it can be hard to move from one to another because some axes reach their limit values. For this reason, an external axis can be added to the robot system to extend the robot's reachable space. This article concerns the application of an external axis with ABB robots in thermal spraying and the different methods of off-line programming with an external axis in a virtual environment. The developed software toolkit was applied to coat a real workpiece with a complex geometry in atmospheric plasma spraying.
Intelligent Software for System Design and Documentation
NASA Technical Reports Server (NTRS)
2002-01-01
In an effort to develop a real-time, on-line database system that tracks documentation changes in NASA's propulsion test facilities, engineers at Stennis Space Center teamed with ECT International of Brookfield, WI, through the NASA Dual-Use Development Program to create the External Data Program and Hyperlink Add-on Modules for the promis*e software. Promis*e is ECT's top-of-the-line intelligent software for control system design and documentation. With promis*e the user can make use of the automated design process to quickly generate control system schematics, panel layouts, bills of material, wire lists, terminal plans and more. NASA and its testing contractors currently use promis*e to create the drawings and schematics at the E2 Cell 2 test stand located at Stennis Space Center.
First year of ALMA site software deployment: where everything comes together
NASA Astrophysics Data System (ADS)
González, Víctor; Mora, Matias; Araya, Rodrigo; Arredondo, Diego; Bartsch, Marcelo; Burgos, Pablo; Ibsen, Jorge; Reveco, Johnny; Sáez, Norman; Schemrl, Anton; Sepulveda, Jorge; Shen, Tzu-Chiang; Soto, Rubén; Troncoso, Nicolás; Zambrano, Mauricio; Barriga, Nicolás; Glendenning, Brian; Raffi, Gianni; Kern, Jeff
2010-07-01
Starting in 2009, the ALMA project initiated one of its most exciting phases within construction: the first antenna from one of the vendors was delivered to the Assembly, Integration and Verification team. With this milestone and the closure of the ALMA Test Facility in New Mexico, the JAO Computing Group in Chile found itself in the front line of the project's software deployment and integration effort. Among the group's main responsibilities are the deployment, configuration and support of the observation systems, in addition to infrastructure administration, all of which needs to be done in close coordination with the development groups in Europe, North America and Japan. Software support has been the primary interaction point with the current users (mainly scientists, operators and hardware engineers), as the software is normally the most visible part of the system. During this first year of work with the production hardware, three consecutive software releases have been deployed and commissioned. Also, the first three antennas have been moved to the Array Operations Site, at 5,000 meters elevation, and the complete end-to-end system has been successfully tested. This paper shares the experience of this 15-person group as part of the construction team at the ALMA site, working together with the Computing IPT, on the achievements and problems overcome during this period. It explores the excellent results of teamwork, and also some of the troubles that such a complex and geographically distributed project can run into. Finally, it addresses the challenges still to come, with the transition to the ALMA operations plan.
Features of the Upgraded Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) Software
NASA Technical Reports Server (NTRS)
Mason, Michelle L.; Rufer, Shann J.
2016-01-01
The Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) software is used at the NASA Langley Research Center to analyze global aeroheating data on wind tunnel models tested in the Langley Aerothermodynamics Laboratory. One-dimensional, semi-infinite heating data derived from IHEAT are used in the design of thermal protection systems for hypersonic vehicles that are exposed to severe aeroheating loads, such as reentry vehicles during descent and landing procedures. This software program originally was written in the PV-WAVE® programming language to analyze phosphor thermography data from the two-color, relative-intensity system developed at Langley. To increase the efficiency, functionality, and reliability of IHEAT, the program was migrated to MATLAB® syntax and compiled as a stand-alone executable file labeled version 4.0. New features of IHEAT 4.0 include the options to perform diagnostic checks of the accuracy of the acquired data during a wind tunnel test, to extract data along a specified multi-segment line following a feature such as a leading edge or a streamline, and to batch process all of the temporal frame data from a wind tunnel run. Results from IHEAT 4.0 were compared on a pixel level to the output images from the legacy software to validate the program. The absolute differences between the heat transfer data output from the two programs were on the order of 10^-5 to 10^-7. IHEAT 4.0 replaces the PV-WAVE® version as the production software for aeroheating experiments conducted in the hypersonic facilities at NASA Langley.
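A pixel-level regression check of the kind described can be reduced to a few lines of array arithmetic; the sketch below is a generic illustration in Python/NumPy, not the validation code used for IHEAT.

    import numpy as np

    def compare_images(legacy, new, tol=1e-4):
        """Report absolute pixel differences between two heat-transfer images."""
        legacy = np.asarray(legacy, dtype=float)
        new = np.asarray(new, dtype=float)
        diff = np.abs(new - legacy)
        return {"max_abs_diff": float(diff.max()),
                "mean_abs_diff": float(diff.mean()),
                "pixels_over_tol": int(np.count_nonzero(diff > tol))}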
Crackscope : automatic pavement cracking inspection system.
DOT National Transportation Integrated Search
2008-08-01
The CrackScope system is an automated pavement crack rating system consisting of a digital line scan camera, laser-line illuminator, and proprietary crack detection and classification software. CrackScope is able to perform real-time pavement ins...
Software process improvement in the NASA software engineering laboratory
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin
1994-01-01
The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.
The Very Large Array Data Processing Pipeline
NASA Astrophysics Data System (ADS)
Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako
2018-01-01
We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA), and Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface in verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500 hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline. Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science ready data products. The pipeline is developed as part of the CASA software package by an international consortium of scientists and software developers based at the National Radio Astronomy Observatory (NRAO), the European Southern Observatory (ESO), and the National Astronomical Observatory of Japan (NAOJ).
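The task-hierarchy-plus-context pattern described above can be sketched as follows; the class and attribute names are illustrative assumptions, not the actual CASA/VLA pipeline classes.

    class Context:
        """Shared state threaded through one pipeline run."""
        def __init__(self):
            self.results = []      # per-stage results, in execution order
            self.decisions = {}    # heuristic choices recorded by each task

    class PipelineTask:
        name = "base"
        def run(self, context):
            result = self.execute(context)
            context.results.append((self.name, result))
            return result
        def execute(self, context):
            raise NotImplementedError

    class FlagBadData(PipelineTask):
        name = "flagging"
        def execute(self, context):
            context.decisions[self.name] = "thresholds chosen heuristically"
            return {"flagged_fraction": 0.0}   # placeholder result

    def run_pipeline(tasks, context=None):
        context = context or Context()
        for task in tasks:
            task.run(context)
        return context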
Modular Analytical Multicomponent Analysis in Gas Sensor Arrays
Chaiyboun, Ali; Traute, Rüdiger; Kiesewetter, Olaf; Ahlers, Simon; Müller, Gerhard; Doll, Theodor
2006-01-01
A multi-sensor system is a chemical sensor system which quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation. It is a modular software package for acquiring data from different sensors. A signal evaluation algorithm referred to as the matrix method was used specifically for the software tool. This matrix method computes the gas concentrations from the signals of a sensor array. The software tool was used for the simulation of an array of five sensors to determine gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the simulated sensor array indicate that the software tool is capable of the following: (a) identifying a gas independently of its concentration; (b) estimating the concentration of the gas, even if the system was not previously exposed to this concentration; (c) telling when a gas concentration exceeds a certain value. A gas sensor database was built for the configuration of the software. With the database one can create, generate and manage scenarios and source files for the simulation. Using the gas sensor database and the simulation software, an on-line Web-based version was developed, with which the user can configure and simulate sensor arrays on-line.
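The matrix method lends itself to a compact illustration: if each sensor responds approximately linearly, the array signal vector is s = A·c, where A holds per-gas sensitivities, and concentrations follow from a least-squares inversion. The sketch below is a schematic Python version under that linearity assumption; the sensitivity matrix would come from calibration and is not given here.

    import numpy as np

    GASES = ["CH4", "NH3", "H2", "CO", "C2H5OH"]

    def concentrations(signals, sensitivity_matrix):
        """signals: shape (n_sensors,); sensitivity_matrix: (n_sensors, n_gases)."""
        c, *_ = np.linalg.lstsq(sensitivity_matrix, signals, rcond=None)
        return dict(zip(GASES, np.clip(c, 0.0, None)))   # suppress negative estimates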
A SPDS Node to Support the Systematic Interpretation of Cosmic Ray Data
NASA Technical Reports Server (NTRS)
1997-01-01
The purpose of this project was to establish and maintain a Space Physics Data System (SPDS) node that supports the analysis and interpretation of current and future galactic cosmic ray (GCR) measurements by (1) providing on-line databases relevant to GCR propagation studies; (2) providing other on-line services, such as anonymous FTP access, mail list service and pointers to e-mail address books, to support the cosmic ray community; (3) providing a mechanism for those in the community who might wish to submit similar contributions for public access; (4) maintaining the node to assure that the databases remain current; and (5) investigating other possibilities, such as CD-ROM, for public dissemination of the data products. Shortly after the original grant to support these activities was established at Louisiana State University, a detailed study of alternate choices for the node hardware was initiated. The chosen hardware was an Apple Workgroup Server 9150/120 consisting of a 120 MHz PowerPC 601 processor, 32 MB of memory, two 1 GB disks and one 2 GB disk. This hardware was ordered and installed and has been operating reliably ever since. A preliminary version of the database server was available during the first year effort and was used as part of the very successful SPDS demonstration during the International Cosmic Ray Conference in Rome, Italy. For this server version we were able to establish the HTML and anonymous FTP server software, develop a Web page structure which can be easily modified to include new items, provide an on-line database of charge-changing total cross sections, include the cross section prediction software of Silberberg & Tsao as well as Webber, Kish and Schrier for download access, and provide an on-line bibliography of the cross section measurement references by the Transport Collaboration. The preliminary version of this SPDS Cosmic Ray node was examined by members of the C&H SPDS committee and returned comments were used to refine the implementation.
Integrating multisource land use and land cover data
Wright, Bruce E.; Tait, Mike; Lins, K.F.; Crawford, J.S.; Benjamin, S.P.; Brown, Jesslyn F.
1995-01-01
As part of the U.S. Geological Survey's (USGS) land use and land cover (LULC) program, the USGS in cooperation with the Environmental Systems Research Institute (ESRI) is collecting and integrating LULC data for a standard USGS 1:100,000-scale product. The LULC data collection techniques include interpreting spectrally clustered Landsat Thematic Mapper (TM) images; interpreting 1-meter resolution digital panchromatic orthophoto images; and, for comparison, aggregating locally available large-scale digital data of urban areas. The area selected is the Vancouver, WA-OR quadrangle, which has a mix of urban, rural agriculture, and forest land. Anticipated products include an integrated LULC prototype data set in a standard classification scheme referenced to the USGS digital line graph (DLG) data of the area and prototype software to develop digital LULC data sets. This project will evaluate a draft standard LULC classification system developed by the USGS for use with various source material and collection techniques. Federal, State, and local governments, and private sector groups will have an opportunity to evaluate the resulting prototype software and data sets and to provide recommendations. It is anticipated that this joint research endeavor will increase future collaboration among interested organizations, public and private, for LULC data collection using common standards and tools.
The research and practice of spacecraft software engineering
NASA Astrophysics Data System (ADS)
Chen, Chengxin; Wang, Jinghua; Xu, Xiaoguang
2017-06-01
In order to ensure the safety and reliability of spacecraft software products, it is necessary to apply engineering management. The paper first reviews the problems in domestic and foreign spacecraft software engineering management: unsystematic planning, unclear classification management, and the lack of a continuous improvement mechanism. It then proposes a solution for software engineering management based on a system-integration approach from the perspective of the spacecraft system. Finally, an application to a spacecraft is given as an example. The research provides a reference for carrying out spacecraft software engineering management and improving software product quality.
NASA Technical Reports Server (NTRS)
Dunham, J. R. (Editor); Knight, J. C. (Editor)
1982-01-01
The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.
Pragmatic quality metrics for evolutionary software development models
NASA Technical Reports Server (NTRS)
Royce, Walker
1990-01-01
Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-made end product: (i) U.S.-origin goods (excluding software) comprise less than 10 percent of the foreign-made good (excluding software); (ii) U.S.-origin software comprises less than 10 percent of the foreign-made software; (iii) U.S.-origin technology comprises less than 10 percent of the foreign-made...
Vasilyev, K N
2013-01-01
When developing new software products and adapting existing software, project leaders have to decide which functionalities to keep, adapt or develop. They have to consider that the cost of making errors during the specification phase is extremely high. In this paper a formalised approach is proposed that considers the main criteria for selecting new software functions. The application of this approach minimises the chances of making errors in selecting the functions to implement. Based on work on software development and support projects in the area of water resources and the economic evaluation of flood damage at CH2M HILL (the developers of the flood modelling package ISIS), the author has defined seven criteria for selecting functions to be included in a software product. The approach is based on evaluating the relative significance of the functions to be included in the software product. Evaluation is achieved by considering each criterion and its weighting coefficient in turn and applying the method of normalisation. This paper includes a description of this new approach and examples of its application in the development of new software products in the area of water resources management.
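A weighted, normalised scoring of this kind reduces to a few lines; the sketch below illustrates the general scheme (criterion names, weights and scores would be supplied by the analyst) and is not the author's exact formulation.

    def rank_functions(scores, weights):
        """scores: {function: {criterion: raw_score}}; weights: {criterion: weight}."""
        criteria = list(weights)
        maxima = {c: max(s[c] for s in scores.values()) or 1.0 for c in criteria}
        ranked = {
            f: sum(weights[c] * s[c] / maxima[c] for c in criteria)   # normalised weighted sum
            for f, s in scores.items()
        }
        return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)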
Global Software Development with Cloud Platforms
NASA Astrophysics Data System (ADS)
Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya
Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to supporting an ecosystem of clients, developers and other key stakeholders.
Aura Atmospheric Data Products and Their Availability from NASA Goddard Earth Sciences DAAC
NASA Technical Reports Server (NTRS)
Ahmad, S.; Johnson, J.; Gopalan, A.; Smith, P.; Leptoukh, G.; Kempler, S.
2004-01-01
NASA's EOS-Aura spacecraft was launched successfully on July 15, 2004. The four instruments onboard the spacecraft are the Microwave Limb Sounder (MLS), the Ozone Monitoring Instrument (OMI), the Tropospheric Emission Spectrometer (TES), and the High Resolution Dynamics Limb Sounder (HIRDLS). The Aura instruments are designed to gather earth science measurements across the ultraviolet, visible, infra-red, thermal and microwave regions of the electromagnetic spectrum. Aura will provide over 70 distinct standard atmospheric data products for use in ozone layer and surface UV-B monitoring, air quality forecasting, and atmospheric chemistry and climate change studies (http://eosaura.gsfc.nasa.gov/). These products include earth-atmosphere radiances and solar spectral irradiances; total column, tropospheric, and profile measurements of ozone and other trace gases; surface UV-B flux; cloud and aerosol characteristics; and temperature, geopotential height, and water vapor profiles. The MLS, OMI, and HIRDLS data products will be archived at the NASA Goddard Earth Sciences (GES) Distributed Active Archive Center (DAAC), while data from TES will be archived at the NASA Langley Research Center DAAC. Some of the standard products which have gone through quick preliminary checks are already archived at the GES DAAC (http://daac.nsfc.nasa.gov/) and are available to the Aura science team and data validation team members for data validation, and to application and visualization software developers for testing their application modules. Once data are corrected for obvious calibration problems and partially validated using in-situ observations, they will be made available to the broader user community. This presentation will provide details of the whole suite of Aura atmospheric data products, and the timeline of the availability of the rest of the preliminary products and of the partially validated provisional products. Software and tools available for data access, visualization, and data mining will also be discussed.
Comparative analysis on flexibility requirements of typical Cryogenic Transfer lines
NASA Astrophysics Data System (ADS)
Jadon, Mohit; Kumar, Uday; Choukekar, Ketan; Shah, Nitin; Sarkar, Biswanath
2017-04-01
Cryogenic systems and their applications, primarily in large fusion devices, utilize multiple cryogen transfer lines of various sizes and complexities to transfer cryogenic fluids from the plant to the various users/applications. These transfer lines are composed of various critical sections, i.e., tee sections, elbows, flexible components, etc. The mechanical sustainability of these transfer lines under failure circumstances is a primary requirement for safe operation of the system and its applications. The transfer lines need to be designed for multiple design constraints such as line layout, support locations and space restrictions. The transfer lines are subjected to single loads and multiple load combinations, such as operational loads, seismic loads, and loads from a leak in the insulation vacuum [1]. Analytical calculations and flexibility analysis using professional software were performed for a typical transfer line without any flexible components, and the results were analysed for functional and mechanical load conditions. The failure modes were identified along the critical sections. The same transfer line was then refitted with flexible components and analysed for failure modes. The flexible components provide additional flexibility to the transfer line system and make it safe. The results obtained from the analytical calculations were compared with those obtained from the flexibility analysis software. The size and selection of the flexible components were optimized to meet the design requirements as per code.
Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis
2016-03-01
Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.
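The reported impact percentages can be computed directly from such a checklist; a minimal sketch, assuming the checklist is represented as a mapping from each requirement to the set of ISO/IEC 25010 characteristics it affects:

    def impact_percentages(checklist, characteristics):
        """checklist: {requirement: set of characteristics affected}."""
        total = len(checklist)
        return {
            ch: 100.0 * sum(ch in affected for affected in checklist.values()) / total
            for ch in characteristics
        }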
Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.
2016-01-01
Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
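The planner/scheduler arrangement described above amounts to a time-ordered event queue with rules that enqueue follow-on events; the Python sketch below illustrates that mechanism with a generic queue, not the IMM's actual medical model.

    import heapq

    def run_timeline(initial_events, rules, horizon):
        """initial_events: [(time, event)]; rules: {event: fn(time) -> [(time, event)]}."""
        queue = list(initial_events)
        heapq.heapify(queue)                     # planner: one ordered time line
        log = []
        while queue:
            time, event = heapq.heappop(queue)   # scheduler: take the next event in time
            if time > horizon:
                break
            log.append((time, event))
            for follow_up in rules.get(event, lambda t: [])(time):
                heapq.heappush(queue, follow_up)
        return log

A rule might, for example, schedule a follow-up evaluation some time after a medical event, so each Monte Carlo instance replays a different time line.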
Ferro, Myriam; Tardif, Marianne; Reguer, Erwan; Cahuzac, Romain; Bruley, Christophe; Vermat, Thierry; Nugues, Estelle; Vigouroux, Marielle; Vandenbrouck, Yves; Garin, Jérôme; Viari, Alain
2008-05-01
PepLine is a fully automated software pipeline which maps MS/MS fragmentation spectra of tryptic peptides to genomic DNA sequences. The approach is based on Peptide Sequence Tags (PSTs) obtained from partial interpretation of QTOF MS/MS spectra (first module). PSTs are then mapped onto the six-frame translations of genomic sequences (second module), giving hits. Hits are then clustered to detect potential coding regions (third module). Our work aimed at optimizing the algorithms of each component to allow the whole pipeline to proceed in a fully automated manner using raw nucleic acid sequences (i.e., genomes that have not been "reduced" to a database of ORFs or putative exon sequences). The whole pipeline was tested on controlled MS/MS spectra sets from standard proteins and from Arabidopsis thaliana chloroplast envelope samples. Our results demonstrate that PepLine competes with protein database-searching software and is fast enough to potentially tackle large data sets and/or large genomes. We also illustrate the potential of this approach for detecting the intron/exon structure of genes.
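The second module's mapping step can be pictured with a short sketch: translate the raw nucleotide sequence in all six reading frames and record where a PST occurs. The Python below assumes Biopython is available and is only an illustration of the idea, not the PepLine implementation.

    from Bio.Seq import Seq

    def six_frame_translations(dna):
        """Translate a nucleotide string in all six reading frames."""
        seq = Seq(dna.upper())
        frames = {}
        for strand, s in (("+", seq), ("-", seq.reverse_complement())):
            for offset in range(3):
                sub = s[offset:]
                sub = sub[: len(sub) - len(sub) % 3]   # keep whole codons only
                frames[(strand, offset)] = str(sub.translate())
        return frames

    def map_pst(dna, pst):
        """Return (strand, frame offset, protein position) hits for one sequence tag."""
        hits = []
        for (strand, offset), protein in six_frame_translations(dna).items():
            start = protein.find(pst)
            while start != -1:
                hits.append((strand, offset, start))
                start = protein.find(pst, start + 1)
        return hits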
ERIC Educational Resources Information Center
Curtis, Rick
This paper summarizes information about using computer hardware and software to aid in making purchase decisions that are based on user needs. The two major options in hardware are IBM-compatible machines and the Apple Macintosh line. The three basic software applications include word processing, database management, and spreadsheet applications.…
Artificial intelligence approaches to software engineering
NASA Technical Reports Server (NTRS)
Johannes, James D.; Macdonald, James R.
1988-01-01
Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but the pressure to reduce costs continues. Software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle through the use of software development techniques and methodologies in terms of changing current practices and methods. These should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported and checked by automated procedures that provide error detection, produce the documentation, and ultimately support the actual design of complex programs.
NASA Astrophysics Data System (ADS)
Herbuś, K.; Ociepka, P.
2017-08-01
This work examines the sequential control system of a technological line forming the final part of an internal transport system. The process of designing this technological line using a computer-aided approach ran concurrently in two different program environments. In the Mechatronics Concept Designer module of the Siemens NX PLM software, a 3D model of the technological line was developed and prepared for verifying the logic interrelations implemented in the control system. For this purpose, the sub-system of actuators and sensors was distinguished from the whole technological line, because its correct operation determines the correct operation of the whole system. In the virtual controller application, the algorithms governing the operation of the planned line were implemented. Both program environments were then integrated using an OPC server, which enables the exchange of data between the considered systems. The data on the state of the object and the data defining the manner and sequence of operation of the technological line are exchanged between the virtual controller and the 3D model of the technological line in real time.
Advanced fingerprint verification software
NASA Astrophysics Data System (ADS)
Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.
2016-05-01
We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.
Maruca, R F
1999-01-01
So far, Rachel Soltanoff's instincts had been right. As CEO in this fictional case study, she had successfully navigated TradeRite Software's transition from a news service for stockbrokers to a $70 million provider of shrink-wrapped software geared toward both brokers and the growing day-trader market. Now a well-financed start-up, Stock-net.com, was testing a very competitive product that traders could download directly over the Web. And TradeRite's Web site was nothing more than a collection of elaborate marketing brochures. Rachel knew she needed to start selling over the Web. But the e-commerce consultants she had hired to set up her Web store were behind schedule, and their 21-year-old CEO had just resigned. Her product manager, Lisa Bandini, was working overtime to transform TradeRite's entire product line into Web-aware applications to match Stocknet's, and Rachel had $2.5 million to launch them. But the consultants said it would take $5 million just to rent e-commerce capabilities. Ace sales VP Brian Rockart thought the company had already wasted too much time and money--money from his budget--on its Web site. Marketing VP Rob Collins thought TradeRite should focus on its core stockbroker customers. Chief Technical Officer Joe Martinez doesn't want to go ahead without a pilot project. Should Rachel try to convince Brian, Rob, and the rest of the senior management team that e-commerce is the way to go? Four commentators offer advice.
Elementary Keyboarding Software Product Reports.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
This report provides detailed product descriptions of 45 software programs designed to teach or improve the keyboarding skills of elementary school students that were identified by the MicroSIFT (Microcomputer Information and Software for Teachers) staff. The descriptions include program titles, producer names, costs, grade levels, hardware,…
15 CFR 995.27 - Format validation software testing.
Code of Federal Regulations, 2010 CFR
2010-01-01
...Format validation software testing... CERTIFICATION REQUIREMENTS FOR NOAA HYDROGRAPHIC PRODUCTS AND SERVICES... of NOAA ENC Products § 995.27 Format validation software testing. Tests shall be performed verifying...
A cost analysis of first-line chemotherapy for low-risk gestational trophoblastic neoplasia.
Shah, Neel T; Barroilhet, Lisa; Berkowitz, Ross S; Goldstein, Donald P; Horowitz, Neil
2012-01-01
To determine the optimal approach to first-line treatment for low-risk gestational trophoblastic neoplasia (GTN) using a cost analysis of 3 commonly used regimens. A decision tree of the 3 most commonly used first-line low-risk GTN treatment strategies was created, accounting for toxicities, response rates and need for second- or third-line therapy. These strategies included 8-day methotrexate (MTX)/folinic acid, weekly MTX, and pulsed actinomycin-D (act-D). Response rates, average number of cycles needed for remission, and toxicities were determined by review of the literature. Costs of each strategy were examined from a societal perspective, including the direct total treatment costs as well as the indirect lost labor production costs from work absences. Sensitivity analysis on these costs was performed using both deterministic and probabilistic cost-minimization models with the aid of decision tree software (TreeAge Pro 2011, TreeAge Inc., Williamstown, Massachusetts). We found that 8-day MTX/folinic acid is the least expensive to society, followed by pulsed act-D ($4,867 vs. $6,111 average societal cost per cure, respectively), with act-D becoming more favorable only with act-D per-cycle cost <$231, or response rate to first-line therapy > 99%. Weekly MTX is the most expensive first-line treatment strategy to society ($9,089 average cost per cure), despite being least expensive to administer per cycle, based on lower first-line response rate. Absolute societal cost of each strategy is driven by the probability of needing expensive third-line multiagent chemotherapy; however, relative cost differences are robust to sensitivity analysis over the reported range of cycle number and response rate for all therapies. Based on similar efficacy and lower societal cost, we recommend 8-day MTX/folinic acid for first-line treatment of low-risk GTN.
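The cost-minimisation arithmetic behind such a tree is simple to state: the expected societal cost of a strategy is its first-line cost plus the probability-weighted costs of falling through to second- and third-line therapy. A minimal sketch (the published analysis was built in TreeAge Pro, and all inputs would come from the literature review):

    def expected_cost(first_line_cost, p_first_line_cure,
                      second_line_cost, p_second_line_cure,
                      third_line_cost):
        """Expected cost per cure for one first-line strategy."""
        p_fail1 = 1.0 - p_first_line_cure
        p_fail2 = 1.0 - p_second_line_cure
        return (first_line_cost
                + p_fail1 * second_line_cost
                + p_fail1 * p_fail2 * third_line_cost)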
NASA Technical Reports Server (NTRS)
Currit, P. A.
1983-01-01
The Cleanroom software development methodology is designed to take the gamble out of product releases for both suppliers and receivers of the software. The ingredients of this procedure are a life cycle of executable product increments, representative statistical testing, and a standard estimate of the MTTF (Mean Time To Failure) of the product at the time of its release. A statistical approach to software product testing using randomly selected samples of test cases is considered. A statistical model is defined for the certification process which uses the timing data recorded during test. A reasonableness argument for this model is provided that uses previously published data on software product execution. Also included is a derivation of the certification model estimators and a comparison of the proposed least squares technique with the more commonly used maximum likelihood estimators.
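One common certification-style fit, included here only as an illustrative assumption rather than the paper's exact model, takes the interfailure times observed during statistical testing, assumes the MTTF grows geometrically with each correction (MTTF_k ≈ A·B^k), and estimates A and B by least squares on the logarithms:

    import numpy as np

    def fit_reliability_growth(interfailure_times):
        """Fit log t_k = log A + k log B by least squares and project the next MTTF."""
        t = np.asarray(interfailure_times, dtype=float)
        k = np.arange(1, len(t) + 1)
        slope, intercept = np.polyfit(k, np.log(t), 1)
        A, B = np.exp(intercept), np.exp(slope)
        mttf_next = A * B ** (len(t) + 1)   # projected MTTF after the next correction
        return A, B, mttf_next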
The Software Engineering Laboratory: An operational software experience factory
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon
1992-01-01
For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
NASA Astrophysics Data System (ADS)
Sadchikova, G. M.
2017-01-01
This article discusses the results of introducing the NX computer-aided design system by Siemens PLM Software into the classes of a higher education institution. The necessity of applying modern information technologies in teaching engineering students, and of selecting a suitable software product, is substantiated. The author describes the stages of studying the software modules in relation to specific courses and considers the features of NX that require the creation of standard and unified product databases. The article also gives examples of research carried out by students with the various software modules.
Software Design Improvements. Part 1; Software Benefits and Limitations
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
Computer hardware and associated software have been used for many years to process accounting information, to analyze test data, and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable? How can software quality be improved? What methodology should be applied to large and small software products to improve their design, and how can the software be verified?
Castel, Anne-Laure; Menet, Aymeric; Ennezat, Pierre-Vladimir; Delelis, François; Le Goffic, Caroline; Binda, Camille; Guerbaai, Raphaëlle-Ashley; Levy, Franck; Graux, Pierre; Tribouilloy, Christophe; Maréchaux, Sylvestre
2016-01-01
Speckle tracking can be used to measure left ventricular global longitudinal strain (GLS). To study the effect of speckle tracking software product upgrades on GLS values and intervendor consistency. Subjects (patients or healthy volunteers) underwent systematic echocardiography with equipment from Philips and GE, without a change in their position. Off-line post-processing for GLS assessment was performed with the former and most recent upgrades from these two vendors (Philips QLAB 9.0 and 10.2; GE EchoPAC 12.1 and 13.1.1). GLS was obtained in three myocardial layers with EchoPAC 13.1.1. Intersoftware and intervendor consistency was assessed. Interobserver variability was tested in a subset of patients. Among 73 subjects (65 patients and 8 healthy volunteers), absolute values of GLS were higher with QLAB 10.2 compared with 9.0 (intraclass correlation coefficient [ICC]: 0.88; bias: 2.2%). Agreement between EchoPAC 13.1.1 and 12.1 varied by myocardial layer (13.1.1 only): midwall (ICC: 0.95; bias: -1.1%), endocardium (ICC: 0.93; bias: 1.6%) and epicardial (ICC: 0.80; bias: -3.3%). Although GLS was comparable for QLAB 9.0 versus EchoPAC 12.1 (ICC: 0.95; bias: 0.5%), the agreement was lower between QLAB 10.2 and EchoPAC 13.1.1 endocardial (ICC: 0.91; bias: 1.1%), midwall (ICC: 0.73; bias: 3.9%) and epicardial (ICC: 0.54; bias: 6.0%). Interobserver variability of all software products in a subset of 20 patients was excellent (ICC: 0.97-0.99; bias: -0.8 to 1.0%). Upgrades of speckle tracking software may be associated with significant changes in GLS values, which could affect intersoftware and intervendor consistency. This finding has important clinical implications for the longitudinal follow-up of patients with speckle tracking echocardiography. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
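For readers unfamiliar with the agreement statistics quoted above, the following sketch computes a mean bias and a two-way intraclass correlation coefficient for paired GLS measurements from two software versions. The GLS values are invented, and the ICC variant shown (two-way random effects, absolute agreement, single measurement) is one common choice, not necessarily the one used in the study.

```python
# Hedged sketch: mean bias and ICC(2,1) for paired measurements from two
# software versions. GLS values below are invented for illustration.
import numpy as np

gls_v1 = np.array([-18.2, -20.1, -15.4, -22.0, -17.8, -19.5])  # e.g. older version
gls_v2 = np.array([-20.0, -22.3, -17.1, -24.1, -19.6, -21.4])  # e.g. newer version

bias = np.mean(gls_v2 - gls_v1)

data = np.column_stack([gls_v1, gls_v2])      # subjects x raters
n, k = data.shape
grand = data.mean()
ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)   # between subjects
ms_cols = n * np.sum((data.mean(axis=0) - grand) ** 2) / (k - 1)   # between raters
sse = np.sum((data - data.mean(axis=1, keepdims=True)
              - data.mean(axis=0, keepdims=True) + grand) ** 2)
ms_err = sse / ((n - 1) * (k - 1))

icc_2_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                + k * (ms_cols - ms_err) / n)
print(f"bias = {bias:.2f} %, ICC(2,1) = {icc_2_1:.2f}")
```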
DOE Office of Scientific and Technical Information (OSTI.GOV)
AlperEker; Mark Giammattia; Paul Houpt
''Intelligent Extruder'' described in this report is a software system and associated support services for monitoring and control of compounding extruders to improve material quality, reduce waste and energy use, with minimal addition of new sensors or changes to the factory floor system components. Emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the :finishing'' stage of high value added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers materials. Themore » project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pleiderer, USA, a major manufacturer of compounding equipment. Scope of the program included development of a algorithms for monitoring process material viscosity without rheological sensors or generating waste streams, a novel detection scheme for rapid detection of process upsets and an adaptive feedback control system to compensate for process upsets where at line adjustments are feasible. Software algorithms were implemented and tested on a laboratory scale extruder (50 lb/hr) at GE Global Research and data from a production scale system (2000 lb/hr) at GE Plastics was used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.« less
gPhoton: The GALEX Photon Data Archive
NASA Astrophysics Data System (ADS)
Million, Chase; Fleming, Scott W.; Shiao, Bernie; Seibert, Mark; Loyd, Parke; Tucker, Michael; Smith, Myron; Thompson, Randy; White, Richard L.
2016-12-01
gPhoton is a new database product and software package that enables analysis of GALEX ultraviolet data at the photon level. The project’s stand-alone, pure-Python calibration pipeline reproduces the functionality of the original mission pipeline to reduce raw spacecraft data to lists of time-tagged, sky-projected photons, which are then hosted in a publicly available database at the Mikulski Archive for Space Telescopes. This database contains approximately 130 terabytes of data describing approximately 1.1 trillion sky-projected events with a timestamp resolution of five milliseconds. A handful of Python and command-line modules serve as a front end to interact with the database and to generate calibrated light curves and images from the photon-level data at user-defined temporal and spatial scales. The gPhoton software and source code are in active development and publicly available under a permissive license. We describe the motivation, design, and implementation of the calibration pipeline, database, and tools, with emphasis on divergence from prior work, as well as challenges created by the large data volume. We summarize the astrometric and photometric performance of gPhoton relative to the original mission pipeline. For a brief example of short time-domain science capabilities enabled by gPhoton, we show new flares from the known M-dwarf flare star CR Draconis. The gPhoton software has permanent object identifiers with the ASCL (ascl:1603.004) and DOI (doi:10.17909/T9CC7G). This paper describes the software as of version v1.27.2.
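The core operation such tools perform, turning time-tagged photon lists into light curves at a user-chosen time bin, can be illustrated with a short sketch. This is a generic binning example on simulated photon arrival times, not the gPhoton API or its calibration steps.

```python
# Hedged sketch: bin a list of time-tagged photon events into a light curve.
# The photon list is simulated; real gPhoton data come from the MAST database.
import numpy as np

rng = np.random.default_rng(0)
t_events = np.sort(rng.uniform(0.0, 300.0, size=5000))  # photon arrival times (s)

bin_width = 10.0  # seconds, user-defined temporal scale
edges = np.arange(0.0, t_events.max() + bin_width, bin_width)
counts, _ = np.histogram(t_events, bins=edges)

count_rate = counts / bin_width                 # counts per second
rate_err = np.sqrt(counts) / bin_width          # Poisson uncertainty
bin_centers = 0.5 * (edges[:-1] + edges[1:])

for t, r, e in list(zip(bin_centers, count_rate, rate_err))[:5]:
    print(f"t = {t:6.1f} s  rate = {r:5.2f} +/- {e:4.2f} cps")
```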
A method for tailoring the information content of a software process model
NASA Technical Reports Server (NTRS)
Perkins, Sharon; Arend, Mark B.
1990-01-01
The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) during the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) select the information products which match or support the accepted processes and products of step 4; and (6) select the design methodology which produces the information products selected in step 5.
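Steps 4 and 5 of the method, mapping quality criteria to accepted processes and products and then selecting information products, are essentially lookup structures. The fragment below sketches one possible encoding as plain dictionaries; all criterion, process, and product names are hypothetical, not taken from the paper.

```python
# Hedged sketch: encode the criterion -> processes/products mapping of step 4
# as lookup tables, then select information products for a quality profile.
# All names are hypothetical examples.

CRITERION_TO_PROCESSES = {
    "reliability": ["code inspection", "statistical testing"],
    "maintainability": ["design review", "coding standard audit"],
    "usability": ["prototype evaluation"],
}

PROCESS_TO_PRODUCTS = {
    "code inspection": ["inspection report"],
    "statistical testing": ["test plan", "reliability estimate"],
    "design review": ["design document", "review minutes"],
    "coding standard audit": ["audit checklist"],
    "prototype evaluation": ["usability report"],
}

def select_products(quality_criteria):
    """Steps 4-5: map criteria to processes, then collect their products."""
    products = set()
    for criterion in quality_criteria:
        for process in CRITERION_TO_PROCESSES.get(criterion, []):
            products.update(PROCESS_TO_PRODUCTS.get(process, []))
    return sorted(products)

print(select_products(["reliability", "usability"]))
```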
A new hyperspectral imaging based device for quality control in plastic recycling
NASA Astrophysics Data System (ADS)
Bonifazi, G.; D'Agostini, M.; Dall'Ava, A.; Serranti, S.; Turioni, F.
2013-05-01
The quality control of the contamination level in recycled plastics streams has been identified as a key factor for increasing the value of the recycled material by both the plastic recycling and compounding industries. Existing quality control methods for the detection of both plastic and non-plastic contaminants in plastic waste streams at different stages of the industrial process (e.g. feed, intermediate and final products) are currently based on the manual collection of a sample from the stream and on subsequent off-line laboratory analyses. The results of such analyses are usually available some hours, or sometimes even some days, after the material has been processed. The laboratory analyses are time-consuming and expensive (both in terms of equipment cost and maintenance and of labour cost). Therefore, a fast on-line assessment to monitor the plastic waste feed streams and to characterize the composition of the different plastic products is fundamental to increasing the value of secondary plastics. This paper describes and evaluates the development of an HSI-based device and of the related software architectures and processing algorithms for quality assessment of plastics in recycling plants, with particular reference to polyolefins (PO). NIR-HSI sensing devices coupled with multivariate data analysis methods were demonstrated as an objective, rapid and non-destructive technique that can be used for on-line quality and process control in the recycling of POs. In particular, the adoption of the previously mentioned integrated hardware and software (HD&SW) architectures can provide a solution to one of the major problems of the recycling industry, which is the lack of an accurate quality certification of materials obtained by recycling processes. These results could therefore assist in developing strategies to certify the composition of recycled PO products.
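The "multivariate data analysis" coupled to NIR-HSI sensing typically amounts to classifying each pixel spectrum into a material class. The sketch below shows one common approach, a PLS-DA-style classifier built from scikit-learn components and trained on synthetic spectra; the spectral data, class labels, and preprocessing are assumptions for illustration, not the processing chain of the device described above.

```python
# Hedged sketch: classify NIR pixel spectra as polyolefin vs. contaminant with a
# PLS-DA-style model. Spectra are synthetic; real systems use calibrated HSI cubes.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
n_bands = 50

def synth_spectra(center, n):
    """Gaussian-shaped absorption feature plus noise, standing in for real spectra."""
    base = np.exp(-0.5 * ((np.arange(n_bands) - center) / 5.0) ** 2)
    return base + 0.05 * rng.standard_normal((n, n_bands))

X = np.vstack([synth_spectra(15, 100), synth_spectra(35, 100)])  # PO vs contaminant
y = np.r_[np.ones(100), np.zeros(100)]                           # 1 = polyolefin

model = PLSRegression(n_components=3).fit(X, y)

new_pixels = np.vstack([synth_spectra(15, 3), synth_spectra(35, 3)])
scores = model.predict(new_pixels).ravel()
labels = (scores > 0.5).astype(int)
print("predicted labels (1 = PO):", labels)
```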
Contemporary issues in HIM. Software engineering--what does it mean to you?
Wear, L L
1994-02-01
There have been significant advances in the way we develop software in the last two decades. Many companies are using the new process oriented approach to software development. Companies that use the new techniques and tools have reported improvements in both productivity and quality, but there are still companies developing software the way we did 30 years ago. If you saw the movie Jurassic Park, you saw the perfect way not to develop software. The programmer in the movie was the only person who knew the details of the system. No processes were followed, and there was no documentation. This was an absolutely perfect prescription for failure. Some of you are probably familiar with the term hacker which describes a person who spends hours sitting at a terminal hacking out code. Hackers have created some outstanding software products, but with today's complex systems, most companies are trying to get away from their dependence on hackers. They are instead turning to the process-oriented approach. When selecting software vendors, don't just look at the functionality of a product. Try to determine how the vendor develops software, and determine if you are dealing with hackers or a process-driven company. In the long run, you should get better, more reliable products from the latter.
Production Techniques for Computer-Based Learning Material.
ERIC Educational Resources Information Center
Moonen, Jef; Schoenmaker, Jan
Experiences in the development of educational software in the Netherlands have included the use of individual and team approaches, the determination of software content and how it should be presented, and the organization of the entire development process, from experimental programs to prototype to final product. Because educational software is a…
Records Inventory Data Collection Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Brian A.
1995-03-01
DATALINK was created to provide an easy to use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma delimited, ASCII text file for data export into most records management software products.
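The export step the abstract mentions, writing captured index data to a comma-delimited ASCII file, is straightforward to illustrate with Python's csv module; the field names, records, and output layout below are invented, not DATALINK's actual export format.

```python
# Hedged sketch: export captured record-index data to a comma-delimited ASCII file
# suitable for import into records management software. Fields are hypothetical.
import csv

records = [
    {"box_id": "B-0001", "title": "Procurement files 1992", "retention": "6 years"},
    {"box_id": "B-0002", "title": "Facility drawings",      "retention": "permanent"},
]

with open("records_export.txt", "w", newline="", encoding="ascii") as f:
    writer = csv.DictWriter(f, fieldnames=["box_id", "title", "retention"])
    writer.writeheader()
    writer.writerows(records)
```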
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-06
... International Trade Commission [Investigation No. 337-TA-795]: Certain Video Analytics Software... filed by ObjectVideo, Inc. of Reston, Virginia. 76 FR 45859 (Aug. 1, 2011). The complaint, as amended... certain video analytics software, systems, components thereof, and products containing same by reason of...
The dynamics of software development project management: An integrative systems dynamic perspective
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.; Abdel-Hamid, T.
1984-01-01
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are ripe for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.
Toward Intelligent Software Defect Detection
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2011-01-01
Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
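As a toy illustration of learning line-level defect detectors from examples, the sketch below trains a simple text classifier on token features extracted from individual source lines. The training data, features, and model choice are assumptions for illustration, not the approach described in the report.

```python
# Hedged sketch: flag individual source lines as suspicious using a classifier
# trained on labeled examples. Training data and features are invented.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_lines = [
    "strcpy(buf, user_input);",          # defect-prone: unbounded copy
    "char buf[16]; gets(buf);",          # defect-prone: unsafe input
    "strncpy(buf, user_input, sizeof(buf) - 1);",
    "if (ptr != NULL) { free(ptr); ptr = NULL; }",
]
labels = [1, 1, 0, 0]  # 1 = defect-prone, 0 = acceptable

model = make_pipeline(CountVectorizer(token_pattern=r"[A-Za-z_]+|\S"),
                      LogisticRegression())
model.fit(train_lines, labels)

new_lines = ["gets(password);", "fgets(password, sizeof(password), stdin);"]
for line, p in zip(new_lines, model.predict_proba(new_lines)[:, 1]):
    print(f"{p:.2f}  {line}")
```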
AlQahtani, Nabeeh A.; Haralur, Satheesh B.; AlMaqbol, Mohammad; AlMufarrij, Ali Jubran; Al Dera, Ahmed Ali; Al-Qarni, Mohammed
2016-01-01
Objectives: To determine the occurrence of smile line and maxillary tooth shape in the Saudi Arabian subpopulation, and to estimate the association between these parameters with gingival biotype. Materials and Methods: On the fulfillment of selection criteria, total 315 patients belong to Saudi Arabian ethnic group were randomly selected. Two frontal photographs of the patients were acquired. The tooth morphology, gingival angle, and smile line classification were determined with ImageJ image analyzing software. The gingival biotype was assessed by probe transparency method. The obtained data were analyzed with SPSS 19 (IBM Corporation, New York, USA) software to determine the frequency and association between other parameters and gingival biotype. Results: Among the clinical parameters evaluated, the tapering tooth morphology (56.8%), thick gingival biotype (53%), and average smile line (57.5%) was more prevalent. The statistically significant association was found between thick gingival biotype and the square tooth, high smile line. The high gingival angle was associated with thin gingival biotype. Conclusions: The study results indicate the existence of an association between tooth shape, smile line, and gingival angle with gingival biotype. PMID:27195228
Laserprinter applications in a medical graphics department.
Lynch, P J
1987-01-01
Our experience with the Apple Macintosh and LaserWriter equipment has convinced us that lasergraphics holds much current and future promise in the creation of line graphics and typography for the biomedical community. Although we continue to use other computer graphics equipment to produce color slides and an occasional pen-plotter graphic, the most rapidly growing segment of our graphics workload is in material well-suited to production on the Macintosh/LaserWriter system. At present our goal is to integrate all of our computer graphics production (color slides, video paint graphics and monochrome print graphics) into a single Macintosh-based system within the next two years. The software and hardware currently available are capable of producing a wide range of science graphics very quickly and inexpensively. The cost-effectiveness, versatility and relatively low initial investment required to install this equipment make it an attractive alternative for cost-recovery departments just entering the field of computer graphics.
The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button.
Swertz, Morris A; Dijkstra, Martijn; Adamusiak, Tomasz; van der Velde, Joeri K; Kanterakis, Alexandros; Roos, Erik T; Lops, Joris; Thorisson, Gudmundur A; Arends, Danny; Byelas, George; Muilu, Juha; Brookes, Anthony J; de Brock, Engbert O; Jansen, Ritsert C; Parkinson, Helen
2010-12-21
There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS' generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This 'model-driven' method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist's satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases can be quickly enhanced with MOLGENIS generated interfaces using the 'ExtractModel' procedure. The MOLGENIS toolkit provides bioinformaticians with a simple model to quickly generate flexible web platforms for all possible genomic, molecular and phenotypic experiments with a richness of interfaces not provided by other tools. All the software and manuals are available free as LGPLv3 open source at http://www.molgenis.org.
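The model-driven idea, a small declarative model expanded by generators into much larger artifacts, can be conveyed with a miniature example. The model format and the generated SQL below are invented stand-ins, not MOLGENIS's actual XML schema or generator output.

```python
# Hedged sketch: expand a tiny declarative data model into SQL DDL, in the spirit
# of generator-based toolkits. The model format and output are illustrative only.
model = {
    "Sample": {"id": "int", "tissue": "string", "collected_on": "date"},
    "Measurement": {"id": "int", "sample_id": "int", "value": "float"},
}

SQL_TYPES = {"int": "INTEGER", "string": "VARCHAR(255)",
             "date": "DATE", "float": "DOUBLE PRECISION"}

def generate_ddl(model):
    """Turn each entity in the model into a CREATE TABLE statement."""
    statements = []
    for entity, fields in model.items():
        cols = ",\n  ".join(f"{name} {SQL_TYPES[ftype]}"
                            for name, ftype in fields.items())
        statements.append(f"CREATE TABLE {entity} (\n  {cols}\n);")
    return "\n\n".join(statements)

print(generate_ddl(model))
```

The point of the pattern is leverage: a few lines of model stand in for a much larger body of generated database, interface, and documentation code, which is what the 500-versus-15,000-line figure in the abstract reflects.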
Software Product Data (SPD) Current Environment Report
DOT National Transportation Integrated Search
1990-04-01
This report describes the Air Force organization and functions employed in the acquisition, use, and management of Software Product Data (SPD). The flow of data among the Air Force and contractors during the design, development, and post-production p...
Software quality: Process or people
NASA Technical Reports Server (NTRS)
Palmer, Regina; Labaugh, Modenna
1993-01-01
This paper presents data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.
NASA Technical Reports Server (NTRS)
1992-01-01
This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.
Design and Development of a High Speed Sorting System Based on Machine Vision Guiding
NASA Astrophysics Data System (ADS)
Zhang, Wenchang; Mei, Jiangping; Ding, Yabin
In this paper, a vision-based control strategy for performing high-speed pick-and-place tasks on an automated product line is proposed, and the relevant control software is developed. A Delta robot controls a suction gripper that grasps disordered objects from one moving conveyor and places them on another in order. A CCD camera captures one image each time the conveyor moves a distance ds, and object positions and shapes are obtained by image processing. A target tracking method based on a servo motor synchronized with the conveyor is used to perform the high-speed transfer operation in real time. Experiments conducted on the Delta robot sorting system demonstrate the efficiency and validity of the proposed vision-control strategy.
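The conveyor-synchronized tracking idea amounts to correcting each camera detection by how far the belt has moved since the image was captured. The sketch below shows that correction with invented calibration numbers; it is not the controller code of the system described above.

```python
# Hedged sketch: predict the current pick position of an object detected earlier
# by the camera, using the conveyor displacement since the image was captured.
# Calibration values are invented.

MM_PER_PIXEL = 0.42          # camera scale factor (assumed)
BELT_DIRECTION = (1.0, 0.0)  # conveyor moves along +x in robot coordinates

def pick_position(pixel_xy, encoder_at_image_mm, encoder_now_mm):
    """Convert an image detection into a robot-frame pick position 'now'."""
    x_img = pixel_xy[0] * MM_PER_PIXEL
    y_img = pixel_xy[1] * MM_PER_PIXEL
    travel = encoder_now_mm - encoder_at_image_mm
    return (x_img + BELT_DIRECTION[0] * travel,
            y_img + BELT_DIRECTION[1] * travel)

# object seen at pixel (640, 220) when the encoder read 1250 mm; belt now at 1460 mm
print(pick_position((640, 220), 1250.0, 1460.0))
```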
Designed tools for analysis of lithography patterns and nanostructures
NASA Astrophysics Data System (ADS)
Dervillé, Alexandre; Baderot, Julien; Bernard, Guilhem; Foucher, Johann; Grönqvist, Hanna; Labrosse, Aurélien; Martinez, Sergio; Zimmermann, Yann
2017-03-01
We introduce a set of designed tools for the analysis of lithography patterns and nanostructures. The classical metrological analysis of these objects has the drawbacks of being time-consuming, requiring manual tuning, and lacking robustness and user friendliness. With the goal of improving the current situation, we propose new image processing tools at different levels: semi-automatic, automatic, and machine-learning-enhanced tools. The complete set of tools has been integrated into a software platform designed to transform the lab into a virtual fab. The underlying idea is to master nano processes at the research and development level by accelerating access to knowledge and hence speeding up implementation in product lines.
LANDSAT-4 MSS Geometric Correction: Methods and Results
NASA Technical Reports Server (NTRS)
Brooks, J.; Kimmer, E.; Su, J.
1984-01-01
An automated image registration system such as that developed for LANDSAT-4 can produce all of the information needed to verify and calibrate the software and to evaluate system performance. The on-line MSS archive generation process which upgrades systematic correction data to geodetic correction data is described, as well as the control point library build subsystem which generates control point chips and support data for on-line upgrade of correction data. The system performance was evaluated for both temporal and geodetic registration. For temporal registration, 90% errors were computed to be .36 IFOV (instantaneous field of view; 1 IFOV = 82.7 meters) cross track, and .29 IFOV along track. Also, for actual production runs monitored, the 90% errors were .29 IFOV cross track and .25 IFOV along track. The system specification is .3 IFOV, 90% of the time, both cross and along track. For geodetic registration performance, the model bias was measured by designating control points in the geodetically corrected imagery.
Making the most of on-line recruiting.
Cappelli, P
2001-03-01
Ninety percent of large U.S. companies are already recruiting via the Internet. By simply logging on to the Web, company recruiters can locate vast numbers of qualified candidates for jobs at every level, screen them in minutes, and contact the most promising ones immediately. The payoffs can be enormous: it costs substantially less to hire someone on-line, and the time saved is equally great. In this article, Peter Cappelli examines some of the emerging service providers and technologies--matchmakers, job boards, hiring management systems software, and applicant-screening mechanisms that test skills and record interests. He also looks at some of the strategies companies are adopting as they enter on-line labor markets. Recruiting needs to be refashioned to resemble marketing, he stresses. Accordingly, smart companies are designing Web pages, and even product ads, with potential recruits in mind. They're giving line managers authority to hire so that candidates in cyberspace aren't lost. They're building internal on-line job networks to retain talent. Integrating recruiting efforts with overall marketing campaigns, especially through coordination and identification with the company's brand, is the most important thing companies can do to ensure success in on-line hiring. Along the way, Cappelli sounds two cautionary notes. First, a human touch, not electronic contact, is vital in the last steps of a successful hiring process. Second, companies must make sure that on-line testing and hiring criteria do not discriminate against women, disabled people, workers over 40, or members of minority groups. When competition for talent is fierce, companies that master the art and science of on-line recruiting will be the ones that attract and keep the best people.
1977-05-01
... C3I) programs; (4) simulator/trainer programs; and (5) automatic test equipment software. Each of these five types of software represents a problem... coded in the same source language, say JOVIAL, then source-language statements would be a better measure, since that would automatically compensate... whether done at no (visible) cost or by renegotiation of the contract. Fig. 2.3 illustrates these with solid lines. It is conjectured that the change...
NASA Technical Reports Server (NTRS)
2014-01-01
Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.
Roadmap for Testing and Validation of Electric Vehicle Communication Standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratt, Richard M.; Tuffner, Francis K.; Gowri, Krishnan
Vehicle-to-grid communication standards are critical to charge management and interoperability among plug-in electric vehicles (PEVs), charging stations and utility providers. The Society of Automotive Engineers (SAE), International Organization for Standardization (ISO), International Electrotechnical Commission (IEC) and the ZigBee Alliance are developing requirements for communication messages and protocols. While interoperability standards development has been in progress for more than two years, no definitive guidelines are available for the automobile manufacturers, charging station manufacturers or utility backhaul network systems. At present, there is a wide range of proprietary communication options developed and supported in the industry. Recent work by the Electric Power Research Institute (EPRI), in collaboration with SAE and automobile manufacturers, has identified performance requirements and developed a test plan based on possible communication pathways using power line communication (PLC). Though the communication pathways and power line communication technology options are identified, much work needs to be done in developing application software and testing communication modules before these can be deployed in production vehicles. This paper presents a roadmap and results from testing power line communication modules developed to meet the requirements of the SAE J2847/1 standard.
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
Product assurance policies and procedures for flight dynamics software development
NASA Technical Reports Server (NTRS)
Perry, Sandra; Jordan, Leon; Decker, William; Page, Gerald; Mcgarry, Frank E.; Valett, Jon
1987-01-01
The product assurance policies and procedures necessary to support flight dynamics software development projects for Goddard Space Flight Center are presented. The quality assurance and configuration management methods and tools for each phase of the software development life cycles are described, from requirements analysis through acceptance testing; maintenance and operation are not addressed.
The Production Data Approach for Full Lifecycle Management
NASA Astrophysics Data System (ADS)
Schopf, J.
2012-04-01
The amount of data generated by scientists is growing exponentially, and studies have shown [Koe04] that un-archived data sets have a resource half-life that is only a fraction of those resources that are electronically archived. Most groups still lack standard approaches and procedures for data management. Arguably, however, scientists know something about building software. A recent article in Nature [Mer10] stated that 45% of research scientists spend more time now developing software than they did 5 years ago, and 38% spent at least 1/5th of their time developing software. Fox argues [Fox10] that a simple release of data is not the correct approach to data curation. In addition, just as software is used in a wide variety of ways never initially envisioned by its developers, we're seeing this even to a greater extent with data sets. In order to address the need for better data preservation and access, we propose that data sets should be managed in a similar fashion to building production-quality software. These production data sets are not simply published once, but go through a cyclical process, including phases such as design, development, verification, deployment, support, analysis, and then development again, thereby supporting the full lifecycle of a data set. The process involved in academically produced software changes over time with respect to issues such as how much it is used outside the development group, but factors in aspects such as knowing who is using the code, enabling multiple developers to contribute to code development with common procedures, formal testing and release processes, developing documentation, and licensing. When we work with data, either as a collection source, as someone tagging data, or as someone re-using it, many of the lessons learned in building production software are applicable. Table 1 shows a comparison of production software elements to production data elements.
Table 1: Comparison of production software and production data (production software element | production data element):
End-user considerations | End-user considerations
Multiple coders: repository with check-in procedures, coding standards | Multiple producers/collectors: local archive with check-in procedure, metadata standards
Formal testing | Formal testing
Bug tracking and fixes | Bug tracking and fixes, QA/QC
Documentation | Documentation
Formal release process | Formal release process to external archive
License | Citation/usage statement
The full presentation of this abstract will include a detailed discussion of these issues so that researchers can produce usable and accessible data sets as a first step toward reproducible science. By creating production-quality data sets, we extend the potential of our data, both in terms of usability and usefulness to ourselves and other researchers. The more we treat data with formal processes and release cycles, the more relevant and useful it can be to the scientific community.
Support for life-cycle product reuse in NASA's SSE
NASA Technical Reports Server (NTRS)
Shotton, Charles
1989-01-01
The Software Support Environment (SSE) is a software factory for the production of Space Station Freedom Program operational software. The SSE is to be centrally developed and maintained and used to configure software production facilities in the field. The PRC product TTCQF provides for an automated qualification process and analysis of existing code that can be used for software reuse. The interrogation subsystem permits user queries of the reusable data and components which have been identified by an analyzer and qualified with associated metrics. The concept includes reuse of non-code life-cycle components such as requirements and designs. Possible types of reusable life-cycle components include templates, generics, and as-is items. Qualification of reusable elements requires analysis (separation of candidate components into primitives), qualification (evaluation of primitives for reusability according to reusability criteria) and loading (placing qualified elements into appropriate libraries). There can be different qualifications for different installations, methodologies, applications and components. Identifying reusable software and related components is labor-intensive and is best carried out as an integrated function of an SSE.
The Application of Function Points to Predict Source Lines of Code for Software Development
1992-09-01
...there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the... term and Lang variables simultaneously only added marginal improvements over models with these terms included singularly. Using all the available...
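The underlying technique, regressing delivered source lines of code on counted function points, can be shown in a few lines. The data points below are fabricated for illustration, and the single-variable fit is not the thesis's actual model set.

```python
# Hedged sketch: ordinary least squares of SLOC on function points.
# The (function points, SLOC) pairs are fabricated for illustration.
import numpy as np

fp   = np.array([120, 250, 400, 530, 710, 900], dtype=float)
sloc = np.array([14000, 27000, 45000, 60000, 78000, 101000], dtype=float)

slope, intercept = np.polyfit(fp, sloc, 1)
predicted = np.polyval([slope, intercept], fp)
r_squared = 1 - np.sum((sloc - predicted) ** 2) / np.sum((sloc - sloc.mean()) ** 2)

print(f"SLOC ~ {intercept:.0f} + {slope:.1f} * FP, R^2 = {r_squared:.3f}")
print(f"Predicted SLOC for 600 FP: {np.polyval([slope, intercept], 600):.0f}")
```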
Multi-kanban mechanism for personal computer disassembly
NASA Astrophysics Data System (ADS)
Udomsawat, Gun; Gupta, Surendra M.; Kamarthi, Sagar V.
2004-12-01
The use of personal computers (PCs) continues to increase every year. According to a 1999 figure, 50 percent of all US households owned PCs, a figure that continues to rise every year. With continuous development of sophisticated software, PCs are becoming increasingly powerful. In addition, the price of a PC continues to decline steadily. Furthermore, the typical life of a PC in the workplace is approximately two to three years, while in the home it is three to five years. As these PCs become obsolete, they are replaced and the old PCs are disposed of. It is estimated that between 14 and 20 million PCs are retired annually in the US. While 20 to 30% of the units may be resold, the others are discarded. These discards represent a significant potential source of lead for the waste stream. In some communities, waste cathode ray tubes (CRTs) represent the second largest source of lead in the waste stream after vehicular lead acid batteries. PCs are, therefore, not suitable for dumping in landfills. Moreover, several components of a PC can be reused, and there are other valuable materials that can also be harvested. And with the advent of product stewardship, product recovery is the best solution for manufacturers. A disassembly line is perhaps the most suitable setup for disassembling PCs. However, planning and scheduling of disassembly on a disassembly line is complicated. In this paper, we discuss some of the complications, including product arrival, demand arrival, inventory fluctuation and production control mechanisms. We then show how to overcome them by implementing a multi-kanban mechanism in the PC disassembly line setting. The multi-kanban mechanism relies on dynamic routing of kanbans according to the state of the system. We investigate the multi-kanban mechanism using simulation and demonstrate that this mechanism is superior to the traditional push system in terms of controlling the system's inventory while maintaining a decent customer service level.
Proceedings Second Annual Cyber Security and Information Infrastructure Research Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheldon, Frederick T; Krings, Axel; Yoo, Seong-Moo
2006-01-01
The workshop theme is Cyber Security: Beyond the Maginot Line. Recently the FBI reported that computer crime has skyrocketed, costing over $67 billion in 2005 alone and affecting 2.8M+ businesses and organizations. Attack sophistication is unprecedented along with availability of open source concomitant tools. Private, academic, and public sectors invest significant resources in cyber security. Industry primarily performs cyber security research as an investment in future products and services. While the public sector also funds cyber security R&D, the majority of this activity focuses on the specific mission(s) of the funding agency. Thus, broad areas of cyber security remain neglected or underdeveloped. Consequently, this workshop endeavors to explore issues involving cyber security and related technologies toward strengthening such areas and enabling the development of new tools and methods for securing our information infrastructure critical assets. We aim to assemble new ideas and proposals about robust models on which we can build the architecture of a secure cyberspace, including but not limited to: * Knowledge discovery and management * Critical infrastructure protection * De-obfuscating tools for the validation and verification of tamper-proofed software * Computer network defense technologies * Scalable information assurance strategies * Assessment-driven design for trust * Security metrics and testing methodologies * Validation of security and survivability properties * Threat assessment and risk analysis * Early accurate detection of the insider threat * Security hardened sensor networks and ubiquitous computing environments * Mobile software authentication protocols * A new "model" of the threat to replace the "Maginot Line" model, and more...
Dudhagara, Pravin; Tank, Shantilal
2018-01-01
The thermophilic bacterium Bacillus licheniformis U1 is used in the present study for the optimization of bacterial growth (R1), laccase production (R2) and synthetic disperse blue DBR textile dye decolorization (R3). Preliminary optimization was performed by the one-variable-at-a-time (OVAT) approach using four media components, viz., dye concentration, copper sulphate concentration, pH, and inoculum size. Based on the OVAT results, further statistical optimization of R1, R2 and R3 was performed by Box–Behnken design (BBD) using response surface methodology (RSM) in R software with the R Commander package. A total of 29 experimental runs were conducted in the experimental design study toward the construction of a quadratic model. The model indicated that a dye concentration of 110 ppm, copper sulphate 0.2 mM, pH 7.5 and an inoculum size of 6% v/v were optimal for maximizing laccase production and bacterial growth, whereas maximum dye decolorization was achieved in media containing 110 ppm dye, 0.6 mM copper sulphate, pH 6 and an inoculum size of 6% v/v. The R-predicted R2 values for R1, R2 and R3 were 0.9917, 0.9831 and 0.9703, respectively, compared with Design-Expert (Stat-Ease) (DOE)-predicted R2 values of 0.9893, 0.9822 and 0.8442. The values obtained with R software were more precise, reliable and reproducible than those from the DOE model. Laccase production increased 1.80-fold, and a 2.24-fold enhancement in dye decolorization was achieved using the optimized medium compared with the initial experiments. Moreover, the laccase-treated sample demonstrated a lower cytotoxic effect on L132 and MCF-7 cell lines than the untreated sample in the MTT assay. The higher cell viability and lower cytotoxicity observed in the laccase-treated sample suggest a potential application of bacterial laccase in reducing dye toxicity as part of a rapid biodegradation process. PMID:29718934
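The response-surface step, fitting a quadratic model to designed runs and reading off a predicted optimum, can be sketched with generic tools. For brevity the sketch uses only two of the study's factors, and the run data and predictions are invented; this is not the study's fitted model or its R Commander workflow.

```python
# Hedged sketch: fit a quadratic response surface to designed experimental runs.
# Only two factors are shown and all run data are invented.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# columns: CuSO4 (mM), pH; response: laccase activity (arbitrary units)
X = np.array([[0.2, 6.0], [0.2, 7.5], [0.2, 9.0],
              [0.4, 6.0], [0.4, 7.5], [0.4, 9.0],
              [0.6, 6.0], [0.6, 7.5], [0.6, 9.0]])
y = np.array([1.8, 2.9, 2.1, 2.3, 3.6, 2.6, 2.0, 3.0, 2.2])

quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(X), y)
print("R^2 on the design points:", round(model.score(quad.transform(X), y), 3))

candidate = np.array([[0.4, 7.5]])  # a candidate optimum to evaluate
print("Predicted response at candidate:", round(
    model.predict(quad.transform(candidate))[0], 2))
```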
ELSA: An integrated, semi-automated nebular abundance package
NASA Astrophysics Data System (ADS)
Johnson, Matthew D.; Levitt, Jesse S.; Henry, Richard B. C.; Kwitter, Karen B.
We present ELSA, a new modular software package, written in C, to analyze and manage spectroscopic data from emission-line objects. In addition to calculating plasma diagnostics and abundances from nebular emission lines, the software provides a number of convenient features including the ability to ingest logs produced by IRAF's splot task, to semi-automatically merge spectra in different wavelength ranges, and to automatically generate various data tables in machine-readable or LaTeX format. ELSA features a highly sophisticated interstellar reddening correction scheme that takes into account temperature and density effects as well as He II contamination of the hydrogen Balmer lines. Abundance calculations are performed using a 5-level atom approximation with recent atomic data, based on R. Henry's ABUN program. Downloading and detailed documentation for all aspects of ELSA are available at the following URL:
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Testing Software Development Project Productivity Model
NASA Astrophysics Data System (ADS)
Lipkin, Ilya
Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
Proceedings of the Thirteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1988-01-01
Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.
NASA Technical Reports Server (NTRS)
Mattox, J. R.; Bertsch, D. L.; Fichtel, C. E.; Hartman, R. C.; Hunter, S. D.; Kanbach, G.; Kniffen, D. A.; Kwok, P. W.; Lin, Y. C.; Mayer-Hasselwander, H. A.
1992-01-01
We describe the Energetic Gamma Ray Experiment Telescope (EGRET) data products which we anticipate will suffice for virtually all guest and archival investigations. The production process, content, availability, format, and the associated software of each product is described. Supplied here is sufficient detail for each researcher to do analysis which is not supported by extant software.
Intelligence algorithms for autonomous navigation in a ground vehicle
NASA Astrophysics Data System (ADS)
Petkovsek, Steve; Shakya, Rahul; Shin, Young Ho; Gautam, Prasanna; Norton, Adam; Ahlgren, David J.
2012-01-01
This paper will discuss the approach to autonomous navigation used by "Q," an unmanned ground vehicle designed by the Trinity College Robot Study Team to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2011 competition, Q's intelligence was upgraded in several different areas, resulting in a more robust decision-making process and a more reliable system. In 2010-2011, the software of Q was modified to operate in a modular parallel manner, with all subtasks (including motor control, data acquisition from sensors, image processing, and intelligence) running simultaneously in separate software processes using the National Instruments (NI) LabVIEW programming language. This eliminated processor bottlenecks and increased flexibility in the software architecture. Though overall throughput was increased, the long runtime of the image processing process (150 ms) reduced the precision of Q's realtime decisions. Q had slow reaction times to obstacles detected only by its cameras, such as white lines, and was limited to slow speeds on the course. To address this issue, the image processing software was simplified and also pipelined to increase the image processing throughput and minimize the robot's reaction times. The vision software was also modified to detect differences in the texture of the ground, so that specific surfaces (such as ramps and sand pits) could be identified. While previous iterations of Q failed to detect white lines that were not on a grassy surface, this new software allowed Q to dynamically alter its image processing state so that appropriate thresholds could be applied to detect white lines in changing conditions. In order to maintain an acceptable target heading, a path history algorithm was used to deal with local obstacle fields and GPS waypoints were added to provide a global target heading. These modifications resulted in Q placing 5th in the autonomous challenge and 4th in the navigation challenge at IGVC.
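Pipelining the vision stage, running acquisition and processing concurrently so a slow stage does not stall the whole control loop, can be sketched with a producer-consumer queue. This is a generic illustration in Python threads with simulated timings, not the LabVIEW implementation used on Q.

```python
# Hedged sketch: a two-stage producer/consumer pipeline so image processing does
# not block acquisition. Frame capture and "processing" delays are simulated.
import queue
import threading
import time

frames = queue.Queue(maxsize=4)

def acquire():
    for frame_id in range(10):
        time.sleep(0.02)               # pretend the camera delivers a frame every 20 ms
        frames.put(frame_id)
    frames.put(None)                   # sentinel: no more frames

def process():
    while True:
        frame_id = frames.get()
        if frame_id is None:
            break
        time.sleep(0.15)               # pretend line detection takes 150 ms
        print(f"processed frame {frame_id}")

threading.Thread(target=acquire, daemon=True).start()
process()
```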
NASA Astrophysics Data System (ADS)
Gao, Guoyou; Jiang, Chunsheng; Chen, Tao; Hui, Chun
2018-05-01
Industrial robots are widely used in various surface-manufacturing processes, such as thermal spraying. Established robot programming methods are highly time-consuming and not accurate enough to meet actual market demands, so many off-line programming methods have been developed to reduce the robot programming effort. This work introduces the principles of several robot trajectory generation strategies for planar and curved surfaces. Since off-line programming software is widely used, facilitates robot programming, and improves the accuracy of robot trajectories, the analysis in this work is based on secondary development of the off-line programming software RobotStudio™. To meet the requirements of the automotive paint industry, this kind of software extension provides special functions according to user-defined operation parameters. The presented planning strategy generates the robot trajectory by moving an orthogonal surface according to the geometry of the coating surface; a series of intersection curves are then employed to generate the trajectory points. The simulation results show that the path curve created with this method is continuous and smooth, which corresponds to the requirements of automotive spray industrial applications.
NASA Astrophysics Data System (ADS)
Kasaei, M. M.; Naeini, H. Moslemi; Tehrani, M. Salmani; Tafti, R. Azizi
2011-01-01
Cage roll forming is an advanced cold roll forming process that is widely used for producing ERW pipes. In addition to decreasing production cost and time, cage roll forming provides smooth deformation of the strip. Few studies of cage roll forming can be found because of its complexity, and the available knowledge is more experience-based than science-based. In this paper, the deformation of pipes with a low thickness-to-diameter ratio is investigated by 3D finite element simulation in the Marc-Mentat software. The edge buckling defect is particularly important in cage roll forming of pipes with a low thickness-to-diameter ratio. Because longitudinal strain directly influences the edge buckling phenomenon, longitudinal strains at the edge and center line of the strip are investigated and the high-risk stands are identified. The deformed strip is predicted using the simulation results, and the effect of each cage forming stage on the deformed strip profile is specified. In order to verify the simulation results, the strip width and the opening distance between the two edges at different forming stages are obtained from the simulations and compared with experimental data measured on the production line. Good agreement between the experimental and simulated results is observed.
Experimental control in software reliability certification
NASA Technical Reports Server (NTRS)
Trammell, Carmen J.; Poore, Jesse H.
1994-01-01
There is growing interest in software 'certification', i.e., confirmation that software has performed satisfactorily under a defined certification protocol. Regulatory agencies, customers, and prospective reusers all want assurance that a defined product standard has been met. In other industries, products are typically certified under protocols in which random samples of the product are drawn, tests characteristic of operational use are applied, analytical or statistical inferences are made, and products meeting a standard are 'certified' as fit for use. A warranty statement is often issued upon satisfactory completion of a certification protocol. This paper outlines specific engineering practices that must be used to preserve the validity of the statistical certification testing protocol. The assumptions associated with a statistical experiment are given, and their implications for statistical testing of software are described.
Advanced program development management software system. Software description and user's manual
NASA Technical Reports Server (NTRS)
1990-01-01
The objectives of this project were to apply emerging techniques and tools from the computer science discipline of paperless management to the activities of the Space Transportation and Exploration Office (PT01) in Marshall Space Flight Center (MSFC) Program Development, thereby enhancing the productivity of the workforce, the quality of the data products, and the collection, dissemination, and storage of information. The approach used to accomplish the objectives emphasized the utilization of finished form (off-the-shelf) software products to the greatest extent possible without impacting the performance of the end product, to pursue developments when necessary in the rapid prototyping environment to provide a mechanism for frequent feedback from the users, and to provide a full range of user support functions during the development process to promote testing of the software.
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
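The line-enhancement and edge-detection pass over a digital terrain model can be illustrated with a short sketch. The following Python code is not the ELF implementation; it only shows, under assumed thresholds, how a gradient-magnitude (Sobel) filter flags abrupt slope breaks of the kind associated with landslide headwall scarps.

```python
# Illustrative sketch only (not the ELF-Hazards code): a simple edge/slope-break
# detector over a digital terrain model (DTM) grid. Uses NumPy/SciPy; the
# threshold and synthetic DTM are assumptions.
import numpy as np
from scipy import ndimage

def edge_strength(dtm):
    """Gradient magnitude of the DTM via Sobel filters."""
    gx = ndimage.sobel(dtm, axis=1, mode="nearest")
    gy = ndimage.sobel(dtm, axis=0, mode="nearest")
    return np.hypot(gx, gy)

def candidate_scarps(dtm, threshold=5.0):
    """Boolean mask of cells whose slope break exceeds a (hypothetical) threshold."""
    return edge_strength(dtm) > threshold

if __name__ == "__main__":
    dtm = np.random.default_rng(0).normal(100.0, 2.0, size=(64, 64))
    dtm[30:, :] += 20.0                      # synthetic scarp step
    mask = candidate_scarps(dtm, threshold=10.0)
    print("flagged cells:", int(mask.sum()))
```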
Usage of "Powergraph" software at laboratory lessons of "general physics" department of MEPhI
NASA Astrophysics Data System (ADS)
Klyachin, N. A.; Matronchik, A. Yu.; Khangulyan, E. V.
2017-01-01
We consider the use of the "PowerGraph" software in the laboratory exercise "Study of sodium spectrum" of the physical experiment lessons. Together with the design of the experimental setup, we discuss the sodium spectra digitized with a computer audio chip. Use of "PowerGraph" in the laboratory experiment "Study of sodium spectrum" allows efficient visualization of the sodium spectrum and analysis of its fine structure. In particular, it allows quantitative measurement of the wavelengths and relative line intensities.
Porting the Starlink Software Collection to GNU Autotools
NASA Astrophysics Data System (ADS)
Gray, N.; Jenness, T.; Allan, A.; Berry, D. S.; Currie, M. J.; Draper, P. W.; Taylor, M. B.; Cavanagh, B.
2005-12-01
The Starlink software collection currently runs on three different Unix platforms and contains around 100 separate software items, totaling 2.5 million lines of code, in a mixture of languages. We have changed the build system from a hand-maintained collection of makefiles with hard-wired OS variants to a scheme involving feature-discovery via GNU Autoconf. As a result of this work, we have already ported the collection to Mac OS X and Cygwin. This had some unexpected benefits and costs, and valuable lessons.
An object oriented implementation of the Yeadon human inertia model
Dembia, Christopher; Moore, Jason K.; Hubbard, Mont
2015-01-01
We present an open source software implementation of a popular mathematical method developed by M.R. Yeadon for calculating the body and segment inertia parameters of a human body. The software is written in a high level open source language and provides three interfaces for manipulating the data and the model: a Python API, a command-line user interface, and a graphical user interface. Thus the software can fit into various data processing pipelines and requires only simple geometrical measures as input. PMID:25717365
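The paper's software is distributed as the open-source yeadon Python package. The snippet below sketches the kind of Python API usage the abstract describes; the package name, the Human constructor, and the attribute names follow the project's documentation as best recalled here, so the exact signatures should be treated as assumptions and the measurement file as a placeholder.

```python
# Hedged example of the Python API usage the abstract describes. The `yeadon`
# package name, the `Human` constructor, and the attributes below are assumed
# from the project's documentation; the measurement file is a placeholder.
import yeadon

human = yeadon.Human("measurements.txt")   # simple geometric measurements as input
print("total mass [kg]:", human.mass)      # whole-body mass
print("inertia tensor:\n", human.inertia)  # whole-body inertia about the mass center
```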
Selecting Software for Libraries.
ERIC Educational Resources Information Center
Beiser, Karl
1993-01-01
Discusses resources and strategies that libraries can use to evaluate competing database management software for purchase. Needs assessments, types of software available, features of good software, evaluation aids, shareware, and marketing and product trends are covered. (KRN)
A Comparison of Keyboarding Software for the Elementary Grades. A Quarterly Report.
ERIC Educational Resources Information Center
Nolf, Kathleen; Weaver, Dave
This paper provides generalizations and ideas on what to look for when previewing software products designed for teaching or improving the keyboarding skills of elementary school students, a list of nine products that the MicroSIFT (Microcomputer Software and Information for Teachers) staff recommends for preview, and a table of features comparing…
Software Products for Temperature Data Reduction of Platinum Resistance Thermometers (PRT)
NASA Technical Reports Server (NTRS)
Sherrod, Jerry K.
1998-01-01
The main objective of this project is to create user-friendly personal computer (PC) software for reduction/analysis of platinum resistance thermometer (PRT) data. Software products were designed and created to help users of PRT data with the tasks of using the Callendar-Van Dusen method. Sample runs are illustrated in this report.
Bioinformatics on the cloud computing platform Azure.
Shanahan, Hugh P; Owen, Anne M; Harrison, Andrew P
2014-01-01
We discuss the applicability of the Microsoft cloud computing platform, Azure, for bioinformatics. We focus on the usability of the resource rather than its performance. We provide an example of how R can be used on Azure to analyse a large amount of microarray expression data deposited at the public database ArrayExpress. We provide a walk through to demonstrate explicitly how Azure can be used to perform these analyses in Appendix S1 and we offer a comparison with a local computation. We note that the use of the Platform as a Service (PaaS) offering of Azure can represent a steep learning curve for bioinformatics developers who will usually have a Linux and scripting language background. On the other hand, the presence of an additional set of libraries makes it easier to deploy software in a parallel (scalable) fashion and explicitly manage such a production run with only a few hundred lines of code, most of which can be incorporated from a template. We propose that this environment is best suited for running stable bioinformatics software by users not involved with its development.
NASA Astrophysics Data System (ADS)
Palou, Anna; Miró, Aira; Blanco, Marcelo; Larraz, Rafael; Gómez, José Francisco; Martínez, Teresa; González, Josep Maria; Alcalà, Manel
2017-06-01
Although the feasibility of using near infrared (NIR) spectroscopy combined with partial least squares (PLS) regression to predict physico-chemical properties of biodiesel/diesel blends has been widely demonstrated, including in the calibration sets the whole variability of diesel samples from diverse production origins remains an important challenge when constructing the models. This work presents a useful strategy for the systematic selection of calibration sets of biodiesel/diesel blend samples from diverse origins, based on a binary code, principal components analysis (PCA) and the Kennard-Stone algorithm. Results show that with this methodology the models can keep their robustness over time. PLS calculations were carried out using specialized chemometric software as well as the software of the NIR instrument installed in the plant, and both produced RMSEP values below the reproducibility of the reference methods. The models were proved for on-line simultaneous determination of seven properties: density, cetane index, fatty acid methyl esters (FAME) content, cloud point, boiling point at 95% of recovery, flash point and sulphur.
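The calibration-set selection step combines PCA score projection with the Kennard-Stone algorithm. The sketch below is not the authors' code; it shows, with NumPy and scikit-learn and illustrative set sizes, how Kennard-Stone picks samples that span the PCA score space.

```python
# A minimal sketch (not the authors' implementation) of calibration-set selection:
# project spectra to PCA scores, then pick samples with the Kennard-Stone
# algorithm so the calibration set spans the score space. The number of
# components, the set size, and the random spectra are illustrative.
import numpy as np
from sklearn.decomposition import PCA

def kennard_stone(X, n_select):
    """Return indices of n_select rows of X chosen by the Kennard-Stone algorithm."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Start with the two most distant samples.
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    selected = [i, j]
    while len(selected) < n_select:
        remaining = [k for k in range(len(X)) if k not in selected]
        # Distance from each remaining sample to its nearest selected sample...
        d_min = dist[np.ix_(remaining, selected)].min(axis=1)
        # ...then pick the sample that maximizes that minimum distance.
        selected.append(remaining[int(np.argmax(d_min))])
    return selected

if __name__ == "__main__":
    spectra = np.random.default_rng(1).normal(size=(120, 700))   # placeholder NIR spectra
    scores = PCA(n_components=5).fit_transform(spectra)
    cal_idx = kennard_stone(scores, n_select=30)
    print("calibration samples:", cal_idx[:10], "...")
```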
NASA Technical Reports Server (NTRS)
Chance, Kelly
2003-01-01
This grant is an extension to our previous NASA Grant NAG5-3461, providing incremental funding to continue GOME (Global Ozone Monitoring Experiment) and SCIAMACHY (SCanning Imaging Absorption SpectroMeter for Atmospheric CHartographY) studies. This report summarizes research done under these grants through December 31, 2002. The research performed during this reporting period includes development and maintenance of scientific software for the GOME retrieval algorithms, consultation on operational software development for GOME, consultation and development for SCIAMACHY near-real-time (NRT) and off-line (OL) data products, and participation in initial SCIAMACHY validation studies. The Global Ozone Monitoring Experiment was successfully launched on the ERS-2 satellite on April 20, 1995, and remains working in normal fashion. SCIAMACHY was launched March 1, 2002 on the ESA Envisat satellite. Three GOME-2 instruments are now scheduled to fly on the Metop series of operational meteorological satellites (Eumetsat). K. Chance is a member of the reconstituted GOME Scientific Advisory Group, which will guide the GOME-2 program as well as the continuing ERS-2 GOME program.
Operator Performance Support System (OPSS)
NASA Technical Reports Server (NTRS)
Conklin, Marlen Z.
1993-01-01
In the complex and fast-reacting world of military operations, present technologies, combined with tactical situations, have flooded the operator with assorted information that he is expected to process instantly. As technologies progress, this flow of data and information has both guided and overwhelmed the operator. However, the technologies that have confounded many operators today can also be used to assist them -- hence the Operator Performance Support System. In this paper we propose an operator support station that incorporates the elements of Video and Image Databases, Productivity Software, Interactive Computer Based Training, Hypertext/Hypermedia Databases, Expert Programs, and Human Factors Engineering. The Operator Performance Support System will provide the operator with an integrated on-line information/knowledge system that will guide the expert or the novice to correct system operations. Although the OPSS is being developed for the Navy, the performance of the workforce in today's competitive industry is of major concern. The concepts presented in this paper, which address ASW systems software design issues, are also directly applicable to industry. The OPSS will propose practical applications showing how to more closely align technical knowledge with equipment operator performance.
Zhang, Xin; Zhang, Daijun; Lu, Peili; Bai, Cui; Xiao, Pengying
2011-01-01
Based on the structure of the hybrid respirometer previously developed in our group, a novel implementation for titrimetry was developed, in which two pH electrodes were installed at the inlet and outlet of the measuring cell. Software capable of digital filtering and titration time delay correction was developed in LabVIEW. The hardware and software of the titrimeter and the respirometer were integrated to construct a novel respirometry-titrimetry system. The system was applied to monitor a batch nitrification process. The obtained profiles of oxygen uptake rate (OUR) and hydrogen ion production rate (HPR) are consistent with each other and agree with the principle of the biological nitrification reaction. From the OUR and HPR measurements, the oxidized ammonium concentrations were estimated accurately. Furthermore, the endpoint of ammonium oxidation was identified with much higher sensitivity by the HPR measurement. The system could potentially be used for on-line monitoring of biochemical reactions occurring in any kind of bioreactor, because its measuring cell is completely independent of the bioreactor.
Dynamic Reconfiguration of Security Policies in Wireless Sensor Networks
Pinto, Mónica; Gámez, Nadia; Fuentes, Lidia; Amor, Mercedes; Horcas, José Miguel; Ayala, Inmaculada
2015-01-01
Providing security and privacy to wireless sensor nodes (WSNs) is very challenging, due to the heterogeneity of sensor nodes and their limited capabilities in terms of energy, processing power and memory. The applications for these systems run in a myriad of sensors with different low-level programming abstractions, limited capabilities and different routing protocols. This means that applications for WSNs need mechanisms for self-adaptation and for self-protection based on the dynamic adaptation of the algorithms used to provide security. Dynamic software product lines (DSPLs) allow managing both variability and dynamic software adaptation, so they can be considered a key technology in successfully developing self-protected WSN applications. In this paper, we propose a self-protection solution for WSNs based on the combination of the INTER-TRUST security framework (a solution for the dynamic negotiation and deployment of security policies) and the FamiWare middleware (a DSPL approach to automatically configure and reconfigure instances of a middleware for WSNs). We evaluate our approach using a case study from the intelligent transportation system domain. PMID:25746093
Abstract-Reasoning Software for Coordinating Multiple Agents
NASA Technical Reports Server (NTRS)
Clement, Bradley; Barrett, Anthony; Rabideau, Gregg; Knight, Russell
2003-01-01
A computer program for scheduling the activities of multiple agents that share limited resources has been incorporated into the Automated Scheduling and Planning Environment (ASPEN) software system, aspects of which have been reported in several previous NASA Tech Briefs articles. In the original intended application, the agents would be multiple spacecraft and/or robotic vehicles engaged in scientific exploration of distant planets. The program could also be used on Earth in such diverse settings as production lines and military maneuvers. This program includes a planning/scheduling subprogram of the iterative repair type that reasons about the activities of multiple agents at abstract levels in order to greatly improve the scheduling of their use of shared resources. The program summarizes the information about the constraints on, and resource requirements of, abstract activities on the basis of the constraints and requirements that pertain to their potential refinements (decomposition into less-abstract and ultimately to primitive activities). The advantage of reasoning about summary information is that time needed to find consistent schedules is exponentially smaller than the time that would be needed for reasoning about the same tasks at the primitive level.
Design and installation of a next generation pilot scale fermentation system.
Junker, B; Brix, T; Lester, M; Kardos, P; Adamca, J; Lynch, J; Schmitt, J; Salmon, P
2003-01-01
Four new fermenters were designed and constructed for use in secondary metabolite cultivations, bioconversions, and enzyme production. A new PC/PLC-based control system also was implemented using GE Fanuc PLCs, Genius I/O blocks, and Fix Dynamics SCADA software. These systems were incorporated into an industrial research fermentation pilot plant, designed and constructed in the early 1980s. Details of the design of these new fermenters and the new control system are described and compared with the existing installation for expected effectiveness. In addition, the reasoning behind selection of some of these features has been included. Key to the design was the goal of preserving similarity between the new and previously existing and successfully utilized fermenter hardware and software installations where feasible but implementing improvements where warranted and beneficial. Examples of enhancements include strategic use of Inconel as a material of construction to reduce corrosion, piping layout design for simplified hazardous energy isolation, on-line calculation and control of nutrient feed rates, and the use of field I/O modules located near the vessel to permit low-cost addition of new instrumentation.
Strengthening Software Authentication with the ROSE Software Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, G
2006-06-15
Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.
Concept of Operations for the ESC Product Line Approach.
1996-08-30
The document describes how the Product Line Engineering Center (PLEC) defines and evolves product line architectures with the SAG, outlines the roles of the PLEC, SAG, and PLAS, offers scenarios for asset and system development, and presents the ESC Product Line transition strategy, including product line architecture development and architecture selection.
EOS MLS Level 2 Data Processing Software Version 3
NASA Technical Reports Server (NTRS)
Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; wang, Shuhui; Manney, Gloria L.;
2011-01-01
This software accepts the calibrated EOS MLS microwave radiance products and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in parallel, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.
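The production-mode master/slave arrangement can be illustrated with a greatly simplified sketch. The Python code below stands in for the MLS cluster scheduler with a local process pool; the chunk identifiers and the per-chunk processing function are placeholders.

```python
# Greatly simplified sketch of the master/worker pattern the abstract describes,
# using Python's multiprocessing pool in place of the MLS cluster scheduler.
# The chunk structure and the "retrieval" function are placeholders.
from multiprocessing import Pool

def process_chunk(chunk_id):
    """Stand-in for retrieving one chunk of radiance data (placeholder logic)."""
    return chunk_id, f"products for chunk {chunk_id}"

if __name__ == "__main__":
    chunk_ids = range(8)                  # one entry per data chunk
    with Pool(processes=4) as pool:       # "slave" workers
        for cid, result in pool.imap_unordered(process_chunk, chunk_ids):
            print(cid, result)            # master collects results as they finish
```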
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. One must also consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute one's methods to the community using the software, and the potential for achieving automation to improve productivity.
Watanabe, Hiroshi; Nomura, Yoshikazu; Kuribayashi, Ami; Kurabayashi, Tohru
2018-02-01
We aimed to employ the Radia diagnostic software with the SEDENTEXCT (Safety and Efficacy of a New and Emerging Dental X-ray Modality) image quality (IQ) phantom in CT, and to evaluate its validity. The SEDENTEXCT IQ phantom and Radia diagnostic software were employed. The phantom was scanned using one medical full-body CT and two dentomaxillofacial cone beam CTs. The obtained images were imported into the Radia software, and the spatial resolution outputs were evaluated. The oversampling method was employed using our original wire phantom as a reference, and the resultant modulation transfer function (MTF) curves were compared. The null hypothesis was that MTF curves generated using both methods would be in agreement. One-way analysis of variance tests were applied to the f50 and f10 values from the MTF curves. The f10 values were subjectively confirmed by observing the line pair modules. The Radia software reported the MTF curves on the xy-plane of the CT scans, but could not return f50 and f10 values on the z-axis. The null hypothesis concerning the reported MTF curves on the xy-plane was rejected; there were significant differences between the results of the Radia software and our reference method, except for the f10 values for the CS9300. These findings were consistent with our line pair observations. We evaluated the validity of the Radia software with the SEDENTEXCT IQ phantom. The data provided were obtained semi-automatically, albeit with problems, and were statistically different from our reference. We hope the manufacturer will overcome these limitations.
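For readers unfamiliar with the f50 and f10 figures of merit, the short sketch below shows how they can be read off a modulation transfer function computed from a line spread function. This is illustrative only and is not the Radia software; the pixel size and the synthetic Gaussian LSF are assumptions.

```python
# Illustrative sketch (not the Radia software): compute an MTF as the magnitude
# of the Fourier transform of a line spread function (LSF), then read off f50
# and f10, the frequencies where the MTF falls to 50% and 10%.
import numpy as np

def mtf_from_lsf(lsf, pixel_mm):
    lsf = lsf / lsf.sum()                          # normalize area to 1
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                             # MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_mm)  # cycles per mm
    return freqs, mtf

def freq_at(freqs, mtf, level):
    """First frequency at which the MTF drops below the given level."""
    below = np.where(mtf < level)[0]
    return float(freqs[below[0]]) if below.size else float("nan")

if __name__ == "__main__":
    x = np.linspace(-5, 5, 256)
    lsf = np.exp(-0.5 * (x / 0.6) ** 2)            # synthetic Gaussian LSF
    freqs, mtf = mtf_from_lsf(lsf, pixel_mm=0.1)
    print("f50 =", freq_at(freqs, mtf, 0.5), "cycles/mm")
    print("f10 =", freq_at(freqs, mtf, 0.1), "cycles/mm")
```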
High/Scope Buyer's Guide to Children's Software. 11th Edition.
ERIC Educational Resources Information Center
Hohmann, Charles; And Others
This 11th edition of the High/Scope Buyer's Guide to Children's Software was designed to help teachers, caregivers, and parents make good choices when purchasing software to enhance children's learning. The book consists of an introduction, a chapter on finding the best software, software reviews for 48 different software products. The…
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-07-01
New hardware and software tools build on existing platforms and add performance and ease-of-use benefits as the struggle to find and produce hydrocarbons at the lowest cost becomes more and more competitive. Software tools now provide geoscientists and petroleum engineers with a better understanding of reservoirs, from the shape and makeup of formations to behavior projections as hydrocarbons are extracted. Petroleum software tools allow scientists to simulate oil flow, predict the life expectancy of a reservoir, and even help determine how to extend the life and economic viability of the reservoir. The requirement of the petroleum industry to find and extract petroleum more efficiently drives the solutions provided by software and service companies. To one extent or another, most of the petroleum software products available today have achieved an acceptable level of competency. Innovative, high-impact products from small, focused companies often were bought out by larger companies with deeper pockets if their developers couldn't fund their expansion. Other products disappeared from the scene because they were unable to evolve fast enough to compete. There are still enough small companies around producing excellent products to prevent the marketplace from feeling too narrow and lacking in choice. Oil companies requiring specific solutions to their problems have helped fund product development within the commercial sector. As the industry has matured, strategic alliances between vendors, both hardware and software, have provided market advantages, often combining strengths to enter new and undeveloped areas for technology. The pace of technological development has been fast and constant.
Improving Software Sustainability: Lessons Learned from Profiles in Science.
Gallagher, Marie E
2013-01-01
The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment. Sometimes these changes happen on short notice, so we continually monitor our library's software for signs of endangerment. We have attempted to replace proprietary software with suitable in-house or open source software. When the replacement involves a standalone piece of software with a nearly equivalent version, such as replacing a commercial HTTP server with an open source HTTP server, the replacement is straightforward. Recently we replaced software that functioned not only as our search engine but also as the backbone of the architecture of our Web site. In this paper, we describe the lessons learned and the pros and cons of replacing this software with open source software.
The Effect of Software Reusability on Information Theory Based Software Metrics
1990-01-01
of plans across programming languages and application areas; only a brief abstract treatment of non-contiguous "program parts" is mentioned in the...
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
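The kind of recalibratable estimation equation COSTMODL wraps can be sketched as a COCOMO-style power law with cost-driver multipliers. The Python example below is illustrative, not the COSTMODL algorithm set; the default coefficients are the classic basic-COCOMO organic-mode values and would be recalibrated from an organization's own project history.

```python
# A minimal sketch of a recalibratable effort model: a COCOMO-style power law,
# effort = a * KLOC**b, scaled by a product of effort multipliers. The default
# a and b are the classic basic-COCOMO "organic" values; an organization would
# recalibrate them from its own project history. Not the COSTMODL algorithms.
def estimate_effort(kloc, multipliers=(), a=2.4, b=1.05):
    """Return estimated effort in person-months."""
    effort = a * kloc ** b
    for m in multipliers:          # cost-driver multipliers (e.g., reliability, tools)
        effort *= m
    return effort

if __name__ == "__main__":
    # Hypothetical 32 KLOC product with two cost drivers applied.
    print(round(estimate_effort(32, multipliers=[1.15, 0.91]), 1), "person-months")
```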
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
NASA Astrophysics Data System (ADS)
Shani, Uri; Kol, Tomer; Shachor, Gal
2004-04-01
Managing medical digital information objects, and in particular medical images, is an enterprise-grade problem. Firstly, there is the sheer amount of digital data that is generated by the proliferation of digital (and film-free) medical imaging. Secondly, the managing software ought to enjoy the high availability, recoverability and manageability that are found only in the most business-critical systems. Indeed, such requirements are borrowed from the business enterprise world. Moreover, the solution for the medical information management problem should also employ the same software tools, middleware and architectures. It is safe to say that all first-line medical PACS products strive to provide a solution for all these challenging requirements. The DICOM standard has been a prime enabler of such solutions. DICOM created the interconnectivity which made it possible for a PACS service to manage millions of exams consisting of trillions of images. With the more comprehensive IHE architecture, the enterprise is expanded into a multi-facility regional conglomerate, which places extreme demands on the data management system. HIPAA legislation adds considerable challenges concerning security, privacy and other legal issues, which aggravate the situation. In this paper, we first present what in our view should be the general requirements for a first-line medical PACS, taken from the perspective of an enterprise medical imaging storage and management solution. While these requirements can be met by homegrown implementations, we suggest looking at the existing technologies which have emerged in recent years to meet exactly these challenges in the business world. We present an evolutionary process which led to the design and implementation of a medical object management subsystem. This is indeed an enterprise medical imaging solution that is built upon respective technological components. The system answers all these challenges simply by not reinventing wheels, but rather reusing the best "wheels" for the job. Relying on such middleware components allowed us to concentrate on added value for this specific problem domain.
[Research progress of probe design software of oligonucleotide microarrays].
Chen, Xi; Wu, Zaoquan; Liu, Zhengchun
2014-02-01
DNA microarrays have become an essential medical genetic diagnostic tool owing to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose appropriate probe-design software. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
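As a concrete illustration of one of the three criteria, melting temperature, the sketch below computes Tm with the Wallace rule, a standard rough approximation for short oligonucleotides; production probe-design software typically uses more elaborate nearest-neighbor models, and the probe sequence shown is hypothetical.

```python
# Simple sketch of one probe-design criterion from the review: melting
# temperature (Tm). The Wallace rule, Tm = 2*(A+T) + 4*(G+C), is a standard
# rough approximation for short oligonucleotides; real probe-design software
# uses more elaborate nearest-neighbor models. The probe sequence is hypothetical.
def wallace_tm(seq):
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc          # degrees Celsius (approximate)

def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    probe = "ATGCGTACGTTAGCCAGT"     # hypothetical 18-mer probe
    print("Tm ~", wallace_tm(probe), "C, GC =", round(gc_content(probe), 2))
```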
Lou, Jerry J; Andrechak, Gary; Riben, Michael; Yong, William H
2011-01-01
Patient safety initiatives throughout the anatomic laboratory and in biorepository laboratories have mandated increasing emphasis on the need for accurately identifying and tracking biospecimen assets throughout their production lifecycle and for archiving/retrieval purposes. However, increasing production volume along with complex workflow characteristics, reliance on manual production processes, and required asset movement to disparate destinations throughout asset lifecycles continue to challenge laboratory efforts. Radio Frequency Identification (RFID) technology, the use of radio waves to communicate data between electronic tags attached to objects and a reader, shows significant potential to facilitate and overcome these hurdles. Advantages over traditional barcode labeling include readability without direct line-of-sight alignment to the reader, ability to read multiple tags simultaneously, higher data storage capacity, faster data transmission rate, and capacity to perform multiple read-writes of data to the tag. Most importantly, use of radio waves decreases the need to manually scan each asset at each step where an identification or tracking event is needed. Temperature monitoring by on-board sensors and three-dimensional position tracking are additional potential benefits of using RFID technology. To date, barriers to implementation of RFID systems in the anatomic laboratory include increased associated costs of tags and readers, system software, data security concerns, lack of specific data standards for stored information, and potential for technological obsolescence during decades of specimen storage. Novel RFID production techniques and increased production capacity are projected to lower the costs of some tags to a few cents each. Potentially, information security concerns can be addressed by techniques such as shielding, data encryption, and tag pseudonyms. Commitment by stakeholder groups to develop RFID tag data standards for anatomic pathology and biorepository laboratories could avoid or mitigate the "islands of data" dilemma presented by barcode usage, where there are innumerable standards and a consequent paucity of hardware or software "plug and play" interoperability. Work remains to be done to establish the durability and appropriate shielding of individual tag types for use in harsh laboratory environmental conditions and for long-term archival storage. Finally, given the requirements for long-term storage of biospecimen assets, consideration should be given to ways of mitigating data isolation due to eventual technological obsolescence of a particular RFID technology or software.
Rapid assessment of assignments using plagiarism detection software.
Bischoff, Whitney R; Abrego, Patricia C
2011-01-01
Faculty members most often use plagiarism detection software to detect portions of students' written work that have been copied and/or not attributed to their authors. The rise in plagiarism has led to a parallel rise in software products designed to detect plagiarism. Some of these products are configurable for rapid assessment and teaching, as well as for plagiarism detection.
NASA Astrophysics Data System (ADS)
Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario
2010-08-01
The Avionics Software (ASW) in charge of controlling the Low Earth Orbit (LEO) Spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa) is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) providing the on-board control functions. In recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions for promoting abstraction and obtaining more efficient and safe ASW production, with positive implications also for the software validation activities. This paper is dedicated to the characterisation activity performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it has evolved after the introduction of the new design features.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
gPhoton: THE GALEX PHOTON DATA ARCHIVE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Million, Chase; Fleming, Scott W.; Shiao, Bernie
gPhoton is a new database product and software package that enables analysis of GALEX ultraviolet data at the photon level. The project's stand-alone, pure-Python calibration pipeline reproduces the functionality of the original mission pipeline to reduce raw spacecraft data to lists of time-tagged, sky-projected photons, which are then hosted in a publicly available database by the Mikulski Archive at Space Telescope. This database contains approximately 130 terabytes of data describing approximately 1.1 trillion sky-projected events with a timestamp resolution of five milliseconds. A handful of Python and command-line modules serve as a front end to interact with the database and to generate calibrated light curves and images from the photon-level data at user-defined temporal and spatial scales. The gPhoton software and source code are in active development and publicly available under a permissive license. We describe the motivation, design, and implementation of the calibration pipeline, database, and tools, with emphasis on divergence from prior work, as well as challenges created by the large data volume. We summarize the astrometric and photometric performance of gPhoton relative to the original mission pipeline. For a brief example of short time-domain science capabilities enabled by gPhoton, we show new flares from the known M-dwarf flare star CR Draconis. The gPhoton software has permanent object identifiers with the ASCL (ascl:1603.004) and DOI (doi:10.17909/T9CC7G). This paper describes the software as of version v1.27.2.
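A hedged usage example of the light-curve front end follows. The gAperture entry point and its parameters (band, skypos, radius, stepsz, csvfile) are taken from the gPhoton project documentation as best recalled here, so the exact call signature, as well as the coordinates and values shown, should be treated as assumptions.

```python
# Hedged usage example of the photon-level light-curve extraction the abstract
# describes. The gAperture entry point and its parameters are assumed from the
# gPhoton project documentation; treat the signature, coordinates, and values
# below as placeholders rather than a verified recipe.
from gPhoton import gAperture

gAperture(band="NUV",
          skypos=[123.456, -12.345],   # placeholder RA/Dec in degrees
          radius=0.004,                # aperture radius in degrees
          stepsz=30.0,                 # time bin size in seconds
          csvfile="lightcurve.csv")    # calibrated light curve written here
```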
Mendez Astudillo, Jorge; Lau, Lawrence; Tang, Yu-Ting; Moore, Terry
2018-02-14
As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications; one example is the estimation of atmospheric water vapor content from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also gives GNSS users and researchers insight into how the processing algorithms affect PPP ZTD estimation. Observation data from eight whole days at a total of nine International GNSS Service (IGS) tracking stations spread over the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product for the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.
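The comparison against the IGS tropospheric product reduces to aligning the two ZTD time series and reporting the bias and RMS of their differences. The sketch below is not the authors' processing chain; the synthetic series and the units (meters) are placeholders.

```python
# Minimal sketch (not the authors' processing) of the comparison step: difference
# PPP ZTD estimates against IGS tropospheric-product values epoch by epoch and
# report bias and RMS. Input arrays are synthetic placeholders in meters.
import numpy as np

def compare_ztd(ztd_ppp, ztd_igs):
    diff = np.asarray(ztd_ppp) - np.asarray(ztd_igs)
    bias = diff.mean()
    rms = np.sqrt((diff ** 2).mean())
    return bias, rms

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    igs = 2.4 + 0.02 * rng.standard_normal(288)           # synthetic ZTD series (m)
    ppp = igs + 0.005 + 0.008 * rng.standard_normal(288)  # synthetic PPP estimates
    bias, rms = compare_ztd(ppp, igs)
    print(f"bias = {bias*100:.2f} cm, rms = {rms*100:.2f} cm")
```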
AU-FREDI - AUTONOMOUS FREQUENCY DOMAIN IDENTIFICATION
NASA Technical Reports Server (NTRS)
Yam, Y.
1994-01-01
The Autonomous Frequency Domain Identification program, AU-FREDI, is a system of methods, algorithms and software that was developed for the identification of structural dynamic parameters and system transfer function characterization for control of large space platforms and flexible spacecraft. It was validated in the CALTECH/Jet Propulsion Laboratory's Large Spacecraft Control Laboratory. Due to the unique characteristics of this laboratory environment, and the environment-specific nature of many of the software's routines, AU-FREDI should be considered to be a collection of routines which can be modified and reassembled to suit system identification and control experiments on large flexible structures. The AU-FREDI software was originally designed to command plant excitation and handle subsequent input/output data transfer, and to conduct system identification based on the I/O data. Key features of the AU-FREDI methodology are as follows: 1. AU-FREDI has on-line digital filter design to support on-orbit optimal input design and data composition. 2. Data composition of experimental data in overlapping frequency bands overcomes finite actuator power constraints. 3. Recursive least squares sine-dwell estimation accurately handles digitized sinusoids and low frequency modes. 4. The system also includes automated estimation of model order using a product moment matrix. 5. A sample-data transfer function parametrization supports digital control design. 6. Minimum variance estimation is assured with a curve fitting algorithm with iterative reweighting. 7. Robust root solvers accurately factorize high order polynomials to determine frequency and damping estimates. 8. Output error characterization of model additive uncertainty supports robustness analysis. The research objectives associated with AU-FREDI were particularly useful in focusing the identification methodology for realistic on-orbit testing conditions. Rather than estimating the entire structure, as is typically done in ground structural testing, AU-FREDI identifies only the key transfer function parameters and uncertainty bounds that are necessary for on-line design and tuning of robust controllers. AU-FREDI's system identification algorithms are independent of the JPL-LSCL environment, and can easily be extracted and modified for use with input/output data files. The basic approach of AU-FREDI's system identification algorithms is to non-parametrically identify the sampled data in the frequency domain using either stochastic or sine-dwell input, and then to obtain a parametric model of the transfer function by curve-fitting techniques. A cross-spectral analysis of the output error is used to determine the additive uncertainty in the estimated transfer function. The nominal transfer function estimate and the estimate of the associated additive uncertainty can be used for robust control analysis and design. AU-FREDI's I/O data transfer routines are tailored to the environment of the CALTECH/ JPL-LSCL which included a special operating system to interface with the testbed. Input commands for a particular experiment (wideband, narrowband, or sine-dwell) were computed on-line and then issued to respective actuators by the operating system. The operating system also took measurements through displacement sensors and passed them back to the software for storage and off-line processing. 
In order to make use of AU-FREDI's I/O data transfer routines, a user would need to provide an operating system capable of overseeing such functions between the software and the experimental setup at hand. The program documentation contains information designed to support users in either providing such an operating system or modifying the system identification algorithms for use with input/output data files. It provides a history of the theoretical, algorithmic and software development efforts including operating system requirements and listings of some of the various special purpose subroutines which were developed and optimized for Lahey FORTRAN compilers on IBM PC-AT computers before the subroutines were integrated into the system software. Potential purchasers are encouraged to obtain and review the documentation before purchasing the AU-FREDI software. AU-FREDI is distributed in DEC VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard media) or a TK50 tape cartridge. AU-FREDI was developed in 1989 and is a copyrighted work with all copyright vested in NASA.
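The non-parametric identification step described above, estimating the sampled-data transfer function in the frequency domain from stochastic input before curve fitting, is commonly carried out with auto- and cross-spectral densities. The sketch below illustrates that general idea with SciPy routines and an invented discrete-time plant; it is not the AU-FREDI implementation, and all numerical values are assumptions.

```python
import numpy as np
from scipy import signal

def estimate_frf(u, y, fs, nperseg=1024):
    """Non-parametric frequency-response estimate from input/output time histories.

    Uses Welch auto- and cross-spectral densities: H1(f) = Puy(f) / Puu(f).
    The coherence indicates where the estimate can be trusted.
    """
    f, puu = signal.welch(u, fs=fs, nperseg=nperseg)
    _, puy = signal.csd(u, y, fs=fs, nperseg=nperseg)
    _, coh = signal.coherence(u, y, fs=fs, nperseg=nperseg)
    return f, puy / puu, coh

# Hypothetical example: a lightly damped second-order discrete-time plant.
fs = 200.0
u = np.random.randn(int(60 * fs))                 # stochastic (wideband) excitation
y = signal.lfilter([0.05], [1.0, -1.6, 0.65], u)  # assumed plant response
y += 0.01 * np.random.randn(u.size)               # sensor noise
f, H, coh = estimate_frf(u, y, fs)
```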
Software Process Improvement: Supporting the Linking of the Software and the Business Strategies
NASA Astrophysics Data System (ADS)
Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti
The market is becoming more and more competitive; many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal carefully with software, whether developing or acquiring it. One way to take advantage of software in effectively supporting the business is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of using the approach and its results.
Towards understanding software: 15 years in the SEL
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose
1990-01-01
For 15 years, the Software Engineering Laboratory (SEL) at GSFC has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software, and software processes within a production software environment. The SEL comprises three major organizations: (1) the GSFC Flight Dynamics Division; (2) the University of Maryland Computer Science Department; and (3) the Computer Sciences Corporation Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents: all describing some aspect of the software engineering technology that has undergone analysis in the flight dynamics environment. The studies range from small controlled experiments (such as analyzing the effectiveness of code reading versus functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The key findings that NASA feels have laid the foundation for ongoing and future software development and research activities are summarized.
Detection and avoidance of errors in computer software
NASA Technical Reports Server (NTRS)
Kinsler, Les
1989-01-01
The acceptance test errors of a computer software project were examined to determine whether the errors could have been detected or avoided in earlier phases of development. GROAGSS (Gamma Ray Observatory Attitude Ground Support System) was selected as the software project to be examined. The development of the software followed the standard Flight Dynamics Software Development methods. GROAGSS was developed between August 1985 and April 1989. The project is approximately 250,000 lines of code of which approximately 43,000 lines are reused from previous projects. GROAGSS had a total of 1715 Change Report Forms (CRFs) submitted during the entire development and testing. These changes contained 936 errors. Of these 936 errors, 374 were found during the acceptance testing. These acceptance test errors were first categorized into methods of avoidance including: more clearly written requirements; detail review; code reading; structural unit testing; and functional system integration testing. The errors were later broken down in terms of effort to detect and correct, class of error, and probability that the prescribed detection method would be successful. These determinations were based on Software Engineering Laboratory (SEL) documents and interviews with the project programmers. A summary of the results of the categorizations is presented. The results indicate that the number of programming errors present at the beginning of acceptance testing can be significantly reduced. The existing development methodology is examined for possible improvements. A basis is provided for the definition of a new development/testing paradigm. Monitoring of the new scheme will objectively determine its effectiveness in avoiding and detecting errors.
A Legal Guide for the Software Developer.
ERIC Educational Resources Information Center
Minnesota Small Business Assistance Office, St. Paul.
This booklet has been prepared to familiarize the inventor, creator, or developer of a new computer software product or software invention with the basic legal issues involved in developing, protecting, and distributing the software in the United States. Basic types of software protection and related legal matters are discussed in detail,…
NASA Technical Reports Server (NTRS)
Mallasch, Paul G.; Babic, Slavoljub
1994-01-01
The United States Air Force (USAF) provides NASA Lewis Research Center with monthly reports containing the Synchronous Satellite Catalog and the associated Two Line Mean Element Sets. The USAF Synchronous Satellite Catalog supplies satellite orbital parameters collected by an automated monitoring system and provided to Lewis Research Center as text files on magnetic tape. Software was developed to facilitate automated formatting, data normalization, cross-referencing, and error correction of Synchronous Satellite Catalog files before loading into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). This document contains the User's Guide and Software Maintenance Manual with information necessary for installation, initialization, start-up, operation, error recovery, and termination of the software application. It also contains implementation details, modification aids, and software source code adaptations for use in future revisions.
Vaeggemose, Ulla; Ankersen, Pia Vedel; Aagaard, Jørgen; Burau, Viola
2018-01-01
Co-production involves knowledge and skills based on both the lived experiences of citizens and the professional training of staff. In Europe, co-production is viewed as an essential tool for meeting the demographic, political and economic challenges of welfare states. However, co-production is facing challenges because public services and civil society are rooted in two very different logics. These challenges are typically encountered by provider organisations and their staff who must convert policies and strategies into practice. Denmark is a welfare state with a strong public services sector and a relatively low involvement of volunteers. The aim of this study was to investigate how provider organisations and their staff navigate between the two logics. The present analysis is a critical case study of two municipalities selected from seven participating municipalities for their maximum diversity. The study setting was the Community Families programme, which aims to support the social network of mental health users by offering regular contact with selected private families/individuals. The task of the municipalities was to initiate and support Community Families. The analysis built on qualitative data generated at the organisational level in the seven participating municipalities. Within the two "case study" municipalities, qualitative interviews were conducted with front-line co-ordinators (six) and line managers (two). The interviews were recorded, transcribed verbatim and coded using the software program NVivo. The results confirm the central role played by staff and identify a close interplay between public services and civil society logics as essential for the organisation of co-production. Corresponding objectives, activities and collaborative relations of provider organisations are keys for facilitating the co-productive practice of individual staff. Organised in this way, co-production can succeed even in a mental health setting associated with social stigma and in a welfare state dominated by public services. © 2017 John Wiley & Sons Ltd.
Maintaining the Health of Software Monitors
NASA Technical Reports Server (NTRS)
Person, Suzette; Rungta, Neha
2013-01-01
Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.
AOIPS 3 user's guide. Volume 2: Program descriptions
NASA Technical Reports Server (NTRS)
Schotz, Steve S.; Piper, Thomas S.; Negri, Andrew J.
1990-01-01
The Atmospheric and Oceanographic Information Processing System (AOIPS) 3 is the version of the AOIPS software as of April 1989. The AOIPS software was developed jointly by the Goddard Space Flight Center and General Sciences Corporation. A detailed description of every AOIPS program is presented. It is intended to serve as a reference for such items as program functionality, program operational instructions, and input/output variable descriptions. Program descriptions are derived from the on-line help information. Each program description is divided into two sections. The functional description section describes the purpose of the program and contains any pertinent operational information. The program description section lists the program variables as they appear on-line and describes them in detail.
ASERA: A Spectrum Eye Recognition Assistant
NASA Astrophysics Data System (ADS)
Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao; Zhao, Yongheng
2018-04-01
ASERA, ASpectrum Eye Recognition Assistant, aids in quasar spectral recognition and redshift measurement and can also be used to recognize various types of spectra of stars, galaxies and AGNs (Active Galactic Nucleus). This interactive software allows users to visualize observed spectra, superimpose template spectra from the Sloan Digital Sky Survey (SDSS), and interactively access related spectral line information. ASERA is an efficient and user-friendly semi-automated toolkit for the accurate classification of spectra observed by LAMOST (the Large Sky Area Multi-object Fiber Spectroscopic Telescope) and is available as a standalone Java application and as a Java applet. The software offers several functions, including wavelength and flux scale settings, zoom in and out, redshift estimation, and spectral line identification.
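Redshift estimation by matching an observed spectral line to a template rest wavelength reduces to a wavelength ratio. A minimal sketch follows; the example line and wavelength values are illustrative and not taken from ASERA.

```python
def redshift_from_line(observed_wavelength, rest_wavelength):
    """Redshift implied by matching an observed spectral line to its rest wavelength."""
    return observed_wavelength / rest_wavelength - 1.0

# Hypothetical example: Lyman-alpha (rest 1215.67 Angstrom) observed at 4862.7 Angstrom.
z = redshift_from_line(4862.7, 1215.67)
print(f"z = {z:.3f}")   # roughly 3.0
```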
On-line evaluation of multiloop digital controller performance
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.
1993-01-01
The purpose of this presentation is to inform the Guidance and Control community of capabilities which were developed by the Aeroservoelasticity Branch to evaluate the performance of multivariable control laws, on-line, during wind-tunnel testing. The capabilities are generic enough to be useful for all kinds of on-line analyses involving multivariable control in experimental testing. Consequently, it was decided to present this material at this workshop even though it has been presented elsewhere. Topics covered include: essential on-line analysis requirements; on-line analysis capabilities; on-line analysis software; frequency domain procedures; controller performance evaluation; frequency-domain flutter suppression; and plant determination.
On-line 3-dimensional confocal imaging in vivo.
Li, J; Jester, J V; Cavanagh, H D; Black, T D; Petroll, W M
2000-09-01
In vivo confocal microscopy through focusing (CMTF) can provide a 3-D stack of high-resolution corneal images and allows objective measurements of corneal sublayer thickness and backscattering. However, current systems require time-consuming off-line image processing and analysis on multiple software platforms. Furthermore, there is a trade off between the CMTF speed and measurement precision. The purpose of this study was to develop a novel on-line system for in vivo corneal imaging and analysis that overcomes these limitations. A tandem scanning confocal microscope (TSCM) was used for corneal imaging. The TSCM video camera was interfaced directly to a PC image acquisition board to implement real-time digitization. Software was developed to allow in vivo 2-D imaging, CMTF image acquisition, interactive 3-D reconstruction, and analysis of CMTF data to be performed on line in a single user-friendly environment. A procedure was also incorporated to separate the odd/even video fields, thereby doubling the CMTF sampling rate and theoretically improving the precision of CMTF thickness measurements by a factor of two. In vivo corneal examinations of a normal human and a photorefractive keratectomy patient are presented to demonstrate the capabilities of the new system. Improvements in the convenience, speed, and functionality of in vivo CMTF image acquisition, display, and analysis are demonstrated. This is the first full-featured software package designed for in vivo TSCM imaging of the cornea, which performs both 2-D and 3-D image acquisition, display, and processing as well as CMTF analysis. The use of a PC platform and incorporation of easy to use, on line, and interactive features should help to improve the clinical utility of this technology.
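The odd/even field separation mentioned above is essentially video deinterlacing: each interlaced frame contributes two temporally distinct fields. A minimal sketch of that idea follows; the array shapes and values are illustrative, not those of the TSCM system.

```python
import numpy as np

def split_video_fields(frame):
    """Separate an interlaced video frame into its odd and even fields.

    Splitting the fields doubles the temporal sampling rate of a through-focus scan,
    at the cost of halving the vertical resolution of each sample.
    """
    even_field = frame[0::2, :]   # rows 0, 2, 4, ...
    odd_field = frame[1::2, :]    # rows 1, 3, 5, ...
    return odd_field, even_field

# Hypothetical example: a synthetic 480x640 interlaced frame.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
odd, even = split_video_fields(frame)
print(odd.shape, even.shape)      # (240, 640) (240, 640)
```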
Adaptive Optimization of Aircraft Engine Performance Using Neural Networks
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Long, Theresa W.
1995-01-01
Preliminary results are presented on the development of an adaptive neural network based control algorithm to enhance aircraft engine performance. This work builds upon a previous National Aeronautics and Space Administration (NASA) effort known as Performance Seeking Control (PSC). PSC is an adaptive control algorithm which contains a model of the aircraft's propulsion system which is updated on-line to match the operation of the aircraft's actual propulsion system. Information from the on-line model is used to adapt the control system during flight to allow optimal operation of the aircraft's propulsion system (inlet, engine, and nozzle) to improve aircraft engine performance without compromising reliability or operability. Performance Seeking Control has been shown to yield reductions in fuel flow, increases in thrust, and reductions in engine fan turbine inlet temperature. The neural network based adaptive control, like PSC, will contain a model of the propulsion system which will be used to calculate optimal control commands on-line. Hopes are that it will be able to provide some additional benefits above and beyond those of PSC. The PSC algorithm is computationally intensive, it is valid only at near steady-state flight conditions, and it has no way to adapt or learn on-line. These issues are being addressed in the development of the optimal neural controller. Specialized neural network processing hardware is being developed to run the software, the algorithm will be valid at steady-state and transient conditions, and will take advantage of the on-line learning capability of neural networks. Future plans include testing the neural network software and hardware prototype against an aircraft engine simulation. In this paper, the proposed neural network software and hardware is described and preliminary neural network training results are presented.
NASA Technical Reports Server (NTRS)
1993-01-01
Under a NASA Small Business Innovation Research (SBIR) contract, Axiomatics Corporation developed a shunting Dielectric Sensor to determine the nutrient level and analyze plant nutrient solutions in the CELSS, NASA's space life support program. (CELSS is an experimental facility investigating closed-cycle plant growth and food processing for long duration manned missions.) The DiComp system incorporates a shunt electrode and is especially sensitive to changes in the dielectric properties of materials at levels much lower than conventional sensors can detect. The analyzer has exceptional capabilities for predicting composition of liquid streams or reactions. It measures concentrations and solids content up to 100 percent in applications like agricultural products, petrochemicals, food and beverages. The sensor is easily installed; maintenance is low, and it can be calibrated on line. The software automates data collection and analysis.
NASA Astrophysics Data System (ADS)
Upadhyay, Parijat; Dan, Pranab K.
To achieve synergy across product lines, businesses are implementing a set of standard business applications and consistent data definitions across all business units. ERP packages are extremely useful in integrating a global company and provide a "common language" throughout the company. Companies are not only implementing a standardized application but are also moving to a common architecture and infrastructure. For many companies, a standardized software rollout is a good time to do some consolidation of their IT infrastructure across various locations. Companies are also finding that ERP solutions help them get rid of their legacy systems, most of which may not be compliant with modern-day business requirements.
NASA Technical Reports Server (NTRS)
2003-01-01
With NASA on its side, Positive Systems, Inc., of Whitefish, Montana, is veering away from the industry standards defined for producing and processing remotely sensed images. A top developer of imaging products for geographic information system (GIS) and computer-aided design (CAD) applications, Positive Systems is bucking traditional imaging concepts with a cost-effective and time-saving software tool called Digital Images Made Easy (DIME(trademark)). Like piecing a jigsaw puzzle together, DIME can integrate a series of raw aerial or satellite snapshots into a single, seamless panoramic image, known as a 'mosaic.' The 'mosaicked' images serve as useful backdrops to GIS maps - which typically consist of line drawings called 'vectors' - by allowing users to view a multidimensional map that provides substantially more geographic information.
1994-03-25
[OCR-garbled fragment: National Institute of Standards and Technology, Building 225, Room A266, Gaithersburg, Maryland 20899 U.S.A.; Ada Joint Program Office, Director, Computer & Software, David R. Basel; the remaining text on output formatting is not recoverable.]
Coping with Variability in Model-Based Systems Engineering: An Experience in Green Energy
NASA Astrophysics Data System (ADS)
Trujillo, Salvador; Garate, Jose Miguel; Lopez-Herrejon, Roberto Erick; Mendialdua, Xabier; Rosado, Albert; Egyed, Alexander; Krueger, Charles W.; de Sosa, Josune
Model-Based Systems Engineering (MBSE) is an emerging engineering discipline whose driving motivation is to provide support throughout the entire system life cycle. MBSE not only addresses the engineering of software systems but also their interplay with physical systems. Quite frequently, successful systems need to be customized to cater for the concrete and specific needs of customers, end-users, and other stakeholders. To effectively meet this demand, it is vital to have in place mechanisms to cope with the variability, the capacity to change, that such customization requires. In this paper we describe our experience in modeling variability using SysML, a leading MBSE language, for developing a product line of wind turbine systems used for the generation of electricity.
DATALINK: Records inventory data collection software. User`s guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, B.A.
1995-03-01
DATALINK was created to provide an easy-to-use data collection program for records management software products. It provides several useful tools for capturing and validating record index data in the field. It also allows users to easily create a comma-delimited, ASCII text file for data export into most records management software products. It runs on virtually any computer using MS-DOS.
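The comma-delimited ASCII export described above is the kind of operation shown in the sketch below; the field names are purely illustrative and not DATALINK's actual schema.

```python
import csv

def export_records(records, path):
    """Write record index data to a comma-delimited ASCII file.

    `records` is a list of dictionaries; the field names below are purely
    illustrative, not a real records management schema.
    """
    fieldnames = ["box_number", "record_title", "date_range", "location"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)

# Hypothetical usage with one invented record.
export_records(
    [{"box_number": "001", "record_title": "Invoices", "date_range": "1990-1992",
      "location": "Warehouse A"}],
    "inventory_export.txt",
)
```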
NASA Technical Reports Server (NTRS)
Wolf, S. W. D.; Goodyer, M. J.
1982-01-01
Operation of the Transonic Self-Streamlining Wind Tunnel (TSWT) involved on-line data acquisition with automatic wall adjustment. A tunnel run consisted of streamlining the walls from known starting contours in iterative steps and acquiring model data. Each run performs what is described as a streamlining cycle. The associated software is presented.
ERIC Educational Resources Information Center
Kabaca, Tolga
2013-01-01
Solution set of any inequality or compound inequality, which has one-variable, lies in the real line which is one dimensional. So a difficulty appears when computer assisted graphical representation is intended to use for teaching these topics. Sketching a one-dimensional graph by using computer software is not a straightforward work. In this…
Processing Raman Spectra of High-Pressure Hydrogen Flames
NASA Technical Reports Server (NTRS)
Nguyen, Quang-Viet; Kojima, Jun
2006-01-01
The Raman Code automates the analysis of laser-Raman-spectroscopy data for diagnosis of combustion at high pressure. On the basis of the theory of molecular spectroscopy, the software calculates the rovibrational and pure rotational Raman spectra of H2, O2, N2, and H2O in hydrogen/air flames at given temperatures and pressures. Given a set of Raman spectral data from measurements on a given flame and results from the aforementioned calculations, the software calculates the thermodynamic temperature and number densities of the aforementioned species. The software accounts for collisional spectral-line-broadening effects at pressures up to 60 bar (6 MPa). The line-broadening effects increase with pressure and thereby complicate the analysis. The software also corrects for spectral interference ("cross-talk") among the various chemical species. In the absence of such correction, the cross-talk is a significant source of error in temperatures and number densities. This is the first known comprehensive computer code that, when used in conjunction with a spectral calibration database, can process Raman-scattering spectral data from high-pressure hydrogen/air flames to obtain temperatures accurate to within 10 K and chemical-species number densities accurate to within 2 percent.
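One way to view the temperature inference described above is as matching the measured spectrum against a library of theoretical spectra computed at candidate temperatures. The sketch below shows that idea in a generic least-squares form with made-up Gaussian "spectra"; it is not the Raman Code's actual algorithm, and all values are assumptions.

```python
import numpy as np

def fit_temperature(measured, simulated_library, temperatures):
    """Pick the library temperature whose simulated spectrum best matches the data.

    `simulated_library` has one row per candidate temperature; each row and the
    measured spectrum are sampled on the same wavenumber grid.  A scale factor
    is fitted per candidate so that only the spectral shape matters.
    """
    residuals = []
    for spectrum in simulated_library:
        scale = np.dot(spectrum, measured) / np.dot(spectrum, spectrum)
        residuals.append(np.sum((measured - scale * spectrum) ** 2))
    return temperatures[int(np.argmin(residuals))]

# Hypothetical example with invented Gaussian "spectra" at three candidate temperatures.
grid = np.linspace(0, 1, 200)
temps = np.array([1200.0, 1500.0, 1800.0])            # K
library = np.array([np.exp(-((grid - 0.4 - 1e-4 * T) ** 2) / 0.01) for T in temps])
measured = 2.0 * library[1] + 0.01 * np.random.randn(grid.size)
print(fit_temperature(measured, library, temps))       # expected: 1500.0
```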
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Yusof, Muhammad Mat
2016-08-01
This paper reports the effect of proposed software product features on the satisfaction and dissatisfaction of potential customers of proposed software products. The Kano model's functional and dysfunctional technique was used along with Berger et al.'s customer satisfaction coefficients. The results show that only two features had the greatest influence on the satisfaction and dissatisfaction of would-be customers of the proposed software product. Attractive and one-dimensional features had the highest impact on customer satisfaction and dissatisfaction. This result will benefit requirements analysts, developers, designers, and project and sales managers in preparing for proposed products. Additional analysis showed that the Kano model's satisfaction and dissatisfaction scores were highly related to Park et al.'s average satisfaction coefficient (r=96%), implying that these variables can be used interchangeably to elicit customer satisfaction. Furthermore, the average satisfaction coefficients and the satisfaction and dissatisfaction indexes were all positively and linearly correlated.
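The customer satisfaction and dissatisfaction coefficients referred to above are commonly computed from the counts of Kano category classifications. The sketch below uses the form usually attributed to Berger et al., CS = (A+O)/(A+O+M+I) and DS = -(O+M)/(A+O+M+I); the respondent counts are invented for illustration.

```python
def satisfaction_coefficients(attractive, one_dimensional, must_be, indifferent):
    """Customer satisfaction / dissatisfaction coefficients from Kano category counts.

    Inputs are the numbers of respondents classifying a feature as Attractive (A),
    One-dimensional (O), Must-be (M) or Indifferent (I).
    """
    total = attractive + one_dimensional + must_be + indifferent
    cs = (attractive + one_dimensional) / total          # satisfaction index, 0..1
    ds = -(one_dimensional + must_be) / total            # dissatisfaction index, -1..0
    return cs, ds

# Hypothetical example: 40 respondents classifying one proposed feature.
cs, ds = satisfaction_coefficients(attractive=18, one_dimensional=12, must_be=6, indifferent=4)
print(f"CS = {cs:.2f}, DS = {ds:.2f}")   # CS = 0.75, DS = -0.45
```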
The effect of advertising in clinical software on general practitioners' prescribing behaviour.
Henderson, Joan; Miller, Graeme; Pan, Ying; Britt, Helena
2008-01-07
To assess the effect of pharmaceutical advertising embedded in clinical software on the prescribing behaviour of general practitioners. Secondary analysis of data from a random sample of 1336 Australian GPs who participated in Bettering the Evaluation and Care of Health, a national continuous cross-sectional survey of general practice activity, between November 2003 and March 2005. The prescribing behaviour of participants who used the advertising software was compared with that of participants who did not, for seven pharmaceutical products advertised continually throughout the study period. Prescription for advertised product as a proportion (%) of prescriptions for all pharmaceutical products in the same generic class or group. GP age, practice location, accreditation status, patient bulk-billing status and hours worked were significantly associated (P < 0.05) with use of advertising software. We found no significant differences, either before or after adjustment for these confounders, in the prescribing rate of Lipitor (adjusted odds ratio [AOR], 0.90; P = 0.26); Micardis (AOR, 0.98; P = 0.91); Mobic (AOR, 1.02; P = 0.89); Norvasc (AOR, 1.02; P = 0.91); Natrilix (AOR, 0.80; P = 0.32); or Zanidip (AOR, 0.88; P = 0.47). GPs using advertising software prescribed Nexium significantly less often than those not using advertising software (AOR, 0.78; P = 0.02). When all advertised products were combined and compared with products that were not advertised, no difference in the overall prescribing behaviour was demonstrated (AOR, 0.96; P = 0.42). Exposure to advertisements in clinical software has little influence on the prescribing behaviour of GPs.
Exploring Foundation Concepts in Introductory Statistics Using Dynamic Data Points
ERIC Educational Resources Information Center
Ekol, George
2015-01-01
This paper analyses introductory statistics students' verbal and gestural expressions as they interacted with a dynamic sketch (DS) designed using "Sketchpad" software. The DS involved numeric data points built on the number line whose values changed as the points were dragged along the number line. The study is framed on aggregate…
Final report on fiscal year 1992 activities for the environmental monitors line-loss study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenoyer, J.L.
The work on this Environmental Monitors Line-Loss Study was performed under Contract Numbers MLW-SVV-073750 and MFH-SVV-207554. Work on the task was initiated mid-December 1991, and this report documents and summarizes the work performed through January 18, 1993. The sections included in this report summarize the work performed on the Environmental Monitors Line-Loss Study. They are arranged to reflect individual sub-tasks and include: descriptions of measurement systems and procedures used to obtain cascade impactor samples and laser spectrometer measurements from multiple stacks and locations; information on data acquisition, analyses, assessment, and software; discussion of the analyses and measurement results from the cascade impactor and laser spectrometer systems and software used; discussion on the development of general test methods and procedures for line-loss determinations; and an overall summary and specific conclusions that can be made with regard to efforts performed on this task during FY 1992 and FY 1993. Supporting information for these sections is included in this report as appendices.
AIRS Maps from Space Processing Software
NASA Technical Reports Server (NTRS)
Thompson, Charles K.; Licata, Stephen J.
2012-01-01
This software package processes Atmospheric Infrared Sounder (AIRS) Level 2 swath standard product geophysical parameters, and generates global, colorized, annotated maps. It automatically generates daily and multi-day averaged colorized and annotated maps of various AIRS Level 2 swath geophysical parameters. It also generates AIRS input data sets for Eyes on Earth, Puffer-sphere, and Magic Planet. This program is tailored to AIRS Level 2 data products. It re-projects data into 1/4-degree grids that can be combined and averaged for any number of days. The software scales and colorizes global grids utilizing AIRS-specific color tables, and annotates images with title and color bar. This software can be tailored for use with other swath data products for the purposes of visualization.
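Re-projecting swath samples into a regular 1/4-degree grid amounts to binning and averaging. A minimal, illustrative sketch follows; the variable being gridded and the sample values are assumptions, not AIRS data.

```python
import numpy as np

def grid_swath(lats, lons, values, cell_deg=0.25):
    """Average swath samples into a regular latitude/longitude grid.

    Returns the gridded mean and the per-cell sample count so that several
    days of grids can later be combined into multi-day averages.
    """
    nlat, nlon = int(180 / cell_deg), int(360 / cell_deg)
    total = np.zeros((nlat, nlon))
    count = np.zeros((nlat, nlon))
    rows = ((np.asarray(lats) + 90.0) / cell_deg).astype(int).clip(0, nlat - 1)
    cols = ((np.asarray(lons) + 180.0) / cell_deg).astype(int).clip(0, nlon - 1)
    np.add.at(total, (rows, cols), values)
    np.add.at(count, (rows, cols), 1)
    with np.errstate(invalid="ignore", divide="ignore"):
        mean = np.where(count > 0, total / count, np.nan)
    return mean, count

# Hypothetical example: a handful of invented swath samples of surface temperature (K).
lats = np.array([10.1, 10.2, -33.6])
lons = np.array([100.4, 100.4, 151.2])
vals = np.array([299.5, 300.1, 288.0])
grid, n = grid_swath(lats, lons, vals)
```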
Requirements model for an e-Health awareness portal
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.
2016-08-01
Requirements engineering is at the heart and foundation of the software engineering process. Poor-quality requirements inevitably lead to poor-quality software solutions. Also, poor requirements modeling is tantamount to designing a poor-quality product. Quality-assured requirements development therefore goes hand in hand with usable products, giving the software product the quality it demands. In light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with careful attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help in the fulfillment of the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.
Code of Federal Regulations, 2010 CFR
2010-10-01
... DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software Agreements 208... services, such as software maintenance, in accordance with the DoD Enterprise Software Initiative (ESI... software and related services. ESI does not dictate the products or services to be acquired. ...
ERIC Educational Resources Information Center
Cibbarelli, Pamela
1996-01-01
Examines library automation product introductions and conversions to new operating systems. Compares user satisfaction ratings of the following library software packages: DOS/Windows, UNIX, Macintosh, and DEC VAX/VMS. Software is rated according to documentation, service/support, training, product reliability, product capabilities, ease of use,…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-10
... Certain GPS Navigation Products, Components Thereof, and Related Software, DN 2814; the Commission is... importation of certain GPS navigation products, components thereof, and related software. The complaint names...
Building an experience factory for maintenance
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.; Briand, Lionel; Kim, Yong-Mi; Basili, Victor R.
1994-01-01
This paper reports the preliminary results of a study of the software maintenance process in the Flight Dynamics Division (FDD) of the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). This study is being conducted by the Software Engineering Laboratory (SEL), a research organization sponsored by the Software Engineering Branch of the FDD, which investigates the effectiveness of software engineering technologies when applied to the development of applications software. This software maintenance study began in October 1993 and is being conducted using the Quality Improvement Paradigm (QIP), a process improvement strategy based on three iterative steps: understanding, assessing, and packaging. The preliminary results represent the outcome of the understanding phase, during which SEL researchers characterized the maintenance environment, product, and process. Findings indicate that a combination of quantitative and qualitative analysis is effective for studying the software maintenance process, that additional measures should be collected for maintenance (as opposed to new development), and that characteristics such as effort, error rate, and productivity are best considered on a 'release' basis rather than on a project basis. The research thus far has documented some basic differences between new development and software maintenance. It lays the foundation for further application of the QIP to investigate means of improving the maintenance process and product in the FDD.
NASA Astrophysics Data System (ADS)
Preradović, D. M.; Mićić, Lj S.; Barz, C.
2017-05-01
Production conditions in today's world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual measures of quality, companies today are focused on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), there is still a significant number of projects, at the global level, that fail, having not achieved their goals within budget or timeframe. This paper examines the role of software tools in the rate of success of projects implemented in the case of internationally manufactured electrical equipment. The results of this research show how much the project management software used to manage and develop new products contributes to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and the share of successfully completed projects.
Implementation of interconnect simulation tools in spice
NASA Technical Reports Server (NTRS)
Satsangi, H.; Schutt-Aine, J. E.
1993-01-01
Accurate computer simulation of high speed digital computer circuits and communication circuits requires a multimode approach to simulate both the devices and the interconnects between devices. Classical circuit analysis algorithms (lumped parameter) are needed for circuit devices and the network formed by the interconnected devices. The interconnects, however, have to be modeled as transmission lines which incorporate electromagnetic field analysis. An approach to writing a multimode simulator is to take an existing software package which performs either lumped parameter analysis or field analysis and add the missing type of analysis routines to the package. In this work a traditionally lumped parameter simulator, SPICE, is modified so that it will perform lossy transmission line analysis using a different model approach. Modifying SPICE3E2 or any other large software package is not a trivial task. An understanding of the programming conventions used, simulation software, and simulation algorithms is required. This thesis was written to clarify the procedure for installing a device into SPICE3E2. The installation of three devices is documented and the installations of the first two provide a foundation for installation of the lossy line which is the third device. The details of the discussion are specific to SPICE, but the concepts will be helpful when performing installations into other circuit analysis packages.
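For context, the transmission-line view of an interconnect is usually captured by the per-unit-length RLGC parameters and the telegrapher's equations. The sketch below computes the characteristic impedance and propagation constant from assumed per-unit-length values; it is a generic illustration, not SPICE's internal lossy-line formulation.

```python
import numpy as np

def line_parameters(R, L, G, C, freq_hz):
    """Characteristic impedance and propagation constant of a uniform lossy line.

    R, L, G, C are per-unit-length resistance, inductance, conductance and
    capacitance; the standard telegrapher's-equation relations are used.
    """
    w = 2 * np.pi * freq_hz
    z = R + 1j * w * L           # series impedance per unit length
    y = G + 1j * w * C           # shunt admittance per unit length
    z0 = np.sqrt(z / y)          # characteristic impedance
    gamma = np.sqrt(z * y)       # propagation constant (alpha + j*beta)
    return z0, gamma

# Hypothetical per-unit-length values for a board trace, evaluated at 1 GHz.
z0, gamma = line_parameters(R=5.0, L=3.0e-7, G=1e-4, C=1.2e-10, freq_hz=1e9)
print(abs(z0), gamma.real)       # |Z0| (ohms, near 50 here) and attenuation (Np/m)
```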
Preliminary In-Flight Loads Analysis of In-Line Launch Vehicles using the VLOADS 1.4 Program
NASA Technical Reports Server (NTRS)
Graham, J. B.; Luz, P. L.
1998-01-01
To calculate structural loads of in-line launch vehicles for preliminary design, a very useful computer program is VLOADS 1.4. This software may also be used to calculate structural loads for upper stages and planetary transfer vehicles. Launch vehicle inputs such as aerodynamic coefficients, mass properties, propellants, engine thrusts, and performance data are compiled and analyzed by VLOADS to produce distributed shear loads, bending moments, axial forces, and vehicle line loads as a function of X-station along the vehicle's length. Interface loads, if any, and translational accelerations are also computed. The major strength of the software is that it enables quick turnaround analysis of structural loads for launch vehicles during the preliminary design stage of development. This represents a significant improvement over the alternative: the time-consuming and expensive chore of developing finite element models. VLOADS was developed as a Visual BASIC macro in a Microsoft Excel 5.0 work book on a Macintosh. VLOADS has also been implemented on a PC computer using Microsoft Excel 7.0a for Windows 95. VLOADS was developed in 1996, and the current version was released to COSMIC, NASA's Software Technology Transfer Center, in 1997. The program is a copyrighted work with all copyright vested in NASA.
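The distributed shear loads and bending moments mentioned above follow from successive integration of the running load along the vehicle axis. The sketch below is a greatly simplified illustration of that step for a free end at the nose, with an invented triangular load; VLOADS itself also accounts for mass properties, engine thrust, and other inputs.

```python
import numpy as np

def shear_and_moment(x, distributed_load):
    """Shear force and bending moment from a distributed lateral load w(x).

    `x` is the station coordinate along the vehicle (m) and `distributed_load`
    the lateral load per unit length (N/m); trapezoidal integration runs from
    the nose aft, with zero shear and moment at the free end x[0].
    """
    dx = np.diff(x)
    shear = np.concatenate(
        ([0.0], np.cumsum(0.5 * (distributed_load[1:] + distributed_load[:-1]) * dx)))
    moment = np.concatenate(
        ([0.0], np.cumsum(0.5 * (shear[1:] + shear[:-1]) * dx)))
    return shear, moment

# Hypothetical example: a crude triangular aerodynamic load over a 30 m vehicle.
x = np.linspace(0.0, 30.0, 31)
w = np.interp(x, [0.0, 10.0, 30.0], [0.0, 2000.0, 500.0])   # N/m
V, M = shear_and_moment(x, w)
```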
ERIC Educational Resources Information Center
Ichu, Emmanuel A.
2010-01-01
Software quality is perhaps one of the most sought-after attributes in product development; however, this goal often goes unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…
ERIC Educational Resources Information Center
Mitchell, Susan Marie
2012-01-01
Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…
Real-Time Multimission Event Notification System for Mars Relay
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Wang, Paul; Hy, Franklin H.
2013-01-01
As the Mars Relay Network is in constant flux (missions and teams going through their daily workflow), it is imperative that users are aware of such state changes. For example, a change by an orbiter team can affect operations on a lander team. This software provides an ambient view of the real-time status of the Mars network. The Mars Relay Operations Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay Network. As part of MaROS, a feature set was developed that operates on several levels of the software architecture. These levels include a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as it is received from the network. The result is a real-time event notification and management system, so mission teams can track and act upon events on a moment-by-moment basis. This software retrieves events from MaROS and displays them to the end user. Updates happen in real time, i.e., messages are pushed to the user while logged into the system, and queued when the user is not online for later viewing. The software does not do away with the email notifications, but augments them with in-line notifications. Further, this software expands the events that can generate a notification, and allows user-generated notifications. Existing software sends a smaller subset of mission-generated notifications via email. A common complaint of users was that the system-generated e-mails often "get lost" with other e-mail that comes in. This software allows for an expanded set (including user-generated) of notifications displayed in-line of the program. By separating notifications, this can improve a user's workflow.
van Beek, J; Haanperä, M; Smit, P W; Mentula, S; Soini, H
2018-04-11
Culture-based assays are currently the reference standard for drug susceptibility testing for Mycobacterium tuberculosis. They provide good sensitivity and specificity but are time consuming. The objective of this study was to evaluate whether whole genome sequencing (WGS), combined with software tools for data analysis, can replace routine culture-based assays for drug susceptibility testing of M. tuberculosis. M. tuberculosis cultures sent to the Finnish mycobacterial reference laboratory in 2014 (n = 211) were phenotypically tested by Mycobacteria Growth Indicator Tube (MGIT) for first-line drug susceptibilities. WGS was performed for all isolates using the Illumina MiSeq system, and data were analysed using five software tools (PhyResSE, Mykrobe Predictor, TB Profiler, TGS-TB and KvarQ). Diagnostic time and reagent costs were estimated for both methods. The sensitivity of the five software tools to predict any resistance among strains was almost identical, ranging from 74% to 80%, and specificity was more than 95% for all software tools except for TGS-TB. The sensitivity and specificity to predict resistance to individual drugs varied considerably among the software tools. Reagent costs for MGIT and WGS were €26 and €143 per isolate respectively. Turnaround time for MGIT was 19 days (range 10-50 days) for first-line drugs, and turnaround time for WGS was estimated to be 5 days (range 3-7 days). WGS could be used as a prescreening assay for drug susceptibility testing with confirmation of resistant strains by MGIT. The functionality and ease of use of the software tools need to be improved. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Glass, B. J.; Hack, E. C.
1990-01-01
A knowledge-based control system for real-time control and fault detection, isolation and recovery (FDIR) of a prototype two-phase Space Station Freedom external thermal control system (TCS) is discussed in this paper. The Thermal Expert System (TEXSYS) has been demonstrated in recent tests to be capable of both fault anticipation and detection and real-time control of the thermal bus. Performance requirements were achieved by using a symbolic control approach, layering model-based expert system software on a conventional numerical data acquisition and control system. The model-based capabilities of TEXSYS were shown to be advantageous during software development and testing. One representative example is given from on-line TCS tests of TEXSYS. The integration and testing of TEXSYS with a live TCS testbed provides some insight on the use of formal software design, development and documentation methodologies to qualify knowledge-based systems for on-line or flight applications.
Unit Testing for the Application Control Language (ACL) Software
NASA Technical Reports Server (NTRS)
Heinich, Christina Marie
2014-01-01
In the software development process, code needs to be tested before it can be packaged for release in order to make sure the program actually does what it is supposed to do, as well as to check how the program deals with errors and edge cases (such as negative or very large numbers). One of the major parts of the testing process is unit testing, where you test specific units of the code to make sure each individual part of the code works. This project is about unit testing many different components of the ACL software and fixing any errors encountered. To do this, mocks of other objects need to be created and every line of code needs to be exercised to make sure every case is accounted for. Mocks are important to make because they give direct control of the environment the unit lives in instead of attempting to work with the entire program. This makes it easier to achieve the second goal of exercising every line of code.
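A minimal illustration of unit testing with a mock is sketched below; the function, the transport object, and the "ACK" reply are invented for the example and are not part of the ACL software.

```python
import unittest
from unittest.mock import MagicMock

def issue_command(transport, command):
    """Send a command through a transport and report whether it was acknowledged."""
    if not command:
        raise ValueError("empty command")
    reply = transport.send(command)
    return reply == "ACK"

class IssueCommandTest(unittest.TestCase):
    def test_acknowledged_command(self):
        transport = MagicMock()
        transport.send.return_value = "ACK"        # mock stands in for the real link
        self.assertTrue(issue_command(transport, "PING"))
        transport.send.assert_called_once_with("PING")

    def test_empty_command_is_rejected(self):      # edge case exercised explicitly
        with self.assertRaises(ValueError):
            issue_command(MagicMock(), "")

if __name__ == "__main__":
    unittest.main()
```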
Evaluation of Optical Disk Jukebox Software.
ERIC Educational Resources Information Center
Ranade, Sanjay; Yee, Fonald
1989-01-01
Discusses software that is used to drive and access optical disk jukeboxes, which are used for data storage. Categories of the software are described, user categories are explained, the design of implementation approaches is discussed, and representative software products are reviewed. (eight references) (LRW)
Automated Flight Dynamics Product Generation for the EOS AM-1 Spacecraft
NASA Technical Reports Server (NTRS)
Matusow, Carla
1999-01-01
As part of NASA's Earth Science Enterprise, the Earth Observing System (EOS) AM-1 spacecraft is designed to monitor long-term, global, environmental changes. Because of the complexity of the AM-1 spacecraft, the mission operations center requires more than 80 distinct flight dynamics products (reports). To create these products, the AM-1 Flight Dynamics Team (FDT) will use a combination of modified commercial software packages (e.g., Analytical Graphic's Satellite ToolKit) and NASA-developed software applications. While providing the most cost-effective solution to meeting the mission requirements, the integration of these software applications raises several operational concerns: (1) Routine product generation requires knowledge of multiple applications executing on variety of hardware platforms. (2) Generating products is a highly interactive process requiring a user to interact with each application multiple times to generate each product. (3) Routine product generation requires several hours to complete. (4) User interaction with each application introduces the potential for errors, since users are required to manually enter filenames and input parameters as well as run applications in the correct sequence. Generating products requires some level of flight dynamics expertise to determine the appropriate inputs and sequencing. To address these issues, the FDT developed an automation software tool called AutoProducts, which runs on a single hardware platform and provides all necessary coordination and communication among the various flight dynamics software applications. AutoProducts, autonomously retrieves necessary files, sequences and executes applications with correct input parameters, and deliver the final flight dynamics products to the appropriate customers. Although AutoProducts will normally generate pre-programmed sets of routine products, its graphical interface allows for easy configuration of customized and one-of-a-kind products. Additionally, AutoProducts has been designed as a mission-independent tool, and can be easily reconfigured to support other missions or incorporate new flight dynamics software packages. After the AM-1 launch, AutoProducts will run automatically at pre-determined time intervals . The AutoProducts tool reduces many of the concerns associated with the flight dynamics product generation. Although AutoProducts required a significant effort to develop because of the complexity of the interfaces involved, its use will provide significant cost savings through reduced operator time and maximum product reliability. In addition, user satisfaction is significantly improved and flight dynamics experts have more time to perform valuable analysis work. This paper will describe the evolution of the AutoProducts tool, highlighting the cost savings and customer satisfaction resulting from its development. It will also provide details about the tool including its graphical interface and operational capabilities.
NASA Astrophysics Data System (ADS)
Ismail, M. A. M.; Kumar, N. S.; Abidin, M. H. Z.; Madun, A.
2018-04-01
This study presents a systematic approach to photogrammetric survey that is applicable to the extraction of elevation data for geophysical surveys in hilly terrain using Unmanned Aerial Vehicles (UAVs). The outcome is the acquisition of high-quality geophysical data from areas where elevations vary, by locating the best survey lines. The study area is located at the proposed construction site for the development of a water reservoir and related infrastructure in Kampus Pauh Putra, Universiti Malaysia Perlis. Seismic refraction surveys were carried out for modelling of the subsurface for detailed site investigations. A study was carried out to assess the accuracy of the digital elevation model (DEM) produced from a UAV. At 100 m altitude (flying height), over 135 overlapping images were acquired using a DJI Phantom 3 quadcopter. All acquired images were processed for automatic 3D photo-reconstruction using Agisoft PhotoScan digital photogrammetric software, which was applied to all photogrammetric stages. The products generated included a 3D model, dense point cloud, mesh surface, digital orthophoto, and DEM. To validate the accuracy of the produced DEM, the coordinates of the selected ground control points (GCPs) along the survey line in the imaging area were extracted from the generated DEM with the aid of Global Mapper software. These coordinates were compared with the GCPs obtained using a real-time kinematic global positioning system. The maximum percentage difference between the GCPs and the photogrammetric survey is 13.3%. UAVs are suitable for acquiring elevation data for geophysical surveys, which can save time and cost.
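The reported accuracy check amounts to comparing DEM-extracted elevations with RTK-GPS elevations at the ground control points. The sketch below shows such a comparison with invented elevation values; it is not the computation performed in the study's software.

```python
import numpy as np

def elevation_percent_difference(dem_elevations, rtk_elevations):
    """Percentage difference of DEM-extracted elevations relative to RTK-GPS values."""
    dem = np.asarray(dem_elevations, dtype=float)
    rtk = np.asarray(rtk_elevations, dtype=float)
    return 100.0 * np.abs(dem - rtk) / rtk

# Hypothetical elevations (m) at a few ground control points along a survey line.
dem = [52.1, 48.7, 45.9, 41.2]
rtk = [51.4, 49.5, 46.8, 40.1]
diff = elevation_percent_difference(dem, rtk)
print(diff.max())   # maximum percentage difference across the GCPs
```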
Product-based Safety Certification for Medical Devices Embedded Software.
Neto, José Augusto; Figueiredo Damásio, Jemerson; Monthaler, Paul; Morais, Misael
2015-01-01
Worldwide medical device embedded software certification practices are currently focused on manufacturing best practices. In Brazil, the national regulatory agency does not hold a local certification process for software-intensive medical devices and admits international certification (e.g. FDA and CE) from local and international industry to operate in the Brazilian health care market. We present here a product-based certification process as a candidate process to support the Brazilian regulatory agency ANVISA in medical device software regulation. Center of Strategic Technology for Healthcare (NUTES) medical device embedded software certification is based on a solid safety quality model and has been tested with reasonable success against the Class I risk device Generic Infusion Pump (GIP).
Using commercial software products for atmospheric remote sensing
NASA Astrophysics Data System (ADS)
Kristl, Joseph A.; Tibaudo, Cheryl; Tang, Kuilian; Schroeder, John W.
2002-02-01
The Ontar Corporation (www.Ontar.com) has developed several products for atmospheric remote sensing to calculate radiative transport, atmospheric transmission, and sensor performance in both the normal atmosphere and the atmosphere disturbed by battlefield conditions of smoke, dust, explosives and turbulence. These products include:
PcModWin: Uses the USAF standard MODTRAN model to compute the atmospheric transmission and radiance at medium spectral resolution (2 cm-1) from the ultraviolet/visible into the infrared and microwave regions of the spectrum. It can be used for any geometry and atmospheric conditions such as aerosols, clouds and rain.
PcLnWin: Uses the USAF standard FASCOD model to compute atmospheric transmission and emission at high (line-by-line) spectral resolution using the HITRAN 2000 database. It can be used over the same spectrum from the UV/visible into the infrared and microwave regions of the spectrum.
HitranPC: Computes the absolute high (line-by-line) spectral resolution transmission spectrum of the atmosphere for different temperatures and pressures. HitranPC is a user-friendly program developed by the University of South Florida (USF) and uses the international standard molecular spectroscopic database, HITRAN.
LidarPC: A computer program to calculate the Laser Radar/Lidar Equation for hard targets and atmospheric backscatter using manually input atmospheric parameters or HitranPC and BETASPEC - transmission and backscatter calculations of the atmosphere. Also developed by the University of South Florida (USF).
PcEosael: A library of programs that mathematically describe aspects of electromagnetic propagation in battlefield environments. The 25 modules are connected but can be exercised individually. Covers eight general categories of atmospheric effects, including gases, aerosols and laser propagation. Based on codes developed by the Army Research Lab.
NVTherm: Models parallel scan, serial scan, and staring thermal imagers that operate in the mid and far infrared spectral bands (3 to 12 micrometers wavelength). It predicts the Minimum Resolvable Temperature Difference (MRTD, or just MRT) that can be discriminated by a human when using a thermal imager. NVTherm also predicts the target acquisition range performance likely to be achieved using the sensor.
The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.
Railroads and Riddles Highlight New Software.
ERIC Educational Resources Information Center
Kinnamon, J. C.
1988-01-01
Six software products are reviewed including multimedia packages for history/geography and science. Other products include a coloring program, riddle-maker, word puzzle generator, a lesson on counting money, and a math game equipped with animation and sound effects. (IAH)
Putting Safety in the Software
NASA Technical Reports Server (NTRS)
Wetherholt, Martha S.; Berens, Kalynnda M.; Hardy, Sandra (Technical Monitor)
2001-01-01
Software is a vital component of nearly every piece of modern technology. It is not a 'sub-system', able to be separated out from the system as a whole, but a 'co-system' that controls, manipulates, or interacts with the hardware and with the end user. Software has its fingers into all the pieces of the pie. If that 'pie', the system, can lead to injury, death, loss of major equipment, or impact your business bottom line, then software safety becomes vitally important. Learning to think about software from a safety perspective is the focus of this paper. We want you to think of software as part of the safety critical system, a major part. This requires 'system thinking' - being able to grasp the whole picture. Software's contribution to modern technology is both good and potentially bad. Software allows more complex and useful devices to be built. It can also contribute to plane crashes and power outages. We want you to see software in a whole new light, see it as a contributor to system hazards, and also as a possible fix or mitigation to some of those hazards.
Lossef, S V; Schwartz, L H
1990-09-01
A computerized reference system for radiology journal articles was developed by using an IBM-compatible personal computer with a hand-held optical scanner and optical character recognition software. This allows direct entry of scanned text from printed material into word processing or data-base files. Additionally, line diagrams and photographs of radiographs can be incorporated into these files. A text search and retrieval software program enables rapid searching for keywords in scanned documents. The hand scanner and software programs are commercially available, relatively inexpensive, and easily used. This permits construction of a personalized radiology literature file of readily accessible text and images requiring minimal typing or keystroke entry.
Electric Vehicle Communication Standards Testing and Validation Phase I: SAE J2847/1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratt, Richard M.; Tuffner, Francis K.; Gowri, Krishnan
Vehicle-to-grid communication standards are critical to charge management and interoperability among vehicles, charging stations, and utility providers. Several standards initiatives by the Society of Automotive Engineers (SAE), the International Organization for Standardization and International Electrotechnical Commission (ISO/IEC), and the ZigBee/HomePlug Alliance are developing requirements for communication messages and protocols. Although standards development has been in progress for more than two years, no definitive guidelines are available for automobile manufacturers, charging station manufacturers, and utility backhaul network systems. At present, a wide range of proprietary communication options has been developed and is supported in the industry. Recent work by the Electric Power Research Institute (EPRI), in collaboration with SAE and automobile manufacturers, has identified performance requirements and a test plan based on possible communication pathways using power line communication over the control pilot and the mains. Although the communication pathways and power line communication technology options have been identified, much work remains in developing application software and testing communication modules before these can be deployed in production vehicles. This report presents a test plan and results from initial testing of two power line communication modules developed to meet the requirements of the SAE J2847/1 standard.
The influence of computer-generated path on the robot’s effector stability of motion
NASA Astrophysics Data System (ADS)
Foit, K.; Banaś, W.; Gwiazda, A.; Ćwikła, G.
2017-08-01
Off-line trajectory planning is often carried out for economic and practical reasons: the robot is not taken out of the production process, and the operator benefits from testing programs in a virtual environment. On the other hand, dedicated off-line programming and simulation software is often limited in features and intended only for rough checking of the program. It should be expected that the arm of the real robot's manipulator will realize the trajectory in a different manner: the acceleration and deceleration phases may trigger vibrations of the kinematic chain that can affect the precision of effector positioning and degrade the quality of the process performed by the robot. The purpose of this work is the analysis of selected cases in which the robot's effector was moved along a programmed path. The off-line-generated test trajectories have different arrangements of points; this approach allowed evaluation of the time needed to complete each of the tasks, as well as measurement of the vibration level of the robot's wrist. All tests were performed without a load. The conclusions of the experiment may be useful during trajectory planning in order to avoid critical configurations of points.
An ad hoc 3D-printed tool facilitates intraesophageal suturing in experimental surgery
Steinemann, D.C.; Müller, P.C.; Apitz, M.; Nickel, F.; Kenngott, H.G.; Müller-Stich, B.P.; Linke, G.R.
2018-01-01
Background Three-dimensional printing (3DP) has become popular for development of anatomic models, preoperative planning, and production of tailored implants. A novel laparoscopic, transgastric procedure for distal esophageal mucosectomy was developed. During this procedure a space holder had to be introduced into the distal esophagus for exposure during suturing. The production process and evaluation of a 3DP space holder are described herein. Material and methods Computer-aided design software was used to develop models printed from polylactic acid. The prototype was adapted after testing in a cadaveric model. Subsequently the device was evaluated in a non-survival porcine model. A mucosal purse-string suture was placed as orally as possible in the esophagus, in the intervention group with and in the control group without use of the tool (n=8 each). The distance of the stitches from the Z-line was measured. The variability of stitches indicated the suture quality. Results The median maximum distance from Z-line to purse-string suture was larger in the intervention group (5.0 [3.3-6.4] versus 2.4 [2.0-4.1] cm; P=0.013). The time taken to place the sutures was shorter in the control group (P<0.001). Stitch variance tended to be greater in the intervention group (2.3 [0.9-2.5] versus 0.7 [0.2-0.4] cm; P=0.051). The time required for design and production of a tailored tool was below 24 h. Conclusions 3DP in experimental surgery enables rapid production, permits repeated adaptation until a tailored tool is obtained, and ensures independence from industrial partners. With the aid of the space holder, more orally located esophageal lesions came within reach. PMID:29433890
Studies on the finite element simulation in sheet metal stamping processes
NASA Astrophysics Data System (ADS)
Huang, Ying
The sheet metal stamping process plays an important role in modern industry. With the ever-increasing demand for shape complexity, product quality, and new materials, the traditional trial-and-error method for setting up a sheet metal stamping process is no longer efficient. As a result, the Finite Element Modeling (FEM) method is now widely used. From a physical point of view, the formability and the quality of a product are influenced by several factors; the design of the product in the initial stage and the motion of the press during the production stage are two of these crucial factors. This thesis focuses on the numerical simulation of these two factors using FEM. Currently, a number of commercial FEM software systems are available on the market. These systems are based on incremental FEM, which models the sheet metal stamping process in small incremental steps. Although incremental FEM is accurate, it is not suitable for initial conceptual design because it requires detailed design parameters and long computation times. As a result, another type of FEM, called the inverse or one-step FEM method, has been proposed. While less accurate than the incremental method, it requires much less computation and hence has great potential. However, it also faces a number of unsolved problems that limit its application, which motivates the presented research. After a review of the basic theory of the inverse method, a new modified arc-length search method is proposed to find a better initial solution. Methods for dealing with vertical walls are also discussed and presented. Then, a generalized multi-step inverse FEM method is proposed. It addresses two key obstacles: determining the initial solution of the intermediate three-dimensional configurations, and controlling the movement of nodes so that they slide only on constraint surfaces during the search by Newton-Raphson iteration. The computer implementation of the generalized multi-step inverse FEM is also presented, and the effectiveness of the new method is validated by comparison with simulation results from a commercial software system. Beyond the product design, the punch motion (including punch speed and punch trajectory) of the stamping press also has a significant effect on the formability and quality of the product. In fact, this is one of the major reasons why hydraulic presses and/or servo presses are used for parts that demand high quality. To reveal the quantitative correlation between punch motion and part quality, the Cowper-Symonds strain-rate constitutive model and the implicit dynamic incremental FEM are combined. The effects of punch motion on part quality, especially the plastic strain distribution and the potential springback, are investigated for the deep drawing and bending processes respectively. A qualitative relationship between punch motion and part quality is also derived. The reaction force of the punch motion causes dynamic deformation of the press during stamping, which in turn influences part quality as well. This dynamic information, in the form of strain signals, is an important basis for on-line monitoring of part quality. Using the actual force as input to the press, incremental FEM is needed to predict the strain of the press.
The result is validated by means of experiments and can be used to assist on-line monitoring.
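For reference, the Cowper-Symonds relation mentioned above scales the static flow stress by a strain-rate-dependent factor. The sketch below is a generic illustration of that relation; the default constants C and p are values often quoted for mild steel and are assumptions here, not the parameters used in the thesis.

```python
def cowper_symonds_stress(static_stress_mpa, strain_rate_per_s,
                          c_per_s=40.0, p=5.0):
    """Dynamic flow stress per the Cowper-Symonds relation:
        sigma_d = sigma_s * (1 + (eps_dot / C) ** (1 / p))
    C and p are material constants; the defaults are the values often
    quoted for mild steel and serve only as an illustration.
    """
    return static_stress_mpa * (1.0 + (strain_rate_per_s / c_per_s) ** (1.0 / p))

# Example: flow stress at a strain rate of 10 /s for a 250 MPa static yield stress.
print(cowper_symonds_stress(250.0, 10.0))
```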
Software Engineering Laboratory (SEL) compendium of tools, revision 1
NASA Technical Reports Server (NTRS)
1982-01-01
A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.
Vector Data Model: A New Model of HDF-EOS to Support GIS Applications in EOS
NASA Astrophysics Data System (ADS)
Chi, E.; Edmonds, R. D.
2001-05-01
NASA's Earth Science Data Information System (ESDIS) project has an active program of research and development of systems for the storage and management of Earth science data for the Earth Observing System (EOS) mission, a key program of NASA's Earth Science Enterprise. EOS has adopted an extension of the Hierarchical Data Format (HDF) as the format of choice for standard product distribution. Three new EOS-specific datatypes - point, swath, and grid - have been defined within the HDF framework. The enhanced data format is named HDF-EOS. Geographic Information Systems (GIS) are used by Earth scientists in EOS data product generation, visualization, and analysis. There are two major data types in GIS applications, raster and vector. The current HDF-EOS handles only the raster type, in the swath data model. The vector data model was identified and developed as a new HDF-EOS format to meet the requirements of scientists working with EOS data products in vector format. The vector model is designed using a topological data structure, which defines the spatial relationships among points, lines, and polygons. The three major topological concepts that the vector model adopts are: a) lines connect to each other at nodes (connectivity), b) lines that connect to surround an area define a polygon (area definition), and c) lines have direction and left and right sides (contiguity). The vector model is implemented in HDF by mapping the conceptual model to HDF internal data models and structures, viz. Vdata, Vgroup, and their associated attribute structures. The point, line, and polygon geometry and attribute data are stored in similar tables. Further, the vector model utilizes the structure and product metadata that characterize HDF-EOS. Both types of metadata are encoded in text format using the Object Description Language (ODL) and stored as global attributes in HDF-EOS files. EOS has developed a series of routines for storing, retrieving, and manipulating vector data in the categories of access, definition, basic I/O, inquiry, and subsetting. The routines have been tested and form a package, HDF-EOS/Vector. The alpha version of HDF-EOS/Vector has been distributed through the HDF-EOS project web site at http://hdfeos.gsfc.nasa.gov. We are also developing translators between the HDF-EOS vector format and a variety of GIS formats, such as Shapefile. The HDF-EOS vector model enables EOS scientists to deliver EOS data in a form ready for Earth scientists to analyze using GIS software, and also provides the EOS project with a mechanism to store GIS data products in a meaningful vector format with significant economy in storage.
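As a purely illustrative sketch of the arc-node topology described above (connectivity, area definition, contiguity), the Python structures below show how lines can carry from/to nodes and left/right polygon references; this is a generic illustration, not the actual HDF-EOS Vdata/Vgroup layout or its API.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Node:
    node_id: int
    x: float
    y: float

@dataclass
class Arc:
    """A line: connects two nodes (connectivity) and records the polygon
    on each side of its direction of travel (contiguity)."""
    arc_id: int
    from_node: int
    to_node: int
    left_polygon: Optional[int]
    right_polygon: Optional[int]
    vertices: List[Tuple[float, float]] = field(default_factory=list)

@dataclass
class Polygon:
    """An area defined by the arcs that surround it (area definition)."""
    polygon_id: int
    arc_ids: List[int]

# Hypothetical example: a unit square (polygon 1) bounded by four arcs,
# with nothing (None) on the outside of each arc.
nodes = [Node(1, 0, 0), Node(2, 1, 0), Node(3, 1, 1), Node(4, 0, 1)]
arcs = [Arc(1, 1, 2, 1, None), Arc(2, 2, 3, 1, None),
        Arc(3, 3, 4, 1, None), Arc(4, 4, 1, 1, None)]
square = Polygon(1, [1, 2, 3, 4])
```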
User systems guidelines for software projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrahamson, L.
1986-04-01
This manual presents guidelines for software standards, developed so that software project-development teams, and the management involved in approving the software, can have a generalized view of all phases of the software production procedure and of the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussion of each phase includes examples illustrating the recommended guidelines. 45 refs. (DWL)
Product line cost estimation: a standard cost approach.
Cooper, J C; Suver, J D
1988-04-01
Product line managers often must make decisions based on inaccurate cost information. A method is needed to determine costs more accurately. By using a standard costing model, product line managers can better estimate the cost of intermediate and end products, and hence better estimate the costs of the product line.
A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code
ERIC Educational Resources Information Center
Fischer, Michael
2011-01-01
The difficulty in writing defect-free software has long been acknowledged by both academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in an attempt to manage these software projects. Software metrics are a tool that has…
Ten recommendations for software engineering in research.
Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph
2014-01-01
Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.
Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry
NASA Astrophysics Data System (ADS)
Kirillov, V. A.; Dubovsky, S. V.
2016-07-01
Software was developed in which initial EPR spectra of tooth enamel are deconvoluted by nonlinear simulation: line shapes and signal amplitudes in the model initial spectrum are calculated, the regression coefficient is evaluated, and individual spectra are summed. Software validation demonstrated that doses calculated using it agreed very well both with the applied radiation doses and with the doses reconstructed by the method of additive doses.
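A minimal sketch of what such a deconvolution can look like, assuming a two-component nonlinear least-squares fit with Gaussian line shapes; the model, parameter names, and use of SciPy are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, width):
    return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

def two_component_model(x, a1, c1, w1, a2, c2, w2):
    # e.g. a native background signal plus a dosimetric signal (assumed shapes)
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

def deconvolve(field_mT, spectrum, p0):
    """Fit the two-component model to the measured spectrum and return
    the fitted parameters plus the coefficient of determination (R^2)."""
    popt, _ = curve_fit(two_component_model, field_mT, spectrum, p0=p0)
    fit = two_component_model(field_mT, *popt)
    ss_res = np.sum((spectrum - fit) ** 2)
    ss_tot = np.sum((spectrum - spectrum.mean()) ** 2)
    return popt, 1.0 - ss_res / ss_tot
```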
Distributed operating system for NASA ground stations
NASA Technical Reports Server (NTRS)
Doyle, John F.
1987-01-01
NASA ground stations are characterized by ever changing support requirements, so application software is developed and modified on a continuing basis. A distributed operating system was designed to optimize the generation and maintenance of those applications. Unusual features include automatic program generation from detailed design graphs, on-line software modification in the testing phase, and the incorporation of a relational database within a real-time, distributed system.
Environmental databases and other computerized information tools
NASA Technical Reports Server (NTRS)
Clark-Ingram, Marceia
1995-01-01
Increasing environmental legislation has brought about the development of many new environmental databases and software application packages to aid in the quest for environmental compliance. These databases and software packages are useful tools and applicable to a wide range of environmental areas from atmospheric modeling to materials replacement technology. The great abundance of such products and services can be very overwhelming when trying to identify the tools which best meet specific needs. This paper will discuss the types of environmental databases and software packages available. This discussion will also encompass the affected environmental areas of concern, product capabilities, and hardware requirements for product utilization.
Improving the Product Documentation Process of a Small Software Company
NASA Astrophysics Data System (ADS)
Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula
Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies' financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open-source, web-based bug-tracking system that was customized to be used as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products that is striving for SPI.
Lebozec, Kristell; Jandrot-Perrus, Martine; Avenard, Gilles; Favre-Bulle, Olivier; Billiald, Philippe
2018-09-25
Monoclonal antibody fragments (Fabs) are a promising class of therapeutic agents. Fabs are aglycosylated proteins, and many expression platforms have been developed for them, including prokaryotic, yeast, and mammalian cells. However, these platforms are not equivalent in terms of cell line development and culture time, product quality, and possibly cost of production, all of which greatly influence the success of a drug candidate's pharmaceutical development. This study is an assessment of the humanized Fab fragment ACT017 produced from two microorganisms (Escherichia coli and Pichia pastoris) and one mammalian cell host (CHO). Following low-scale production and Protein L-affinity purification under generic conditions, physico-chemical and functional quality assessments were carried out prior to an economic analysis of industrial-scale production using specialized software (Biosolve, Biopharm Services, UK). Results show a higher titer when using E. coli, but associated with high heterogeneity of the protein content recovered in the supernatant. We also observed glycoforms of the Fab produced from P. pastoris, while the Fab secreted from CHO was the most homogeneous, despite a much longer culture time and a slightly higher estimated cost of goods. This study may help inform future pharmaceutical development of this class of therapeutic proteins. Copyright © 2018 Elsevier B.V. All rights reserved.
Detailed Life Cycle Assessment of Bounty Paper Towel ...
Life Cycle Assessment (LCA) is a well-established and informative method for understanding the environmental impacts of consumer products across the entire value chain. However, companies committed to sustainability are interested in additional methods for examining the impacts of their products and activities. Methods that build on LCA's strengths and illuminate other connected but less understood facets, related to social and economic impacts, would provide greater value to decision-makers. This study is an LCA that calculates the potential impacts associated with Bounty® paper towels from two facilities with different production lines, an older one (Albany, Georgia) representing established technology and the other (Box Elder, Utah) a newer, state-of-the-art platform. The study is unique in that it includes use of Industrial Process Systems Assessment (IPSA) and new electricity and pulp data, is modeled in open source software, and is the basis for the development of new integrated sustainability metrics (published separately). The new metrics can guide supply chain and manufacturing enhancements, and product design related to environmental protection and resource sustainability. Results of the LCA indicate that Box Elder had better environmental impact scores for air emission indicators, except for particulate matter. Albany had lower water use impacts. After normalization of the results, fossil fuel depletion is the most critical environmental indicator. Pulp production, e
Nayak, Pratibha; Barker, Dianne C; Huang, Jidong; Kemp, Catherine B; Wagener, Theodore L; Chaloupka, Frank
2018-03-26
While the market share of electronic vapor products (EVPs), sold primarily through vape shops and other outlets, has increased rapidly, these products remained largely unregulated until 2016. This study, conducted prior to announcement of the deeming regulations, provides insights into vape shop operator attitudes toward potential government regulations of EVPs. In 2015, we conducted 37 in-person interviews of vape shop operators across nine US cities. Shops were identified through extensive web-searches. We used QSR International's NVivo 11 qualitative data analysis software to analyze the transcripts. Many vape shop operators viewed regulations requiring safe production of e-liquids, child-resistant bottles and listing e-juice ingredients as acceptable. They disagreed with the elimination of free samples and bans on flavored e-liquid sales, which generate significant revenue for their stores. Many held negative perceptions of pre-market review of new product lines and EVP-specific taxes. All agreed that EVPs should not be sold to minors, but most felt that owners should not be fined if minors visited vape shops. Findings from this study offer insights into the acceptability of proposed regulations, as well as barriers to effective regulation implementation.
Pharmacy component of a hospital end-product cost-accounting system.
Smith, J E; Sheaffer, S L; Meyer, G E; Giorgilli, F
1988-04-01
Determination of pharmacy department standard costs for providing drug products to patients at Thomas Jefferson University Hospital in Philadelphia is described. The hospital is implementing a cost-accounting system (CAS) that uses software developed at the New England Medical Center, Boston. The pharmacy identified nine categories of intermediate products on the basis of labor consumption. Standard labor times for each product category are based on measurement or estimation of time for each task in the preparation and distribution of a dose. Variable-labor standard time was determined by adjusting the cumulative time for the tasks to account for nonproductive time and nonroutine activities, and a variable-labor standard cost for each category was calculated. The standard cost per dose included the costs of labor and supplies (variable and fixed) and equipment; this standard cost plus the acquisition cost of a drug line item is the total intermediate product cost. Because the CAS is based on the hospital's patient charges, clinical pharmacy services are excluded. Intermediate products that substantially affect end-product costs (costs per patient case) will be identified for inclusion in CAS reports. The CAS will give a more accurate picture of resource consumption, enabling managers to focus their efforts to improve efficiency and productivity and reduce supply use; it could also improve the accuracy of the budgeting process. The CAS will support hospital administration decisions about marketing end products and department managers' decisions about controlling intermediate-product costs.
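As a rough illustration of the arithmetic described above (a standard cost per dose, made up of labor, supply, and equipment standards, plus the drug acquisition cost), here is a minimal sketch; the function and the example figures are hypothetical, not values from the hospital's system.

```python
def intermediate_product_cost(variable_labor_std, variable_supply_std,
                              fixed_overhead_std, equipment_std,
                              drug_acquisition_cost):
    """Total intermediate product cost = standard cost per dose
    (labor + supplies + equipment, variable and fixed) plus the
    acquisition cost of the drug line item."""
    standard_cost_per_dose = (variable_labor_std + variable_supply_std
                              + fixed_overhead_std + equipment_std)
    return standard_cost_per_dose + drug_acquisition_cost

# Hypothetical example: $2.10 labor, $0.45 supplies, $0.30 fixed overhead,
# $0.15 equipment, and a $12.00 drug acquisition cost per dose.
print(intermediate_product_cost(2.10, 0.45, 0.30, 0.15, 12.00))
```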
Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.
Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L
1995-01-01
This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification took the form of an intercomparison with two independent thermoluminescent (TLD) dosimetry systems under TBI conditions.
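A minimal sketch of the kind of signal-to-dose conversion such software performs, assuming a simple calibration factor and multiplicative correction factors; the factor names and example numbers are hypothetical, not the authors' method.

```python
def diode_reading_to_dose(signal_nc, calibration_cgy_per_nc,
                          correction_factors=()):
    """Convert an integrated diode signal (charge, nC) to absorbed dose (cGy).
    Dose = signal * calibration factor * product of correction factors
    (e.g. temperature, SSD, field size); all factors here are assumed
    for illustration only."""
    dose = signal_nc * calibration_cgy_per_nc
    for factor in correction_factors:
        dose *= factor
    return dose

# Hypothetical example: 52.3 nC reading, 2.0 cGy/nC calibration,
# temperature and field-size corrections of 1.01 and 0.98.
print(diode_reading_to_dose(52.3, 2.0, (1.01, 0.98)))
```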
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Effective organizational solutions for implementation of DBMS software packages
NASA Technical Reports Server (NTRS)
Jones, D.
1984-01-01
The Space Telescope management information system development effort serves as a guide for discussing effective organizational solutions used in implementing DBMS software. The focus is on the importance of strategic planning. The value of constructing an information system architecture that conforms to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring of production software control, production data control, and software enhancements are also discussed.
OLTARIS: On-Line Tool for the Assessment of Radiation in Space
NASA Technical Reports Server (NTRS)
Sandridge, Chris A.; Blattnig, Steve R.; Clowdsley, Martha S.; Norbury, John; Qualis, Garry D.; Simonsen, Lisa C.; Singleterry, Robert C.; Slaba, Tony C.; Walker, Steven A.; Badavi, Francis F.;
2009-01-01
The effects of ionizing radiation on humans in space are a major technical challenge for exploration to the moon and beyond. The radiation shielding team at NASA Langley Research Center has been working for over 30 years to develop techniques that can efficiently assist the engineer throughout the entire design process. OLTARIS: On-Line Tool for the Assessment of Radiation in Space is a new NASA website (http://oltaris.larc.nasa.gov) that allows engineers and physicists to access a variety of tools and models to study the effects of ionizing space radiation on humans and shielding materials. The site is intended to be an analysis and design tool for those working on radiation issues for current and future manned missions, as well as a research tool for developing advanced material and shielding concepts. The site, along with the analysis tools and models within it, has been developed using strict software practices to ensure reliable and reproducible results in a production environment. It has also been developed as a modular system so that models and algorithms can be easily added or updated.
NASA Astrophysics Data System (ADS)
Sinquin, J. M.; Sorribas, J.
2014-12-01
Within the EUROFLEETS project, and linked to the EMODNet and Geo-Seas European projects, GLOBE (Global Oceanographic Bathymetry Explorer) is an innovative, generic software package. I. INTRODUCTION: The first version can be used onboard during the survey to get a quick overview of acquired data, or later, to re-process data with accurate environmental data. II. MAIN FUNCTIONALITIES: The version shown at AGU 2014 presents several key items: 3D visualization (multi-layer DTMs from EMODNet, water-column echograms, seismic lines, ...); a bathymetry plug-in (manual and automatic data cleaning, integration of the EMODNet methodology to introduce the CDI concept, filtering, splines, data gridding, ...); backscatter with compensation; a tectonic toolset; a photo/video plug-in; 3D navigation including tide correction, MRU corrections, and GPS offset corrections; and WMS/WFS interfaces. III. FOCUS ON EMODNET: One of the main objectives of the EMODNet European project is to elaborate a common processing flow for gridding bathymetry data and for generating a harmonized digital terrain model (DTM): this flow includes the definition of the DTM characteristics (geodetic parameters, grid spacing, interpolation and smoothing parameters…) and also the specification of a set of layers which enrich the basic depth layer: statistical layers (sounding density, standard deviation, …) and an innovative data-source layer which indicates the source of the soundings and which is linked to the associated metadata. GLOBE provides the required tools for applying this methodology and is offered to the project partners. V. FOCUS ON THE TECTONIC TOOLSET: The tectonic toolset allows the user to associate any DTM with 3D rotation movements. These rotations represent the movement of tectonic plates along discrete time lines (from 200 million years ago to now). One rotation is described by its axis, its angle, and its date. GLOBE can display the movement of tectonic plates, represented by a DTM, at different geological times. The same movements can be applied to GeoTIFF images or GMT files representing grids of any kind of data. The free software GLOBE3D is a product of Ifremer and is funded by Carnot-Edrome.
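To illustrate the finite rotations described above (a rotation defined by an axis, an angle, and a date), here is a minimal sketch that rotates a longitude/latitude point about an Euler pole using Rodrigues' formula; it is a generic illustration, not GLOBE's implementation, and the example numbers are hypothetical.

```python
import numpy as np

def lonlat_to_vec(lon_deg, lat_deg):
    lon, lat = np.radians([lon_deg, lat_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def vec_to_lonlat(v):
    return (np.degrees(np.arctan2(v[1], v[0])),
            np.degrees(np.arcsin(v[2] / np.linalg.norm(v))))

def rotate_point(lon_deg, lat_deg, pole_lon, pole_lat, angle_deg):
    """Rotate a point on the sphere about an Euler pole by angle_deg,
    using Rodrigues' rotation formula."""
    p = lonlat_to_vec(lon_deg, lat_deg)
    k = lonlat_to_vec(pole_lon, pole_lat)
    a = np.radians(angle_deg)
    rotated = (p * np.cos(a) + np.cross(k, p) * np.sin(a)
               + k * np.dot(k, p) * (1.0 - np.cos(a)))
    return vec_to_lonlat(rotated)

# Hypothetical example: rotate the point (10E, 20N) by 30 degrees
# about a pole at the geographic north pole; longitude shifts by 30.
print(rotate_point(10.0, 20.0, 0.0, 90.0, 30.0))
```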
The MOLGENIS toolkit: rapid prototyping of biosoftware at the push of a button
2010-01-01
Background There is a huge demand on bioinformaticians to provide their biologists with user friendly and scalable software infrastructures to capture, exchange, and exploit the unprecedented amounts of new *omics data. We here present MOLGENIS, a generic, open source, software toolkit to quickly produce the bespoke MOLecular GENetics Information Systems needed. Methods The MOLGENIS toolkit provides bioinformaticians with a simple language to model biological data structures and user interfaces. At the push of a button, MOLGENIS’ generator suite automatically translates these models into a feature-rich, ready-to-use web application including database, user interfaces, exchange formats, and scriptable interfaces. Each generator is a template of SQL, JAVA, R, or HTML code that would require much effort to write by hand. This ‘model-driven’ method ensures reuse of best practices and improves quality because the modeling language and generators are shared between all MOLGENIS applications, so that errors are found quickly and improvements are shared easily by a re-generation. A plug-in mechanism ensures that both the generator suite and generated product can be customized just as much as hand-written software. Results In recent years we have successfully evaluated the MOLGENIS toolkit for the rapid prototyping of many types of biomedical applications, including next-generation sequencing, GWAS, QTL, proteomics and biobanking. Writing 500 lines of model XML typically replaces 15,000 lines of hand-written programming code, which allows for quick adaptation if the information system is not yet to the biologist’s satisfaction. Each application generated with MOLGENIS comes with an optimized database back-end, user interfaces for biologists to manage and exploit their data, programming interfaces for bioinformaticians to script analysis tools in R, Java, SOAP, REST/JSON and RDF, a tab-delimited file format to ease upload and exchange of data, and detailed technical documentation. Existing databases can be quickly enhanced with MOLGENIS generated interfaces using the ‘ExtractModel’ procedure. Conclusions The MOLGENIS toolkit provides bioinformaticians with a simple model to quickly generate flexible web platforms for all possible genomic, molecular and phenotypic experiments with a richness of interfaces not provided by other tools. All the software and manuals are available free as LGPLv3 open source at http://www.molgenis.org. PMID:21210979
Product line management in oncology: a Canadian experience.
Wodinsky, H B; Egan, D; Markel, F
1988-01-01
More competition for finite resources and increasing regulation have led many hospitals to consider strategic reorganization. Recently, one common reorganization strategy has been "product line management." Product line management can be broadly defined in terms of centralized program management, planning, and marketing strategies. In Canada, while the strategic driving forces may be different, a product line management alternative has arisen in one of the most potentially complex product lines, cancer services. This article compares and contrasts the theoretical model for product line management development, with special reference to cancer services, with the experience of one Canadian medical center and cancer center.
NASA Technical Reports Server (NTRS)
Mallasch, Paul G.
1993-01-01
This volume contains the complete software system documentation for the Federal Communications Commission (FCC) Transponder Loading Data Conversion Software (FIX-FCC). This software was written to facilitate the formatting and conversion of FCC Transponder Occupancy (Loading) Data before it is loaded into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). The information that the FCC supplies to NASA is in report form and must be converted into a form readable by the database management software used in the GSOSTATS application. Both the User's Guide and the Software Maintenance Manual are contained in this document. This volume of documentation passed an independent quality assurance review and certification by the Product Assurance and Security Office of the Planning Research Corporation (PRC). The manuals were reviewed for format, content, and readability. The Software Management and Assurance Program (SMAP) life cycle and documentation standards were used in the development of this document; accordingly, these standards were also used in the review. Refer to the System/Software Test/Product Assurance Report for the Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS) for additional information.