Science.gov

Sample records for agile software process

  1. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  2. Software “Best” Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  3. Teaching Agile Software Development: A Case Study

    ERIC Educational Resources Information Center

    Devedzic, V.; Milenkovic, S. R.

    2011-01-01

    This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…

  4. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project that has employed agile practices since 2004. We started with the assumption that a lighter process is better, focusing on evolving code and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  5. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore, if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project that has employed agile practices since 2004. We started with the assumption that a lighter process is better, focusing on evolving code and only adding process elements as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  6. Implementing Kanban for agile process management within the ALMA Software Operations Group

    NASA Astrophysics Data System (ADS)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, these tasks vary widely in importance, lifespan and complexity. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found and the solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
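
    The core Kanban mechanic described above, limiting work in progress (WIP) so that demand on the team is balanced against the throughput of delivered work, can be illustrated with a minimal sketch. The board columns, limits, and tickets below are hypothetical and are not taken from the ALMA tooling.

        class KanbanBoard:
            """Minimal Kanban board: tickets move left to right, each column has a WIP limit."""

            def __init__(self, wip_limits):
                # e.g. {"backlog": None, "in_progress": 2, "done": None}; None means unlimited
                self.wip_limits = wip_limits
                self.columns = {name: [] for name in wip_limits}

            def add(self, ticket, column="backlog"):
                self._check_limit(column)
                self.columns[column].append(ticket)

            def move(self, ticket, src, dst):
                self._check_limit(dst)
                self.columns[src].remove(ticket)
                self.columns[dst].append(ticket)

            def _check_limit(self, column):
                limit = self.wip_limits[column]
                if limit is not None and len(self.columns[column]) >= limit:
                    raise RuntimeError(f"WIP limit reached in '{column}'; finish work before pulling more")

        board = KanbanBoard({"backlog": None, "in_progress": 2, "done": None})
        board.add("fix archive ingest script")
        board.add("automate shift report")
        board.move("fix archive ingest script", "backlog", "in_progress")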

  7. Lean and Agile Development of the AITS Ground Software System

    NASA Astrophysics Data System (ADS)

    Richters, Mark; Dutruel, Etienne; Mecredy, Nicolas

    2013-08-01

    We present the ongoing development of a new ground software system used for integrating, testing and operating spacecraft. The Advanced Integration and Test Services (AITS) project aims at providing a solution for electrical ground support equipment and mission control systems in future Astrium Space Transportation missions. Traditionally ESA ground or flight software development projects are conducted according to a waterfall-like process as specified in the ECSS-E-40 standard promoted by ESA in the European industry. In AITS a decision was taken to adopt an agile development process. This work could serve as a reference for future ESA software projects willing to apply agile concepts.

  8. Impact of Agile Software Development Model on Software Maintainability

    ERIC Educational Resources Information Center

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…

  9. Distributed agile software development for the SKA

    NASA Astrophysics Data System (ADS)

    Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David

    2012-09-01

    The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, like industries and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but still they have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment to allow for efficient detection and tracking of interface and integration issues, in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment on the other hand can provide much faster feedback of integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development and experience gained in the development of large scale commercial software projects.

  10. Agile

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2013-01-01

    This is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined and efficient.

  11. Applying Agile Methods to Weapon/Weapon-Related Software

    SciTech Connect

    Adams, D; Armendariz, M; Blackledge, M; Campbell, F; Cloninger, M; Cox, L; Davis, J; Elliott, M; Granger, K; Hans, S; Kuhn, C; Lackner, M; Loo, P; Matthews, S; Morrell, K; Owens, C; Peercy, D; Pope, G; Quirk, R; Schilling, D; Stewart, A; Tran, A; Ward, R; Williamson, M

    2007-05-02

    This white paper provides information and guidance to the Department of Energy (DOE) sites on Agile software development methods and the impact of their application on weapon/weapon-related software development. The purpose of this white paper is to provide an overview of Agile methods, examine the accepted interpretations/uses/practices of these methodologies, and discuss the applicability of Agile methods with respect to Nuclear Weapons Complex (NWC) Technical Business Practices (TBPs). It also provides recommendations on the application of Agile methods to the development of weapon/weapon-related software.

  12. Agile: From Software to Mission System

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Shirley, Mark H.; Hobart, Sarah Groves

    2016-01-01

    The Resource Prospector (RP) is an in-situ resource utilization (ISRU) technology demonstration mission, designed to search for volatiles at the Lunar South Pole. This is NASA's first near real time tele-operated rover on the Moon. The primary objective is to search for volatiles at one of the Lunar Poles. The combination of short mission duration, a solar powered rover, and the requirement to explore shadowed regions makes for an operationally challenging mission. To maximize efficiency and flexibility in Mission System design and thus to improve the performance and reliability of the resulting Mission System, we are tailoring Agile principles that we have used effectively in ground data system software development and applying those principles to the design of elements of the mission operations system.

  13. A Case Study of Coordination in Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Hole, Steinar; Moe, Nils Brede

    Global Software Development (GSD) has gained significant popularity as an emerging paradigm. Companies also show interest in applying agile approaches in distributed development to combine the advantages of both approaches. However, in their most radical forms, agile and GSD can be placed at opposite ends of a plan-based/agile spectrum because of how work is coordinated. We describe how three GSD projects applying agile methods coordinate their work. We found that trust is needed to reduce the need for standardization and direct supervision when coordinating work in a GSD project, and that electronic chatting supports mutual adjustment. Further, co-location and modularization mitigate communication problems, enable agility in at least part of a GSD project, and render the implementation of Scrum of Scrums possible.

  14. A Capstone Course on Agile Software Development Using Scrum

    ERIC Educational Resources Information Center

    Mahnic, V.

    2012-01-01

    In this paper, an undergraduate capstone course in software engineering is described that not only exposes students to agile software development, but also makes it possible to observe the behavior of developers using Scrum for the first time. The course requires students to work as Scrum Teams, responsible for the implementation of a set of user…

  15. How Can Agile Practices Minimize Global Software Development Co-ordination Risks?

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Babar, Muhammad Ali; Verner, June

    The distribution of project stakeholders in Global Software Development (GSD) projects poses significant risks related to project communication, coordination and control processes. There is growing interest in applying agile practices in GSD projects in order to leverage the advantages of both approaches. In some cases, GSD project managers use agile practices to reduce project distribution challenges. We use an existing coordination framework to identify GSD coordination problems due to temporal, geographical and socio-cultural distances. An industry-based case study is used to describe, explore and explain the use of agile practices to reduce development coordination challenges.

  16. Opening up the Agile Innovation Process

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Donnellan, Brian; Morgan, Lorraine; Wang, Xiaofeng

    The objective of this panel is to discuss how firms can operate both an open and agile innovation process. In an era of unprecedented changes, companies need to be open and agile in order to adapt rapidly and maximize their innovation processes. Proponents of agile methods claim that one of the main distinctions between agile methods and their traditional bureaucratic counterparts is their drive toward creativity and innovation. However, agile methods are rarely adopted in their textbook, "vanilla" format, and are usually adopted in part or are tailored or modified to suit the organization. While we are aware that this happens, there is still limited understanding of what is actually happening in practice. Using innovation adoption theory, this panel will discuss the issues and challenges surrounding the successful adoption of agile practices. In addition, this panel will report on the obstacles and benefits reported by over 20 industrial partners engaged in a pan-European research project into agile practices between 2006 and 2009.

  17. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  18. The NERV Methodology: Non-Functional Requirements Elicitation, Reasoning and Validation in Agile Processes

    ERIC Educational Resources Information Center

    Domah, Darshan

    2013-01-01

    Agile software development has become very popular around the world in recent years, with methods such as Scrum and Extreme Programming (XP). Literature suggests that functionality is the primary focus in Agile processes while non-functional requirements (NFR) are either ignored or ill-defined. However, for software to be of good quality both…

  19. Agile Software Development Methods: A Comparative Review

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  20. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments. PMID:16554690

  1. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  2. Chaste: using agile programming techniques to develop computational biology software.

    PubMed

    Pitt-Francis, Joe; Bernabeu, Miguel O; Cooper, Jonathan; Garny, Alan; Momtahan, Lee; Osborne, James; Pathmanathan, Pras; Rodriguez, Blanca; Whiteley, Jonathan P; Gavaghan, David J

    2008-09-13

    Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements for the wider physiome community. In this paper, we will begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then go on to describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than the traditional academic approaches. As a case study we present a project of our own, the Cancer, Heart and Soft Tissue Environment (Chaste), a library of computational biology software that began as an experiment in the use of agile programming methods. We present our experiences with a review of our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met and suggest that, in order to respond effectively to changing requirements, it is essential that these codes should be more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models--of the heart and other organs--and more efficient numerical techniques that are currently being developed by many research groups worldwide. PMID:18565813
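
    To make the test-driven style concrete, a minimal sketch of a test written alongside (or before) the code it exercises is shown below. The function, data, and tolerances are invented for illustration and use Python's unittest module; they are not taken from the Chaste codebase, which is written in C++.

        import unittest

        def action_potential_duration(times, voltages, threshold=-70.0):
            """Toy metric: time (ms) the membrane voltage stays above a threshold (mV)."""
            above = [t for t, v in zip(times, voltages) if v > threshold]
            return (max(above) - min(above)) if above else 0.0

        class TestActionPotentialDuration(unittest.TestCase):
            def test_simple_spike(self):
                times = [0.0, 1.0, 2.0, 3.0, 4.0]
                voltages = [-80.0, -20.0, 10.0, -20.0, -80.0]
                self.assertAlmostEqual(action_potential_duration(times, voltages), 2.0)

            def test_no_spike_gives_zero_duration(self):
                self.assertEqual(action_potential_duration([0.0, 1.0], [-80.0, -80.0]), 0.0)

        if __name__ == "__main__":
            unittest.main()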

  3. Software Product Line Engineering Approach for Enhancing Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Martinez, Jabier; Diaz, Jessica; Perez, Jennifer; Garbajosa, Juan

    One of the main principles of Agile methodologies is the early and continuous delivery of valuable software through short, time-framed iterations. After each iteration, a working product is delivered according to the requirements defined at the beginning of the iteration. Testing tools facilitate the task of checking if the system provides the expected behavior according to the specified requirements. However, since testing tools need to be adapted in order to test new working products in each iteration, a significant effort has to be invested. This work presents a Software Product Line Engineering (SPLE) approach that allows flexibility in the adaptation of testing tools to the working products in an iterative way. A case study is also presented using PLUM (Product Line Unified Modeller) as the tool suite for SPL implementation and management.

  4. Agile methods in biomedical software development: a multi-site experience report

    PubMed Central

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-01-01

    Background Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and the projects include both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914

  5. The Dilemma of High Level Planning in Distributed Agile Software Projects: An Action Research Study in a Danish Bank

    NASA Astrophysics Data System (ADS)

    Svejvig, Per; Fladkjær Nielsen, Ann-Dorte

    The chapter reports on an action research study with the aim of designing a high-level planning process in distributed and co-located software projects based on agile methods. The main contributions are the insight that the high-level planning process is highly integrated with other project disciplines and that specific steps have to be taken to apply the process in distributed projects, and the demonstration that the action research approach is indeed suitable for software process improvement.

  6. Towards a Framework for Using Agile Approaches in Global Software Development

    NASA Astrophysics Data System (ADS)

    Hossain, Emam; Ali Babar, Muhammad; Verner, June

    As agile methods and Global Software Development (GSD) become increasingly popular, GSD project managers have been exploring the viability of using agile approaches in their development environments. Despite the expected benefits of using an agile approach with a GSD project, the mechanisms for combining the two approaches are not clearly understood. To address this challenge, we propose a conceptual framework, based on the research literature. This framework is expected to aid a project manager in deciding what agile strategies are effective for a particular GSD project, taking into account project context. We use an industry-based case study to explore the components of our conceptual framework. Our case study is planned and conducted according to specific published case study guidelines. We identify the agile practices and agile supporting practices used by a GSD project manager in our case study and conclude with future research directions.

  7. Agile Development Processes: Delivering a Successful Data Management Platform Now and in the Future

    NASA Astrophysics Data System (ADS)

    Deaubl, E.; Lowry, S.

    2007-10-01

    Developing a flexible, extensible architecture for scientific data archival and management is a monumental task under older, big-design-up-front methodologies. We will describe how we are using agile development techniques in our service-oriented architecture (SOA)-based platform to integrate astronomer and operator input into the development process, deliver functional software earlier, and ensure that the software is maintainable and extensible in the future.

  8. Agile hardware and software systems engineering for critical military space applications

    NASA Astrophysics Data System (ADS)

    Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.

    2012-06-01

    The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and system engineering in a high risk technology application where utilizing and implementing new, untraditional development strategies was necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced costs. This paper discloses the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware) and created a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem due to long build times, the desire to use heritage hardware, and lack of effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and allow risk interactions to be mitigated in a disciplined manner. Examples of how these concepts were used during the MBD program will be delineated. Maintaining project momentum was essential to assess the constant non-recurring technological challenges which needed to be retired rapidly from the engineering risk liens. Development never slowed due to tactical assessment of the hardware with the adoption of the SCRUM technique. We adapted this concept as a representation of mitigation of technical risk while allowing for design freeze later in the program's development cycle. By using Agile Systems Engineering and Management techniques which enabled decisive action, the product development momentum was effectively used to produce two novel space vehicles in a fraction of the time with dramatically reduced cost.

  9. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model- Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  10. Collaboration, Communication and Co-ordination in Agile Software Development Practice

    NASA Astrophysics Data System (ADS)

    Robinson, Hugh; Sharp, Helen

    This chapter analyses the results of a series of observational studies of agile software development teams, identifying commonalities in collaboration, co-ordination and communication activities. Pairing and customer collaboration are focussed on to illustrate the nature of collaboration and communication, as are two simple physical artefacts that emerged through analysis as being an information-rich focal point for the co-ordination of collaboration and communication activities. The analysis shows that pairing has common characteristics across all teams, while customer collaboration differs between the teams depending on the application and organisational context of development.

  11. Planning and scheduling for agile manufacturers: The Pantex Process Model

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Tumquist, M.A.

    1998-02-01

    Effective use of resources that are shared among multiple products or processes is critical for agile manufacturing. This paper describes the development and implementation of a computerized model to support production planning in a complex manufacturing system at the Pantex Plant, a US Department of Energy facility. The model integrates two different production processes (nuclear weapon disposal and stockpile evaluation) that use common facilities and personnel at the plant. The two production processes are characteristic of flow-shop and job shop operations. The model reflects the interactions of scheduling constraints, material flow constraints, and the availability of required technicians and facilities. Operational results show significant productivity increases from use of the model.

  12. Insights into Global Health Practice from the Agile Software Development Movement

    PubMed Central

    Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter

    2016-01-01

    Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of ‘agile global health’ and reflect on the limitations, trade-offs, and implications of this approach. PMID:27134081

  13. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    SciTech Connect

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  14. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is at the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  15. RFID-Based Critical Path Expert System for Agility Manufacture Process Management

    NASA Astrophysics Data System (ADS)

    Cheng, Haifang; Xiang, Yuli

    This paper presents a critical path expert system for agility manufacture process management based on radio frequency identification (RFID) technology. The paper shows that agility manufacture processes can be made visible and controllable with RFID: the critical paths or activities can be easily identified and tracked by RFID tracing technology, and the expert system can then relieve bottlenecks in the task process by adjusting and reforming the critical path. Finally, the paper gives a simple application example of the system to discuss how to adjust the critical paths and how to make the process more agile and flexible with the critical path expert system. With an RFID-based critical path expert system, agility manufacture process management will be more effective and efficient.
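
    Independently of the RFID layer, the critical-path computation the system relies on can be sketched in a few lines: given activity durations and precedence constraints, the critical path is the longest path through the (acyclic) dependency graph. The activities and durations below are hypothetical.

        def critical_path(durations, predecessors):
            """Return (length, activities) of the longest path through an acyclic activity graph."""
            finish = {}      # earliest finish time per activity (memoised)
            best_pred = {}   # predecessor on the longest path into each activity

            def earliest_finish(act):
                if act not in finish:
                    preds = predecessors.get(act, [])
                    start = max((earliest_finish(p) for p in preds), default=0)
                    best_pred[act] = max(preds, key=earliest_finish) if preds else None
                    finish[act] = start + durations[act]
                return finish[act]

            end = max(durations, key=earliest_finish)
            path, act = [], end
            while act is not None:
                path.append(act)
                act = best_pred[act]
            return finish[end], list(reversed(path))

        durations = {"pick parts": 2, "assemble": 5, "inspect": 1, "pack": 2}
        predecessors = {"assemble": ["pick parts"], "inspect": ["assemble"], "pack": ["inspect"]}
        print(critical_path(durations, predecessors))  # (10, ['pick parts', 'assemble', 'inspect', 'pack'])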

  16. COTS software selection process.

    SciTech Connect

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated a great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage off the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with concise, structured, step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
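
    One common way to make the evaluate-and-rank step concrete is a weighted scoring matrix: each candidate is scored against weighted criteria and the totals are sorted. The criteria, weights, and products below are invented for illustration and are not taken from the process described in the paper.

        def rank_candidates(weights, scores):
            """Rank COTS candidates by weighted score (weights sum to 1, scores on a 1-5 scale)."""
            totals = {
                product: sum(weights[criterion] * value for criterion, value in criteria.items())
                for product, criteria in scores.items()
            }
            return sorted(totals.items(), key=lambda item: item[1], reverse=True)

        weights = {"requirements fit": 0.4, "vendor support": 0.2, "integration effort": 0.25, "cost": 0.15}
        scores = {
            "Product A": {"requirements fit": 4, "vendor support": 3, "integration effort": 5, "cost": 2},
            "Product B": {"requirements fit": 5, "vendor support": 4, "integration effort": 2, "cost": 4},
        }
        for product, total in rank_candidates(weights, scores):
            print(f"{product}: {total:.2f}")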

  17. Software process assessments

    NASA Technical Reports Server (NTRS)

    Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.

    1992-01-01

    Software process assessments (SPA's) are part of an ongoing program of continuous quality improvements in AT&T. Their use was found to be very beneficial by software development organizations in identifying the issues facing the organization and the actions required to increase both quality and productivity in the organization.

  18. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.
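
    A toy discrete-time version of such a dynamics model (staffing drives both progress and error injection, while review removes a fraction of the latent errors each step) might look like the sketch below; the coefficients are arbitrary illustrations, not SLICS parameters.

        def simulate(total_size=1000.0, staff=5, productivity=8.0,
                     errors_per_unit=0.2, review_efficiency=0.3, max_steps=100):
            """Toy project dynamics: completed work, latent errors, and errors found per step."""
            completed, latent, found = 0.0, 0.0, 0.0
            for step in range(1, max_steps + 1):
                work = min(staff * productivity, total_size - completed)
                completed += work
                latent += work * errors_per_unit        # errors injected along with new work
                caught = latent * review_efficiency     # a fraction caught by review and test
                latent -= caught
                found += caught
                if completed >= total_size and latent < 1.0:
                    return step, found, latent
            return max_steps, found, latent

        steps, found, remaining = simulate()
        print(f"finished in {steps} steps; {found:.0f} errors found, {remaining:.1f} still latent")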

  19. Accelerating Software Development through Agile Practices--A Case Study of a Small-Scale, Time-Intensive Web Development Project at a College-Level IT Competition

    ERIC Educational Resources Information Center

    Zhang, Xuesong; Dorn, Bradley

    2012-01-01

    Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…

  20. RisaAligner software for aligning fluorescence data between Agilent 2100 Bioanalyzer chips: Application to soil microbial community analysis.

    PubMed

    Navarro, Elisabeth; Fabrègue, Olivier; Scorretti, Riccardo; Reboulet, Jérémy; Simonet, Pascal; Dawson, Lorna; Demanèche, Sandrine

    2015-12-01

    Ribosomal Intergenic Spacer Analysis (RISA) is a high-resolution and highly reproducible fingerprinting technique for discriminating between microbial communities. The community profiles can be visualized using the Agilent 2100 Bioanalyzer. Comparison between fingerprints relies upon precise estimation of all amplified DNA fragment lengths; however, size standard computation can vary between gel runs. For complex samples such as soil microbial communities, discrimination by fragment size is not always sufficient. In such cases, the comparison of whole fluorescence data as a function of time (electrophoregrams) is more appropriate. When electrophoregrams [fluorescence = f (time)] are used, and more than one chip is involved, electrophoregram comparisons are challenging due to experimental variations between chips and the lack of correction by the Agilent software in such situations. Here we present RisaAligner software for analyzing and comparing electrophoregrams from Agilent chips using a nonlinear ladder-alignment algorithm. We demonstrate the robustness and substantial improvement of data analysis by analyzing soil microbial profiles obtained with Agilent DNA 1000 and High Sensitivity chips. PMID:26651514
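
    The idea behind ladder alignment, mapping each chip's migration times onto a common scale using the known ladder peaks so that whole electrophoregrams become comparable across chips, can be sketched as follows. This piecewise-linear version is only illustrative and is deliberately simpler than the nonlinear algorithm implemented in RisaAligner; the peak positions are made up.

        import numpy as np

        def align_to_reference(sample_times, ladder_times, reference_ladder_times):
            """Map one chip's time axis onto a reference chip's axis via matched ladder peaks."""
            return np.interp(sample_times,
                             np.asarray(ladder_times, dtype=float),
                             np.asarray(reference_ladder_times, dtype=float))

        chip_a_ladder = [22.0, 35.0, 51.0, 68.0]   # reference chip ladder peak times (s)
        chip_b_ladder = [23.5, 36.2, 53.1, 70.4]   # same ladder as seen on the chip to align
        chip_b_peaks = [30.0, 45.0, 60.0]          # sample peaks detected on chip B
        print(align_to_reference(chip_b_peaks, chip_b_ladder, chip_a_ladder))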

  1. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  2. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2016-01-01

    To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASA's critical software (SW) systems as the software industry rapidly transitions from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA), established the Agile Benchmarking Team (ABT). The Team's tasks were to: (1) research background literature on current Agile processes; (2) perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices; (3) collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems; and (4) suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.

  3. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  4. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  5. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-09-01

    This paper describes a product realization process developed and demonstrated at Sandia by the A-PRIMED (Agile Product Realization for Innovative Electro MEchanical Devices) project that integrates many of the key components of "agile manufacturing" into a complete, design-to-production process. Evidence indicates that the process has reduced the product realization cycle and assured product quality. Products included discriminators for a robotic quick-change adapter and for an electronic defense system. These discriminators, built using A-PRIMED, met random vibration requirements and had life cycles that far surpassed the performance obtained from earlier efforts.

  6. Developments in Agile Manufacturing

    SciTech Connect

    Clinesmith, M.G.

    1993-09-01

    As part of a project design initiative, Sandia National Laboratories and AlliedSignal Inc. Kansas City Division have joined efforts to develop a concurrent engineering capability for the manufacturing of complex precision components. The primary effort of this project, called Agile Manufacturing, is directed toward: (1) Understand the error associated with manufacturing and inspection. (2) Develop methods for correcting error. (3) Integrate diverse software technologies into a compatible process. The Agile Manufacturing System (AMS) is a system that integrates product design, manufacturing, and inspection into a closed loop, concurrent engineering process. The goal of developing the Agile Manufacturing System is to: (1) Optimize accuracy in manufacturing and inspection. (A) Use of softgage software for product evaluation. This will ensure ANSI Y14.5 compliance. (B) Establish and monitor bias between CMM and machine center. (C) Map probe deflection error and apply correction to inspection results. This applies to both on machine probing and CMM inspections. (D) Inspection process. (2) Compress the cycle time from product concept to production level manufacturing and verification. (3) Create a self-correcting process that feeds inspection results back into the machining process. (4) Link subordinate processes (cutting/probing path, softgage model, etc.) to the solid model definition.

  7. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL), a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules and processes for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method and the current status of its successful incorporation into current JPL development policies and processes.

  8. Computer Software for Process Control.

    ERIC Educational Resources Information Center

    Spector, Alfred Z.

    1984-01-01

    Computer software for process control has the primary function of communicating with and governing physical devices. The structure of such software, process-control systems, multitask systems, message passing, problems of deadlock, distributed computer systems, and protection against failure in process-control systems are among the areas examined.…

  9. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
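
    A minimal way to picture goal (1), tracking a data set from acceptance to publication, is a small state machine; the states and transitions below are a hypothetical simplification, not the ORNL DAAC workflow itself.

        from enum import Enum, auto

        class IngestState(Enum):
            ACCEPTED = auto()
            FORMAT_CHECK = auto()
            DOCUMENTATION_REVIEW = auto()
            ARCHIVED = auto()
            PUBLISHED = auto()

        # Allowed forward transitions in this hypothetical ingest workflow.
        TRANSITIONS = {
            IngestState.ACCEPTED: {IngestState.FORMAT_CHECK},
            IngestState.FORMAT_CHECK: {IngestState.DOCUMENTATION_REVIEW},
            IngestState.DOCUMENTATION_REVIEW: {IngestState.ARCHIVED},
            IngestState.ARCHIVED: {IngestState.PUBLISHED},
            IngestState.PUBLISHED: set(),
        }

        class DataSet:
            def __init__(self, name):
                self.name = name
                self.state = IngestState.ACCEPTED

            def advance(self, new_state):
                if new_state not in TRANSITIONS[self.state]:
                    raise ValueError(f"cannot move {self.name} from {self.state.name} to {new_state.name}")
                self.state = new_state

        ds = DataSet("soil_respiration_2015")
        ds.advance(IngestState.FORMAT_CHECK)
        print(ds.state.name)  # FORMAT_CHECK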

  10. An agile implementation of SCRUM

    NASA Astrophysics Data System (ADS)

    Gannon, Michele

    Is Agile a way to cut corners? To some, the use of an Agile Software Development Methodology has a negative connotation - “Oh, you're just not producing any documentation.” So can a team with no experience in Agile successfully implement and use SCRUM?

  11. FAST - A Framework for Agile Software Testing v. 2.0

    2009-03-25

    The FAST software package contains a variety of Python packages for applying and managing software tests. In version 2.0, FAST includes (1) the EXACT package, which supports the definition and execution of computational experiments, (2) the FAST package, which manages the distributed execution of software builds, and (3) general tools related to the PyUnit testing framework.

  12. Choosing Software for Text Processing.

    ERIC Educational Resources Information Center

    Mason, Robert M.

    1983-01-01

    Review of text processing software for microcomputers covers data entry, text editing, document formatting, and spelling and proofreading programs including "Wordstar,""PeachText,""PerfectWriter,""Select," and "The Word Plus.""The Whole Earth Software Catalog" and a new terminal to be manufactured for OCLC by IBM are mentioned. (EJS)

  13. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; Crean, Kathleen A.; Rinker, George C.; Smith, Thomas P.; Lum, Karen T.; Hanna, Robert A.; Erickson, Daniel E.; Gamble, Edward B., Jr.; Morgan, Scott C.; Kelsay, Michael G.; Newport, Brian J.; Lewicki, Scott A.; Stipanuk, Jeane G.; Cooper, Tonja M.; Meshkat, Leila

    2011-01-01

A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  14. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  15. A software surety analysis process

    SciTech Connect

    Trauth, S.; Tempel, P.

    1995-11-01

As part of the High Consequence System Surety project, this work was undertaken to explore one approach to conducting a surety theme analysis for a software-driven system. Originally, plans were to develop a theoretical approach to the analysis, and then to validate and refine this process by applying it to the software being developed for the Weight and Leak Check System (WALS), an automated nuclear weapon component handling system. As with the development of the higher-level High Consequence System Surety Process, this work was not completed due to changes in funding levels. This document describes the software analysis process, discusses its application in a software environment, and outlines next steps that could be taken to further develop and apply the approach to projects.

  16. Customer Communication Challenges and Solutions in Globally Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Korkala, Mikko

Working in the globally distributed market is one of the key trends among software organizations all over the world [1-5]. Several factors have contributed to the growth of distributed software development; time-zone-independent "follow the sun" development, access to well-educated labour, maturation of the technical infrastructure, and reduced costs are some of the most commonly cited benefits of distributed development [3, 6-8]. Furthermore, customers are often located in different countries because of the companies' internationalization purposes or good market opportunities.

  17. Are we unnecessarily constraining the agility of complex process-based models?

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo A.; Clark, Martyn P.; Barlage, Michael; Rajagopalan, Balaji; Samaniego, Luis; Abramowitz, Gab; Gupta, Hoshin

    2015-01-01

    In this commentary we suggest that hydrologists and land-surface modelers may be unnecessarily constraining the behavioral agility of very complex physics-based models. We argue that the relatively poor performance of such models can occur due to restrictions on their ability to refine their portrayal of physical processes, in part because of strong a priori constraints in: (i) the representation of spatial variability and hydrologic connectivity, (ii) the choice of model parameterizations, and (iii) the choice of model parameter values. We provide a specific example of problems associated with strong a priori constraints on parameters in a land surface model. Moving forward, we assert that improving hydrological models requires integrating the strengths of the "physics-based" modeling philosophy (which relies on prior knowledge of hydrologic processes) with the strengths of the "conceptual" modeling philosophy (which relies on data driven inference). Such integration will accelerate progress on methods to define and discriminate among competing modeling options, which should be ideally incorporated in agile modeling frameworks and tested through a diagnostic evaluation approach.

  18. Introduction to Stand-up Meetings in Agile Methods

    NASA Astrophysics Data System (ADS)

    Hasnain, Eisha; Hall, Tracy

    2009-05-01

In recent years, agile methods have become more popular in the software industry. Agile methods are a new approach compared to plan-driven approaches. One of the most important shifts in adopting an agile approach is the central focus given to people in the process. This is exemplified by the independence afforded to developers in the development work they do. This work investigates practitioners' opinions about daily stand-up meetings in agile methods and the role of the developer in them. For our investigation we joined a Yahoo group called "Extreme Programming". Our investigation suggests that although trust is an important factor in agile methods, stand-ups are not the place to build it.

  19. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  20. Bringing Agility to Business Process Management: Rules Deployment in an SOA

    NASA Astrophysics Data System (ADS)

    El Kharbili, Marwane; Keil, Tobias

Business process management (BPM) has emerged as a paradigm for integrating business strategies and enterprise architecture (EA). In this context, BPM implementation on top of web-service-based service-oriented architectures is an accepted approach, as shown by a great amount of literature. One concern in this regard is how to make BPs reactive to change. Our approach to the problem is the integration of business rule management (BRM) and BPM by allowing decisions hard-coded in BPs to be modeled as separate business rules (BRs). These BRs become EA assets and need to be exploited when executing BPs. We motivate why BPM needs agility and discuss what requirements this poses on BPM. This paper presents prototyping work conducted at a BP modeling and analysis vendor which seeks to showcase how using business rule management (BRM) as a means for modeling decisions can help achieve the much-sought-after agility in BPM. This prototype relies on the integrated modeling of business rules (BRs) and BPs, and on rule deployment as web services within an SOA.
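    A minimal sketch of the core idea, assuming a toy in-process rule registry rather than the web-service deployment the prototype uses; the rule name, order fields, and threshold are invented for illustration:

```python
# Sketch of factoring a hard-coded decision out of a business process into a
# separately registered (and, in an SOA, separately deployable) business rule.
from typing import Callable, Dict

RULES: Dict[str, Callable[[dict], bool]] = {}


def rule(name: str):
    """Register a business rule so it can be replaced without editing the process."""
    def register(fn: Callable[[dict], bool]) -> Callable[[dict], bool]:
        RULES[name] = fn
        return fn
    return register


@rule("approve_discount")
def approve_discount(order: dict) -> bool:
    # Hypothetical decision logic; in the prototype this would live behind a web service.
    return order["value"] > 10_000 and order["customer_tier"] == "gold"


def handle_order(order: dict) -> str:
    # The process only references the rule by name, keeping the BP agile.
    return "discount" if RULES["approve_discount"](order) else "standard"


print(handle_order({"value": 12_500, "customer_tier": "gold"}))
```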

  1. Micro-milling process improvement using an agile pulse-shaping fiber laser

    NASA Astrophysics Data System (ADS)

    Gay, David; Cournoyer, Alain; Deladurantaye, Pascal; Briand, Martin; Roy, Vincent; Labranche, Bruno; Levesque, Marc; Taillon, Y.

    2009-06-01

We demonstrate the usefulness of INO's pulse-shaping fiber laser platform to rapidly develop complex laser micromachining processes. The versatility of such laser sources allows for straightforward control of the emitted energy envelope on the nanosecond timescale to create multi-amplitude-level pulses and/or multi-pulse regimes. The pulses are amplified in an amplifier chain in a MOPA configuration that delivers output energy per pulse up to 60 μJ at 1064 nm at a repetition rate of 200 kHz with excellent beam quality (M2 < 1.1) and narrow line widths suitable for efficient frequency conversion. Also, their pulse-on-demand and pulse-to-pulse shape selection capability at high repetition rates makes these agile laser sources suitable for the implementation of high-throughput complex laser processing. Micro-milling experiments were carried out on two metals, aluminum and stainless steel, having very different thermal properties. For aluminum, our results show that the material removal efficiency depends strongly on the pulse shape, especially near the ablation threshold, and can be maximized to develop efficient laser micro-milling processes. However, material removal efficiency is not always correlated with good surface quality; the roughness of the milled surface can be improved by removing a few layers of material using another type of pulse shape. The agility of INO's fiber laser enables the implementation of a fast laser process including two steps employing different pulse characteristics for maximizing the material removal rate and obtaining a good surface quality at the same time. A comparison of material removal efficiency with stainless steel, well known to be difficult to mill on the micron scale, is also presented.

  2. Software handlers for process interfaces

    NASA Technical Reports Server (NTRS)

    Bercaw, R. W.

    1976-01-01

    Process interfaces are developed in an effort to reduce the time, effort, and money required to install computer systems. Probably the chief obstacle to the achievement of these goals lies in the problem of developing software handlers having the same degree of generality and modularity as the hardware. The problem of combining the advantages of modular instrumentation with those of modern multitask operating systems has not been completely solved, but there are a number of promising developments. The essential principles involved are considered.

  3. A process for the agile product realization of electro-mechanical devices

    SciTech Connect

    Forsythe, C.; Diegert, K.V.; Ashby, M.R.; Parratt, S.W.; Benavides, G.L.; Jones, R.E.; Longcope, D.B.

    1995-08-01

This paper describes a product realization process developed at Sandia National Laboratories by the A-PRIMED project that integrates many of the key components of "agile manufacturing" into a complete, step-by-step, design-to-production process. For three separate product realization efforts, each geared to a different set of requirements, A-PRIMED demonstrated product realization of a custom device in less than a month. A-PRIMED used a discriminator (a precision electro-mechanical device) as the demonstration device, but the process is readily adaptable to other electro-mechanical products. The process begins with a qualified design parameter space. From that point, the product realization process encompasses all facets of requirements development, analysis and testing, design, manufacturing, robotic assembly and quality assurance, as well as product data management and concurrent engineering. In developing the product realization process, A-PRIMED employed an iterative approach whereby after each of three builds, the process was reviewed and refinements made on the basis of lessons learned. This paper describes the integration of project functions and product realization technologies, with references to reports detailing specific facets of the overall process. The process described herein represents the outcome of an empirically based process development effort that, on repeated iterations, was proven successful.

  4. Collaborative business processes for enhancing partnerships among software services providers

    NASA Astrophysics Data System (ADS)

    Heil Cancian, Maiara; Rabelo, Ricardo; Gresse von Wangenheim, Christiane

    2015-08-01

Software services have represented a powerful view to support the realisation of the service-oriented architecture (SOA) paradigm. Using open standards and facilitating systems projects, they have increasingly been used as a corporate architectural approach to create interoperable services-based software solutions that can more easily be reused and shared across disparate applications. In the context of software companies, most of them are small firms that have enormous difficulty keeping competitive. One strategy to enhance their sustainability is to enlarge partnerships among them at a more valuable level by jointly offering (web) services-based solutions. However, their culture of collaboration is low, and partnerships are usually made with the same companies and only sporadically. This article presents an approach to support a more intense collaboration among software companies to respond to business opportunities in a more agile way, joining capacities and capabilities which they would not have if they worked alone. This requires, however, some preparedness. From the perspective of business processes, they should understand how to carry out a collaboration more properly. This is essentially what this article is about. It presents a comprehensive list of collaborative business processes and base practices that can also act as a guide for service providers' managers to implement and manage the collaboration along its lifecycle. Processes have been validated and results are discussed.

  5. A process for the agile product realization of electromechanical devices (A-primed)

    SciTech Connect

    Forsythe, C.; Ashby, M.R.; Benavides, G.L.; Diegert, K.V.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1996-02-01

This paper describes a product realization process developed at Sandia National Laboratories by the A-PRIMED project that integrates many of the key components of "agile manufacturing" (Nagel & Dove, 1992) into a complete, step-by-step, design-to-production process. For two separate product realization efforts, each geared to a different set of requirements, A-PRIMED demonstrated product realization of a custom device in less than a month. A-PRIMED used a discriminator (a precision electro-mechanical device) as the demonstration device, but the process is readily adaptable to other electro-mechanical products. The process begins with a qualified design parameter space (Diegert et al., 1995). From that point, the product realization process encompasses all facets of requirements development, analysis and testing, design, manufacturing, robot assembly and quality assurance, as well as product data management and concurrent engineering. In developing the product realization process, A-PRIMED employed an iterative approach whereby after each build, the process was reviewed and refinements were made on the basis of lessons learned. This paper describes the integration of project functions and product realization technologies to develop a product realization process that, on repeated iterations, was proven successful.

  6. Tools for Supporting Distributed Agile Project Planning

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Maurer, Frank; Morgan, Robert; Oliveira, Josyleuda

Agile project planning plays an important part in agile software development. In distributed settings, project planning is severely impacted by the lack of face-to-face communication and the inability to share paper index cards amongst all meeting participants. To address these issues, several distributed agile planning tools were developed. The tools vary in features, functions and running platforms. In this chapter, we first summarize the requirements for distributed agile planning. Then we give an overview of existing agile planning tools. We also evaluate existing tools based on tool requirements. Finally, we present some practical advice for both designers and users of distributed agile planning tools.

  7. What Does an Agile Coach Do?

    NASA Astrophysics Data System (ADS)

    Davies, Rachel; Pullicino, James

The surge in Agile adoption has created a demand for project managers who coach their teams rather than direct them. A sign of this trend is the ever-increasing number of people getting certified as scrum masters and agile leaders. Training courses that introduce agile practices are easy to find. But making the transition to coach is not as simple as understanding what agile practices are. Your challenge as an Agile Coach is to support your team in learning how to wield their new Agile tools in creating great software.

  8. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  9. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable, standards-based documentation ecosystem that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation ecosystem and is currently supporting the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative-based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying APIs; * Conducts code reviews.

  10. Software process improvement in the NASA software engineering laboratory

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  11. An Investigation of Agility Issues in Scrum Teams Using Agility Indicators

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Wang, Xiaofeng

Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods remain to be explored systematically. Built upon a previous study that established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods. It focuses particularly on Scrum teams. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and therefore to further validate previously presented agility indicators within the additional cases. A multiple case study research method is employed. The findings of the study reveal that teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability and embraced uncertainty. The possible reasons include a previous organizational plan-driven culture, resistance towards the Scrum roles and changing resources.

  12. Managing the Software Development Process

    NASA Astrophysics Data System (ADS)

    Lubelczyk, J.; Parra, A.

The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We will also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper, so various references are cited with a goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  13. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We will also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. The depth and breadth of software engineering exceed the scope of this paper, so various references are cited with a goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  14. Reliable software and communication 2: Controlling the software development process

    NASA Astrophysics Data System (ADS)

    Dalal, Siddhartha R.; Horgan, Joseph R.; Kettenring, Jon R.

    1994-01-01

    The software created by industrial, educational, and research organizations is increasingly large and complex. It also occupies a central role in the reliability and safety of many essential services. We examine the software development process and suggest opportunities for improving the process by using a combination of statistical and other process control techniques. Data, analysis of data, and tools for collecting data are crucial to our approach. Although our views are based upon experiences with large telecommunications systems, they are likely to be useful to many other developers of large software systems.

  15. A Matrix Approach to Software Process Definition

    NASA Technical Reports Server (NTRS)

    Schultz, David; Bachman, Judith; Landis, Linda; Stark, Mike; Godfrey, Sally; Morisio, Maurizio; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The Software Engineering Laboratory (SEL) is currently engaged in a Methodology and Metrics program for the Information Systems Center (ISC) at Goddard Space Flight Center (GSFC). This paper addresses the Methodology portion of the program. The purpose of the Methodology effort is to assist a software team lead in selecting and tailoring a software development or maintenance process for a specific GSFC project. It is intended that this process will also be compliant with both ISO 9001 and the Software Engineering Institute's Capability Maturity Model (CMM). Under the Methodology program, we have defined four standard ISO-compliant software processes for the ISC, and three tailoring criteria that team leads can use to categorize their projects. The team lead would select a process and appropriate tailoring factors, from which a software process tailored to the specific project could be generated. Our objective in the Methodology program is to present software process information in a structured fashion, to make it easy for a team lead to characterize the type of software engineering to be performed, and to apply tailoring parameters to search for an appropriate software process description. This will enable the team lead to follow a proven, effective software process and also satisfy NASA's requirement for compliance with ISO 9001 and the anticipated requirement for CMM assessment. This work is also intended to support the deployment of sound software processes across the ISC.
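    A toy illustration of the tailoring-matrix idea described above, assuming two hypothetical tailoring criteria (software class and team size) mapped to invented process names; it is not the SEL/ISC matrix itself:

```python
# Hypothetical tailoring matrix: criteria values select one of a few
# standard process descriptions (names are invented for illustration).
PROCESS_MATRIX = {
    ("flight", "large"): "Full ISO-compliant development process",
    ("flight", "small"): "Streamlined development process",
    ("ground", "large"): "Full maintenance process",
    ("ground", "small"): "Lightweight maintenance process",
}


def select_process(software_class: str, team_size: int) -> str:
    """Map a project's tailoring criteria to a standard process description."""
    size_band = "large" if team_size > 10 else "small"
    return PROCESS_MATRIX[(software_class, size_band)]


print(select_process("flight", 25))   # Full ISO-compliant development process
print(select_process("ground", 6))    # Lightweight maintenance process
```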

  16. Agile automated vision

    NASA Astrophysics Data System (ADS)

    Fandrich, Juergen; Schmitt, Lorenz A.

    1994-11-01

The microelectronic industry is a protagonist in driving automated vision to new paradigms. Today semiconductor manufacturers use vision systems quite frequently in their fabs in the front-end process. In fact, the process depends on reliable image processing systems. In the back-end process, where ICs are assembled and packaged, vision systems are today only partly used. But in the next years automated vision will become compulsory for the back-end process as well. Vision will be fully integrated into every IC package production machine to increase yields and reduce costs. Modern high-speed material processing requires dedicated and efficient concepts in image processing. But the integration of various equipment in a production plant leads to unifying handling of data flow and interfaces. Only agile vision systems can cope with these contradictions: fast, reliable, adaptable, scalable and comprehensive. A powerful hardware platform is an indispensable requirement for the use of advanced and reliable, but unfortunately computing-intensive, image processing algorithms. The massively parallel SIMD hardware product LANTERN/VME supplies a powerful platform for existing and new functionality. LANTERN/VME is used with a new optical sensor for IC package lead inspection. This is done in 3D, including horizontal and coplanarity inspection. The appropriate software is designed for lead inspection, alignment and control tasks in IC package production and handling equipment, like Trim&Form, Tape&Reel and Pick&Place machines.

  17. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
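    A very small system-dynamics-style feedback loop in the spirit of such models (an illustrative sketch only, not the actual SEPS equations); the task counts, productivity, and rework fraction are invented parameters:

```python
# Toy feedback loop: work flows from "remaining" to "done", while a fraction
# of completed work is reworked, which stretches the schedule.
def simulate(total_tasks=100.0, staff=5.0, productivity=0.8,
             rework_fraction=0.15, dt=1.0, max_weeks=200):
    remaining, done, week = total_tasks, 0.0, 0
    while remaining > 0.5 and week < max_weeks:
        completed = min(remaining, staff * productivity * dt)
        reworked = completed * rework_fraction   # feedback: defects create new work
        remaining += reworked - completed
        done += completed - reworked
        week += 1
    return week


print("Estimated schedule (weeks):", simulate())
print("With higher rework:", simulate(rework_fraction=0.3))
```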

  18. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five-stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  19. Overview of the software inspection process

    SciTech Connect

    Lane, G.L.; Dabbs, R.

    1997-11-01

    This tutorial introduces attendees to the Inspection Process and teaches them how to organize and participate in a software inspection. The tutorial advocates the benefits of inspections and encourages attendees to socialize the inspection process in their organizations.

  20. Towards a Better Understanding of CMMI and Agile Integration - Multiple Case Study of Four Companies

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna

The amount of software is increasing across different domains in Europe. This provides industries in smaller countries with good opportunities to work in international markets. Success in global markets, however, demands the rapid production of high-quality, error-free software. Both CMMI and agile methods seem to provide a ready solution for quality and lead time improvements. There is not, however, much empirical evidence available about either 1) how the integration of these two aspects can be done in practice or 2) what it actually demands from assessors and software process improvement groups. The goal of this paper is to increase the understanding of CMMI and agile integration, in particular focusing on the research question: how to use a ‘lightweight’ style of CMMI assessments in agile contexts. This is done via four case studies in which assessments were conducted using the goals of the CMMI integrated project management and collaboration and coordination with relevant stakeholders process areas, together with practices from XP and Scrum. The study shows that the use of agile practices may support the fulfilment of the goals of CMMI process areas, but there are still many challenges for agile teams to solve within their continuous improvement programs. It also identifies practical advice for assessors and improvement groups to take into consideration when conducting assessments in the context of agile software development.

  1. Agile manufacturing prototyping system (AMPS)

    SciTech Connect

    Garcia, P.

    1998-05-09

    The Agile Manufacturing Prototyping System (AMPS) is being integrated at Sandia National Laboratories. AMPS consists of state of the industry flexible manufacturing hardware and software enhanced with Sandia advancements in sensor and model based control; automated programming, assembly and task planning; flexible fixturing; and automated reconfiguration technology. AMPS is focused on the agile production of complex electromechanical parts. It currently includes 7 robots (4 Adept One, 2 Adept 505, 1 Staubli RX90), conveyance equipment, and a collection of process equipment to form a flexible production line capable of assembling a wide range of electromechanical products. This system became operational in September 1995. Additional smart manufacturing processes will be integrated in the future. An automated spray cleaning workcell capable of handling alcohol and similar solvents was added in 1996 as well as parts cleaning and encapsulation equipment, automated deburring, and automated vision inspection stations. Plans for 1997 and out years include adding manufacturing processes for the rapid prototyping of electronic components such as soldering, paste dispensing and pick-and-place hardware.

  2. Impact of Growing Business on Software Processes

    NASA Astrophysics Data System (ADS)

    Nikitina, Natalja; Kajko-Mattsson, Mira

When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for starting to think about and improve software processes, the organization lacked guidelines for aligning process improvement with business growth. Finally, the paper generates research questions providing a platform for future research.

  3. Computer-aided software development process design

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Levary, Reuven R.

    1989-01-01

The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software LIfe-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.

  4. Buyer Beware: Managing the Software Selection Process.

    ERIC Educational Resources Information Center

    Carse, James W.

    1983-01-01

    Purchased application software has created a new set of problems for project managers charged with selection and implementation responsibilities. Eight activity phases in the software selection process are identified. Development and utilization of this approach at the University of Houston are described. (Author/MLW)

  5. The Personal Software Process: Downscaling the factory

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.

    1994-01-01

    It is argued that the next wave of software process improvement (SPI) activities will be based on a people-centered paradigm. The most promising such paradigm, Watts Humphrey's personal software process (PSP), is summarized and its advantages are listed. The concepts of the PSP are shown also to fit a down-scaled version of Basili's experience factory. The author's data and lessons learned while practicing the PSP are presented along with personal experience, observations, and advice from the perspective of a consultant and teacher for the personal software process.

  6. Software engineering processes for Class D missions

    NASA Astrophysics Data System (ADS)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  7. An Agile Course-Delivery Approach

    ERIC Educational Resources Information Center

    Capellan, Mirkeya

    2009-01-01

    In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…

  8. Onshore and Offshore Outsourcing with Agility: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Kussmaul, Clifton

This chapter reflects on a case study of an agile distributed project that ran for approximately three years (from spring 2003 to spring 2006). The project involved (a) a customer organization with key personnel distributed across the US, developing an application with rapidly changing requirements; (b) onshore consultants with expertise in project management, development processes, offshoring, and relevant technologies; and (c) an external offsite development team in a CMM-5 organization in southern India. This chapter is based on surveys and discussions with multiple participants. The several years since the project was completed allow greater perspective on both the strengths and weaknesses, since the participants can reflect on the entire life of the project, and compare it to subsequent experiences. Our findings emphasize the potential for agile project management in distributed software development, and the importance of people and interactions, taking many small steps to find and correct errors, and matching the structures of the project and product to support implementation of agility.

  9. Development Process for Science Operation Software

    NASA Astrophysics Data System (ADS)

    Ballester, Pascal

    2015-12-01

    Scientific software development at ESO involves defined processes for the main phases of project inception, monitoring of development performed by instrument consortia, application maintenance, and application support. We discuss the lessons learnt and evolution of the process for the next generation of tools and observing facilities.

  10. Agent-based scheduling system to achieve agility

    NASA Astrophysics Data System (ADS)

    Akbulut, Muhtar B.; Kamarthi, Sagar V.

    2000-12-01

Today's competitive enterprises need to design, develop, and manufacture their products rapidly and inexpensively. Agile manufacturing has emerged as a new paradigm to meet these challenges. Agility requires, among many other things, scheduling and control software systems that are flexible, robust, and adaptive. In this paper a new agent-based scheduling system (ABSS) is developed to meet the challenges of an agile manufacturing system. In ABSS, unlike in traditional approaches, information and decision making capabilities are distributed among the system entities called agents. In contrast with most agent-based scheduling systems, which commonly use a bidding approach, ABSS employs a global performance monitoring strategy. A production-rate-based global performance metric which effectively assesses the system performance is developed to assist the agents' decision making process. To test the architecture, agent-based discrete event simulation software was developed. The experiments performed using the simulation software yielded encouraging results supporting the applicability of agent-based systems to address the scheduling and control needs of an agile manufacturing system.
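    A toy sketch of the shared-metric idea only (not the ABSS design or code): each job agent in turn picks the machine that keeps a production-rate metric, taken here as jobs finished per unit of busiest-machine time, as high as possible; the jobs, machines, and processing times are invented:

```python
# Each job agent greedily chooses the machine that maximizes a shared,
# production-rate-based performance metric (illustrative data only).
JOBS = {"J1": {"M1": 4, "M2": 6}, "J2": {"M1": 3, "M2": 2}, "J3": {"M1": 5, "M2": 5}}


def metric(loads, jobs_done):
    """Global metric: jobs finished per unit of the busiest machine's time."""
    makespan = max(loads.values())
    return jobs_done / makespan if makespan else float("inf")


loads = {"M1": 0.0, "M2": 0.0}
assignment = {}
for job, times in JOBS.items():
    best_machine = max(
        times,
        key=lambda m: metric({**loads, m: loads[m] + times[m]}, len(assignment) + 1),
    )
    loads[best_machine] += times[best_machine]
    assignment[job] = best_machine

print(assignment, loads)
```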

  11. Software Process Assurance for Complex Electronics (SPACE)

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

Complex Electronics (CE) are now programmed to perform tasks that were previously handled in software, such as communication protocols. Many of the methods used to develop software bear a close resemblance to CE development. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that looks at using standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques can be used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that will be more easily maintained, consistent and configurable based on the device used.

  12. Software Process Assurance for Complex Electronics

    NASA Technical Reports Server (NTRS)

    Plastow, Richard A.

    2007-01-01

Complex Electronics (CE) now perform tasks that were previously handled in software, such as communication protocols. Many methods used to develop software bear a close resemblance to CE development. Field Programmable Gate Arrays (FPGAs) can have over a million logic gates while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of software-like bugs such as incorrect design, logic, and unexpected interactions within the logic is great. With CE devices obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized with slight modifications in the development of these devices. Software Process Assurance for Complex Electronics (SPACE) is a research project that used standardized S/W Assurance/Engineering practices to provide an assurance framework for development activities. Tools such as checklists, best practices and techniques were used to detect missing requirements and bugs earlier in the development cycle, creating a development process for CE that was more easily maintained, consistent and configurable based on the device used.

  13. Simulation Modeling of Software Development Processes

    NASA Technical Reports Server (NTRS)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri Net and an object-oriented, top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
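    A tiny timed-Petri-net-flavoured sketch of the idea (illustrative only, not the cited model): places hold tokens representing work products, and transitions consume a token, take a fixed simulated duration, and then deposit a token downstream; the stage names and durations are invented:

```python
import heapq

# places: tokens (work products) waiting at each development stage (hypothetical)
places = {"specified": 3, "designed": 0, "coded": 0}

# transitions: input place -> output place, with a firing duration in days
transitions = {"design": ("specified", "designed", 5.0),
               "code": ("designed", "coded", 8.0)}

clock, in_flight = 0.0, []           # in_flight: (completion time, output place)
while True:
    # fire every enabled transition once (a deliberately simplified firing rule)
    for src, dst, duration in transitions.values():
        if places[src] > 0:
            places[src] -= 1
            heapq.heappush(in_flight, (clock + duration, dst))
    if not in_flight:
        break                         # nothing running and nothing enabled
    clock, dst = heapq.heappop(in_flight)
    places[dst] += 1                  # deliver the completed work product

print(f"All work products coded by day {clock}: {places}")
```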

  14. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  15. DOCLIB: a software library for document processing

    NASA Astrophysics Data System (ADS)

    Jaeger, Stefan; Zhu, Guangyu; Doermann, David; Chen, Kevin; Sampat, Summit

    2006-01-01

    Most researchers would agree that research in the field of document processing can benefit tremendously from a common software library through which institutions are able to develop and share research-related software and applications across academic, business, and government domains. However, despite several attempts in the past, the research community still lacks a widely-accepted standard software library for document processing. This paper describes a new library called DOCLIB, which tries to overcome the drawbacks of earlier approaches. Many of DOCLIB's features are unique either in themselves or in their combination with others, e.g. the factory concept for support of different image types, the juxtaposition of image data and metadata, or the add-on mechanism. We cherish the hope that DOCLIB serves the needs of researchers better than previous approaches and will readily be accepted by a larger group of scientists.
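    A Python-flavoured illustration of the factory and add-on ideas mentioned above (DOCLIB itself is a separate library; the class, format, and field names here are hypothetical):

```python
# Sketch of a loader factory with an add-on registration mechanism, keeping
# image data and metadata side by side (not DOCLIB's actual API).
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class DocImage:
    width: int
    height: int
    metadata: dict            # image data and metadata kept together


_LOADERS: Dict[str, Callable[[str], DocImage]] = {}


def register_loader(extension: str):
    """Add-on mechanism: new image types plug in without touching core code."""
    def decorator(fn):
        _LOADERS[extension] = fn
        return fn
    return decorator


@register_loader(".tif")
def load_tiff(path: str) -> DocImage:
    # A real loader would read pixel data; here we only fill in metadata.
    return DocImage(2550, 3300, {"source": path, "format": "TIFF"})


def load(path: str) -> DocImage:
    for ext, loader in _LOADERS.items():
        if path.endswith(ext):
            return loader(path)
    raise ValueError(f"No loader registered for {path}")


print(load("scan_0001.tif"))
```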

  16. Software for MR image overlay guided needle insertions: the clinical translation process

    NASA Astrophysics Data System (ADS)

    Ungi, Tamas; U-Thainual, Paweena; Fritz, Jan; Iordachita, Iulian I.; Flammang, Aaron J.; Carrino, John A.; Fichtinger, Gabor

    2013-03-01

PURPOSE: Needle guidance software using augmented reality image overlay was translated from the experimental phase to support preclinical and clinical studies. Major functional and structural changes were needed to meet clinical requirements. We present the process applied to fulfill these requirements, and selected features that may be applied in the translational phase of other image-guided surgical navigation systems. METHODS: We used an agile software development process for rapid adaptation to unforeseen clinical requests. The process is based on iterations of operating room test sessions, feedback discussions, and software development sprints. The open-source application framework of 3D Slicer and the NA-MIC kit provided sufficient flexibility and stable software foundations for this work. RESULTS: All requirements were addressed in a process with 19 operating room test iterations. Most features developed in this phase were related to workflow simplification and operator feedback. CONCLUSION: Efficient and affordable modifications were facilitated by an open source application framework and frequent clinical feedback sessions. Results of cadaver experiments show that the software requirements were successfully met after a limited number of operating room tests.

  17. The Telemetry Agile Manufacturing Effort

    SciTech Connect

    Brown, K.D.

    1995-01-01

    The Telemetry Agile Manufacturing Effort (TAME) is an agile enterprising demonstration sponsored by the US Department of Energy (DOE). The project experimented with new approaches to product realization and assessed their impacts on performance, cost, flow time, and agility. The purpose of the project was to design the electrical and mechanical features of an integrated telemetry processor, establish the manufacturing processes, and produce an initial production lot of two to six units. This paper outlines the major methodologies utilized by the TAME, describes the accomplishments that can be attributed to each methodology, and finally, examines the lessons learned and explores the opportunities for improvement associated with the overall effort. The areas for improvement are discussed relative to an ideal vision of the future for agile enterprises. By the end of the experiment, the TAME reduced production flow time by approximately 50% and life cycle cost by more than 30%. Product performance was improved compared with conventional DOE production approaches.

  18. AIRS Maps from Space Processing Software

    NASA Technical Reports Server (NTRS)

    Thompson, Charles K.; Licata, Stephen J.

    2012-01-01

    This software package processes Atmospheric Infrared Sounder (AIRS) Level 2 swath standard product geophysical parameters, and generates global, colorized, annotated maps. It automatically generates daily and multi-day averaged colorized and annotated maps of various AIRS Level 2 swath geophysical parameters. It also generates AIRS input data sets for Eyes on Earth, Puffer-sphere, and Magic Planet. This program is tailored to AIRS Level 2 data products. It re-projects data into 1/4-degree grids that can be combined and averaged for any number of days. The software scales and colorizes global grids utilizing AIRS-specific color tables, and annotates images with title and color bar. This software can be tailored for use with other swath data products for the purposes of visualization.
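    A sketch of the quarter-degree gridding and multi-day averaging step described above, assuming simple lat/lon/value arrays as input; the variable names and sample values are invented and this is not the AIRS production code:

```python
# Bin swath samples into a 1/4-degree global grid; per-day grids can then be
# summed and divided to form multi-day means (illustrative sketch only).
import numpy as np

RES = 0.25
NLAT, NLON = int(180 / RES), int(360 / RES)


def grid_swath(lats, lons, values):
    """Accumulate swath samples into a 1/4-degree global grid."""
    total = np.zeros((NLAT, NLON))
    count = np.zeros((NLAT, NLON))
    i = ((np.asarray(lats) + 90.0) / RES).astype(int).clip(0, NLAT - 1)
    j = ((np.asarray(lons) + 180.0) / RES).astype(int).clip(0, NLON - 1)
    np.add.at(total, (i, j), values)
    np.add.at(count, (i, j), 1)
    return total, count


t1, c1 = grid_swath([10.1, 10.2], [20.1, 20.1], [290.0, 292.0])
t2, c2 = grid_swath([10.15], [20.12], [291.0])
multi_day_mean = np.divide(t1 + t2, c1 + c2,
                           out=np.full((NLAT, NLON), np.nan),
                           where=(c1 + c2) > 0)
print(np.nanmax(multi_day_mean))   # mean of the three samples in one grid cell
```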

  19. Human factors in agile manufacturing

    SciTech Connect

    Forsythe, C.

    1995-03-01

As industries position themselves for the competitive markets of today, and the increasingly competitive global markets of the 21st century, agility, or the ability to rapidly develop and produce new products, represents a common trend. Agility manifests itself in many different forms, with the agile manufacturing paradigm proposed by the Iacocca Institute offering a generally accepted, long-term vision. In its many forms, common elements of agility or agile manufacturing include: changes in business, engineering and production practices; seamless information flow from design through production; integration of computer and information technologies into all facets of the product development and production process; application of communications technologies to enable collaborative work between geographically dispersed product development team members; and introduction of flexible automation of production processes. Industry has rarely experienced as dramatic an infusion of new technologies or as extensive a change in culture and work practices. Human factors will not only play a vital role in accomplishing the technical and social objectives of agile manufacturing, but also have an opportunity to participate in shaping the evolution of industry paradigms for the 21st century.

  20. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process. PMID:19745339
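    A toy calculation of two of the process measures named above (fault ratio and mean throughput time), using invented case records purely for illustration; it does not reproduce the paper's measurement model:

```python
# Hypothetical process-case records: start/end times and fault counts.
from datetime import datetime, timedelta

cases = [
    {"start": datetime(2009, 1, 5, 9, 0), "end": datetime(2009, 1, 5, 9, 40), "faults": 0},
    {"start": datetime(2009, 1, 5, 9, 10), "end": datetime(2009, 1, 5, 10, 5), "faults": 1},
    {"start": datetime(2009, 1, 5, 9, 30), "end": datetime(2009, 1, 5, 10, 0), "faults": 0},
]

fault_ratio = sum(c["faults"] for c in cases) / len(cases)
throughput = sum(((c["end"] - c["start"]) for c in cases), timedelta()) / len(cases)
print(f"fault ratio = {fault_ratio:.2f}, mean throughput time = {throughput}")
```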

  1. Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges

    NASA Astrophysics Data System (ADS)

    Maruping, Likoebe M.

    Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.

  2. Product review: lucis image processing software.

    PubMed

    Johnson, J E

    1999-04-01

    Lucis is a software program that allows the manipulation of images through the process of selective contrast pattern emphasis. Using an image-processing algorithm called Differential Hysteresis Processing (DHP), Lucis extracts and highlights patterns based on variations in image intensity (luminance). The result is that details can be seen that would otherwise be hidden in deep shadow or excessive brightness. The software is contained on a single floppy disk, is easy to install on a PC, simple to use, and runs on Windows 95, Windows 98, and Windows NT operating systems. The cost is $8,500 for a license, but is estimated to save a great deal of money in photographic materials, time, and labor that would have otherwise been spent in the darkroom. Superb images are easily obtained from unstained (no lead or uranium) sections, and stored image files sent to laser printers are of publication quality. The software can be used not only for all types of microscopy, including color fluorescence light microscopy, biological and materials science electron microscopy (TEM and SEM), but will be beneficial in medicine, such as X-ray films (pending approval by the FDA), and in the arts. PMID:10206154

  3. AGILE Data Center and AGILE science highlights

    NASA Astrophysics Data System (ADS)

    Pittori, C.

    2013-06-01

AGILE is a scientific mission of the Italian Space Agency (ASI) with INFN, INAF and CIFS participation, devoted to gamma-ray astrophysics. The satellite has been in orbit since April 23rd, 2007. Gamma-ray astrophysics above 100 MeV is an exciting field of astronomical sciences that has received a strong impulse in recent years. Despite its small size and budget, AGILE produced several important scientific results, among which the unexpected discovery of strong and rapid gamma-ray flares from the Crab Nebula. This discovery won the AGILE PI and the AGILE Team the prestigious Bruno Rossi Prize for 2012, an international recognition in the field of high energy astrophysics. We present here the AGILE data center's main activities, and we give an overview of the AGILE scientific highlights after 5 years of operations.

  4. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  5. SUPRIM: easily modified image processing software.

    PubMed

    Schroeter, J P; Bretaudiere, J P

    1996-01-01

    A flexible, modular software package intended for the processing of electron microscopy images is presented. The system consists of a set of image processing tools or filters, written in the C programming language, and a command line style user interface based on the UNIX shell. The pipe and filter structure of UNIX and the availability of command files in the form of shell scripts eases the construction of complex image processing procedures from the simpler tools. Implementation of a new image processing algorithm in SUPRIM may often be performed by construction of a new shell script, using already existing tools. Currently, the package has been used for two- and three-dimensional image processing and reconstruction of macromolecules and other structures of biological interest. PMID:8742734
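
    As a rough illustration of the pipe-and-filter idea described above, the following Python sketch composes simple image "filters" the way a shell script would chain image processing tools. The filter functions and their parameters are invented for illustration; they are not the actual SUPRIM C programs or their interfaces.

    ```python
    # Illustrative pipe-and-filter composition (hypothetical filters, not the
    # actual SUPRIM tools): each filter takes an image array and returns a new
    # one, and a "script" is just an ordered list of filters.
    import numpy as np

    def normalize(img):
        """Scale pixel values into the 0..1 range."""
        img = img.astype(float)
        span = float(np.ptp(img)) or 1.0
        return (img - img.min()) / span

    def threshold(level):
        """Return a filter that zeroes pixels below `level`."""
        def apply(img):
            return np.where(img >= level, img, 0.0)
        return apply

    def run_pipeline(img, filters):
        """Apply each filter in turn, like a UNIX pipe."""
        for f in filters:
            img = f(img)
        return img

    if __name__ == "__main__":
        raw = np.random.default_rng(0).integers(0, 255, size=(64, 64))
        result = run_pipeline(raw, [normalize, threshold(0.5)])
        print(result.shape, float(result.max()))
    ```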

  6. Choosing CALL Software: Beginning the Evaluation Process.

    ERIC Educational Resources Information Center

    Bader, Melissa J.

    2000-01-01

    Synthesizes information that is available on software evaluation and provides a software evaluation checklist to help educators examine software based on linguistic and pedagogical criteria. The checklist allows educators to compare and contrast software products, enabling them to select software that is best suited to their classrooms.…

  7. The ATLAS data management software engineering process

    NASA Astrophysics Data System (ADS)

    Lassnig, M.; Garonne, V.; Stewart, G. A.; Barisits, M.; Beermann, T.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different from existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Besides the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.
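
    A minimal sketch of the test-driven style described above is given below: the test is written before the implementation, and a change is only merged once the whole suite runs green and passes peer review. The replica-counting function and its tests are invented for illustration and are not part of Rucio's real API.

    ```python
    # Test-first sketch (hypothetical helper, not Rucio's actual interface).
    import unittest

    def count_replicas(replica_map, dataset):
        """Return how many sites hold a copy of `dataset`."""
        return len(replica_map.get(dataset, set()))

    class CountReplicasTest(unittest.TestCase):
        def test_known_dataset(self):
            self.assertEqual(count_replicas({"data15": {"CERN", "BNL"}}, "data15"), 2)

        def test_unknown_dataset(self):
            self.assertEqual(count_replicas({}, "data16"), 0)

    if __name__ == "__main__":
        unittest.main()
    ```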

  8. Preparing your Offshore Organization for Agility: Experiences in India

    NASA Astrophysics Data System (ADS)

    Srinivasan, Jayakanth

    Two strategies that have significantly changed the way we conventionally think about managing software development and sustainment are the family of development approaches collectively referred to as agile methods, and the distribution of development efforts on a global scale. When you combine the two strategies, organizations have to address not only the technical challenges that arise from introducing new ways of working, but more importantly have to manage the 'soft' factors that, if ignored, lead to hard challenges. Using two case studies of distributed agile software development in India, we illustrate the areas that organizations need to be aware of when transitioning work to India. The key issues that we emphasize are the need to recruit and retain personnel; the importance of teaching, mentoring and coaching; the need to manage customer expectations; the criticality of well-articulated senior leadership vision and commitment; and the reality of operating in a heterogeneous process environment.

  9. Software Replica of Minimal Living Processes

    NASA Astrophysics Data System (ADS)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulations in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John Von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a great deal of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only the development and running of software make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela’s autopoietic cellular automata, and Ganti’s chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  10. Software replica of minimal living processes.

    PubMed

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulations in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John Von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a great deal of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only the development and running of software make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists. PMID:20204519
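
    A minimal sketch in the spirit of the simplest platform family mentioned above, a chemical reaction network, is shown below. The species names, rate constants, and update rule are invented for illustration and do not reproduce Varela's or Ganti's actual models.

    ```python
    # Toy stochastic chemical reaction network: a catalysed conversion
    # A + C -> B + C and a spontaneous decay B -> A, iterated with simple
    # probabilistic updates (all quantities are illustrative assumptions).
    import random

    def step(counts, rng):
        """One stochastic update of the toy network."""
        a, b, c = counts["A"], counts["B"], counts["C"]
        if a > 0 and rng.random() < min(1.0, 0.001 * a * c):   # A + C -> B + C
            counts["A"] -= 1
            counts["B"] += 1
        if b > 0 and rng.random() < min(1.0, 0.02 * b):        # B -> A
            counts["B"] -= 1
            counts["A"] += 1

    if __name__ == "__main__":
        rng = random.Random(42)
        counts = {"A": 200, "B": 0, "C": 10}
        for _ in range(2000):
            step(counts, rng)
        print(counts)   # the A/B split settles around a rough steady state
    ```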

  11. A software architecture for automating operations processes

    NASA Technical Reports Server (NTRS)

    Miller, Kevin J.

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a software architecture based on an integrated toolkit approach for simplifying and automating mission operations tasks. The toolkit approach is based on building adaptable, reusable graphical tools that are integrated through a combination of libraries, scripts, and system-level user interface shells. The graphical interface shells are designed to integrate and visually guide a user through the complex steps in an operations process. They provide a user with an integrated system-level picture of an overall process, defining the required inputs and possible output through interactive on-screen graphics. The OEL has developed the software for building these process-oriented graphical user interface (GUI) shells. The OEL Shell development system (OEL Shell) is an extension of JPL's Widget Creation Library (WCL). The OEL Shell system can be used to easily build user interfaces for running complex processes, applications with extensive command-line interfaces, and tool-integration tasks. The interface shells display a logical process flow using arrows and box graphics. They also allow a user to select which output products are desired and which input sources are needed, eliminating the need to know which program and its associated command-line parameters must be executed in each case. The shells have also proved valuable for use as operations training tools because of the OEL Shell hypertext help environment. The OEL toolkit approach is guided by several principles, including the use of ASCII text file interfaces with a multimission format, Perl scripts for mission-specific adaptation code, and programs that include a simple command-line interface for batch mode processing. Projects can adapt the interface shells by simple changes to the resources configuration file. This approach has allowed the development of sophisticated, automated software systems that are easy, cheap, and fast to build. This paper will
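
    The configuration-driven, process-oriented idea described above can be sketched as follows. The step format, tool names, and product names are invented stand-ins, not the actual OEL Shell resources file format; the sketch only shows how a declared process flow (inputs, outputs, tools) can drive execution without the user knowing each command line.

    ```python
    # Sketch of a configuration-driven process flow: each step names a tool
    # and the products it needs and makes; the runner picks the next step
    # whose inputs already exist (all names here are hypothetical).
    from subprocess import run

    FLOW = [
        {"tool": ["echo", "calibrate"], "needs": [],          "makes": ["cal.txt"]},
        {"tool": ["echo", "plan"],      "needs": ["cal.txt"], "makes": ["plan.txt"]},
    ]

    def execute(flow):
        """Run each step once all of its required products are available."""
        done, pending = set(), list(flow)
        while pending:
            step = next(s for s in pending if all(n in done for n in s["needs"]))
            run(step["tool"], check=True)      # stand-in for invoking the real tool
            done.update(step["makes"])
            pending.remove(step)

    if __name__ == "__main__":
        execute(FLOW)
    ```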

  12. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    SciTech Connect

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--a high-quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  13. Fuzzy Logic Enhanced Digital PIV Processing Software

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1999-01-01

    Digital Particle Image Velocimetry (DPIV) is an instantaneous, planar velocity measurement technique that is ideally suited for studying transient flow phenomena in high speed turbomachinery. DPIV is being actively used at the NASA Glenn Research Center to study both stable and unstable operating conditions in a high speed centrifugal compressor. Commercial PIV systems are readily available which provide near real time feedback of the PIV image data quality. These commercial systems are well designed to facilitate the expedient acquisition of PIV image data. However, as with any general purpose system, these commercial PIV systems do not meet all of the data processing needs required for PIV image data reduction in our compressor research program. An in-house PIV PROCessing (PIVPROC) code has been developed for reducing PIV data. The PIVPROC software incorporates fuzzy logic data validation for maximum information recovery from PIV image data. PIVPROC enables combined cross-correlation/particle tracking wherein the highest possible spatial resolution velocity measurements are obtained.
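
    The core cross-correlation step that PIV processing codes of this kind rely on can be sketched as below: two interrogation windows are correlated via the FFT and the correlation peak gives the mean particle displacement. This is only the standard correlation step, not PIVPROC itself, and the fuzzy-logic validation layer described above is not reproduced here.

    ```python
    # FFT-based cross-correlation of two interrogation windows (illustrative).
    import numpy as np

    def window_displacement(win_a, win_b):
        """Return (dy, dx) displacement that best aligns win_b with win_a."""
        a = win_a - win_a.mean()
        b = win_b - win_b.mean()
        corr = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Wrap peak coordinates into signed shifts.
        return tuple(int(p) if p <= s // 2 else int(p) - s
                     for p, s in zip(peak, corr.shape))

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        frame = rng.random((32, 32))
        shifted = np.roll(frame, shift=(3, -2), axis=(0, 1))
        print(window_displacement(shifted, frame))   # expect roughly (3, -2)
    ```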

  14. Current State of Agile User-Centered Design: A Survey

    NASA Astrophysics Data System (ADS)

    Hussain, Zahid; Slany, Wolfgang; Holzinger, Andreas

    Agile software development methods are quite popular nowadays and are being adopted at an increasing rate in the industry every year. However, these methods still lack usability awareness in their development lifecycle, and the integration of usability/User-Centered Design (UCD) into agile methods is not adequately addressed. This paper presents the preliminary results of a recently conducted online survey regarding the current state of the integration of agile methods and usability/UCD. A worldwide response from 92 practitioners was received. The results show that the majority of practitioners perceive that the integration of agile methods with usability/UCD has added value to their adopted processes and to their teams; has resulted in the improvement of usability and quality of the product developed; and has increased the satisfaction of the end-users of the product developed. The most frequently used HCI techniques are low-fidelity prototyping, conceptual designs, observational studies of users, usability expert evaluations, field studies, personas, rapid iterative testing, and laboratory usability testing.

  15. Tailoring Agility: Promiscuous Pair Story Authoring and Value Calculation

    NASA Astrophysics Data System (ADS)

    Tendon, Steve

    This chapter describes how a multi-national software organization created a business plan involving business units from eight countries that followed an agile way, after two previously failed attempts with traditional approaches. The case is told by the consultant who initiated the implementation of agility into requirements gathering, estimation and planning processes in an international setting. The agile approach was inspired by XP, but then tailored to meet the project's particular requirements. Two innovations were critical. The first innovation was promiscuous pair story authoring, where user stories were written by two people (similarly to pair programming), and the pairing changed very often (as frequently as every 15-20 minutes) to achieve promiscuity and cater for diverse points of view. The second innovation was the attribution of an economic value (and not a cost) to stories. Continuous recalculation of the financial value of the stories made it possible to assess the project's financial return. In this case, implementation of agility in the international context allowed the involved team members to reach consensus and unanimity of decisions, vision and purpose.
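
    The value-recalculation idea described above can be sketched in a few lines: each story carries an estimated economic value and an effort estimate, and re-sorting the backlog by value per unit of effort after every update keeps the plan and the projected return current. The fields and figures below are invented; the chapter does not publish the actual valuation model used in the case.

    ```python
    # Toy recalculation of story value and backlog ordering (illustrative data).
    from dataclasses import dataclass

    @dataclass
    class Story:
        title: str
        value_keur: float     # estimated economic value, thousands of euros
        effort_days: float    # estimated effort

        @property
        def value_density(self):
            return self.value_keur / self.effort_days

    def replan(stories):
        """Return stories ordered by descending value per day of effort."""
        return sorted(stories, key=lambda s: s.value_density, reverse=True)

    if __name__ == "__main__":
        backlog = [
            Story("multi-currency billing", 120.0, 15.0),
            Story("report export", 20.0, 2.0),
            Story("single sign-on", 60.0, 10.0),
        ]
        for s in replan(backlog):
            print(f"{s.title:24s} {s.value_density:6.1f} kEUR/day")
    ```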

  16. Some Findings Concerning Requirements in Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases in the frame of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; in some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and the management of requirements dependencies.

  17. A Roadmap for Using Agile Development in a Traditional Environment

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Starbird, Thomas; Grenander, Sven

    2006-01-01

    One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development, software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing their small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although this methodology is closer to Agile Development). At the Jet Propulsion Laboratory (JPL), a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules, and procedures for developing software. This paper will discuss some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into current JPL development policies.

  18. Agile manufacturing from a statistical perspective

    SciTech Connect

    Easterling, R.G.

    1995-10-01

    The objective of agile manufacturing is to provide the ability to quickly realize high-quality, highly-customized, in-demand products at a cost commensurate with mass production. More broadly, agility in manufacturing, or any other endeavor, is defined as change-proficiency; the ability to thrive in an environment of unpredictable change. This report discusses the general direction of the agile manufacturing initiative, including research programs at the National Institute of Standards and Technology (NIST), the Department of Energy, and other government agencies, but focuses on agile manufacturing from a statistical perspective. The role of statistics can be important because agile manufacturing requires the collection and communication of process characterization and capability information, much of which will be data-based. The statistical community should initiate collaborative work in this important area.

  19. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  20. Parallel Processing with Digital Signal Processing Hardware and Software

    NASA Technical Reports Server (NTRS)

    Swenson, Cory V.

    1995-01-01

    The assembling and testing of a parallel processing system is described which will allow a user to move a Digital Signal Processing (DSP) application from the design stage to the execution/analysis stage through the use of several software tools and hardware devices. The system will be used to demonstrate the feasibility of the Algorithm To Architecture Mapping Model (ATAMM) dataflow paradigm for static multiprocessor solutions of DSP applications. The individual components comprising the system are described followed by the installation procedure, research topics, and initial program development.

  1. Software engineering technology transfer: Understanding the process

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1993-01-01

    Technology transfer is of crucial concern to both government and industry today. In this report, the mechanisms developed by NASA to transfer technology are explored and the actual mechanisms used to transfer software development technologies are investigated. Time, cost, and effectiveness of software engineering technology transfer is reported.

  2. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    NASA Astrophysics Data System (ADS)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

    The market is becoming more and more competitive, many products and services depend on the software product, and software is one of the most important assets influencing the organizations’ businesses. Considering this context, we can observe that companies must deal carefully with software, whether developing or acquiring it. One of the perspectives that can help to take advantage of software, effectively supporting the business, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of its use and the results.

  3. Safety. [requirements for software to monitor and control critical processes

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1991-01-01

    Software requirements, design, implementation, verification and validation, and especially management are affected by the need to produce safe software. This paper discusses the changes in the software life cycle that are necessary to ensure that software will execute without resulting in unacceptable risk. Software is being used increasingly to monitor and control safety-critical processes in which a run-time failure or error could result in unacceptable losses such as death, injury, loss of property, or environmental harm. Examples of such processes may be found in transportation, energy, aerospace, basic industry, medicine, and defense systems.

  4. Developing communications requirements for Agile Product Realization

    SciTech Connect

    Forsythe, C.; Ashby, M.R.

    1994-03-01

    Sandia National Laboratories has undertaken the Agile Product Realization for Innovative electroMEchanical Devices (A-PRIMED) pilot project to develop and implement technologies for agile design and manufacturing of electromechanical components. Emphasis on information-driven processes, concurrent engineering and multi-functional team communications makes computer-supported cooperative work critical to achieving significantly faster product development cycles. This report describes analyses conducted in developing communications requirements and a communications plan that addresses the unique communications demands of an agile enterprise.

  5. Agility enabled by the SEMATECH CIM framework

    NASA Astrophysics Data System (ADS)

    Hawker, Scott; Waskiewicz, Fred

    1997-01-01

    The survivor in today's market environment is agile: able to survive and thrive in a market place marked by rapid, continuous change. For manufacturers, this includes an ability to rapidly develop, deploy and reconfigure manufacturing information and control systems. The SEMATECH CIM framework defines an application integration architecture and standard application components that enable agile manufacturing information and control systems. Further, the CIM framework and its evolution process foster virtual organizations of suppliers and manufacturers, combining their products and capabilities into an agile manufacturing information and control system.

  6. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'the optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  7. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  8. Compiling software for a hierarchical distributed processing system

    DOEpatents

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.
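
    The distribution pattern the abstract describes can be sketched as a recursive walk over the node tree: a node keeps the compiled units destined for itself and forwards to each child only the units meant for that child or its descendants. The tree layout, unit names, and data structures below are invented stand-ins, not the patented implementation.

    ```python
    # Sketch of subtree-filtered distribution of compiled units (illustrative).
    def descendants(tree, node):
        """All nodes in the subtree rooted at `node`, excluding `node` itself."""
        out = []
        for child in tree.get(node, []):
            out.append(child)
            out.extend(descendants(tree, child))
        return out

    def distribute(tree, node, units, placed):
        """`units` maps target node -> compiled units; `placed` records what each node keeps."""
        placed.setdefault(node, []).extend(units.get(node, []))
        for child in tree.get(node, []):
            subtree = [child] + descendants(tree, child)
            wanted = {n: units[n] for n in subtree if n in units}
            if wanted:                      # send only what the subtree needs
                distribute(tree, child, wanted, placed)

    if __name__ == "__main__":
        tree = {"root": ["io0", "io1"], "io0": ["c0", "c1"], "io1": ["c2"]}
        units = {"root": ["svc.o"], "c0": ["rank0.o"], "c2": ["rank2.o"]}
        placed = {}
        distribute(tree, "root", units, placed)
        print(placed)   # root keeps svc.o; rank0.o reaches c0 via io0, rank2.o reaches c2 via io1
    ```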

  9. The IEEE Software Engineering Standards Process

    PubMed Central

    Buckley, Fletcher J.

    1984-01-01

    Software Engineering has emerged as a field in recent years, and those involved increasingly recognize the need for standards. As a result, members of the Institute of Electrical and Electronics Engineers (IEEE) formed a subcommittee to develop these standards. This paper discusses the ongoing standards development, and associated efforts.

  10. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  11. Agile parallel bioinformatics workflow management using Pwrake

    PubMed Central

    2011-01-01

    Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability

  12. Process-Based Software: Increasing Financial Control and Minimizing Change.

    ERIC Educational Resources Information Center

    Brown, Shawn

    1999-01-01

    New, highly flexible accounting software packages have abandoned the rigidity of past systems and are designed to be easily customized to the school district's needs without programming. Process-based software integrates processes and work flow, allowing users to work across application and function boundaries, as well as organizational…

  13. Software for Demonstration of Features of Chain Polymerization Processes

    ERIC Educational Resources Information Center

    Sosnowski, Stanislaw

    2013-01-01

    Free software for the demonstration of the features of homo- and copolymerization processes (free radical, controlled radical, and living) is described. The software is based on the Monte Carlo algorithms and offers insight into the kinetics, molecular weight distribution, and microstructure of the macromolecules formed in those processes. It also…

  14. Process Acceptance and Adoption by IT Software Project Practitioners

    ERIC Educational Resources Information Center

    Guardado, Deana R.

    2012-01-01

    This study addresses the question of what factors determine acceptance and adoption of processes in the context of Information Technology (IT) software development projects. This specific context was selected because processes required for managing software development projects are less prescriptive than in other, more straightforward, IT…

  15. Borehole seismic data processing and interpretation: New free software

    NASA Astrophysics Data System (ADS)

    Farfour, Mohammed; Yoon, Wang Jung

    2015-12-01

    Vertical Seismic Profile (VSP) surveying is a vital tool in subsurface imaging and reservoir characterization. The technique allows geophysicists to infer critical information that cannot be obtained otherwise. MVSP is a new MATLAB tool with a graphical user interface (GUI) for VSP shot modeling, data processing, and interpretation. The software handles VSP data from the loading and preprocessing stages to the final stage of corridor plotting and integration with well and seismic data. Several seismic and signal processing toolboxes are integrated and modified to suit and enrich the processing and display packages. The main motivation behind the development of the software is to provide new geoscientists and students in the geoscience fields with free software that brings together all VSP modules in one easy-to-use package. The software has several modules that allow the user to test, process, compare, visualize, and produce publication-quality results. The software is developed as a stand-alone MATLAB application that requires only MATLAB Compiler Runtime (MCR) to run with full functionality. We present a detailed description of MVSP and use the software to create synthetic VSP data. The data are then processed using different available tools. Next, real data are loaded and fully processed using the software. The data are then integrated with well data for more detailed analysis and interpretation. In order to evaluate the software processing flow accuracy, the same data are processed using commercial software. Comparison of the processing results shows that MVSP is able to process VSP data as efficiently as commercial software packages currently used in industry, and provides similar high-quality processed data.

  16. Integrating interface slicing into software engineering processes

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  17. Image processing software for imaging spectrometry

    NASA Technical Reports Server (NTRS)

    Mazer, Alan S.; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

    1988-01-01

    The paper presents a software system, Spectral Analysis Manager (SPAM), which has been specifically designed and implemented to provide the exploratory analysis tools necessary for imaging spectrometer data, using only modest computational resources. The basic design objectives are described as well as the major algorithms designed or adapted for high-dimensional images. Included in a discussion of system implementation are interactive data display, statistical analysis, image segmentation and spectral matching, and mixture analysis.
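
    One of the kinds of operation mentioned above, spectral matching, can be sketched as a comparison of a pixel spectrum against a small reference library using the spectral angle. The library entries and band values below are invented for illustration; this is the generic technique, not SPAM's actual algorithms or data.

    ```python
    # Minimal spectral-angle matching step (illustrative library and spectra).
    import numpy as np

    def spectral_angle(s1, s2):
        """Angle in radians between two spectra viewed as vectors."""
        c = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
        return float(np.arccos(np.clip(c, -1.0, 1.0)))

    def best_match(pixel, library):
        """Return (name, angle) of the closest library spectrum."""
        return min(((name, spectral_angle(pixel, ref)) for name, ref in library.items()),
                   key=lambda item: item[1])

    if __name__ == "__main__":
        library = {
            "kaolinite": np.array([0.30, 0.42, 0.55, 0.40]),
            "vegetation": np.array([0.05, 0.08, 0.45, 0.50]),
        }
        pixel = np.array([0.06, 0.09, 0.44, 0.52])
        print(best_match(pixel, library))   # expect 'vegetation'
    ```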

  18. Peridigm summary report : lessons learned in development with agile components.

    SciTech Connect

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John; Parks, Michael L.

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.

  19. Priority-Based Constraint Management in Software Process Instantiation

    NASA Astrophysics Data System (ADS)

    Killisperger, Peter; Stumptner, Markus; Peters, Georg; Stückl, Thomas

    In order to reuse software processes for a spectrum of projects, they are described in a generic way. Due to the uniqueness of software development, processes have to be adapted to project-specific needs to be effectively applicable in projects. This instantiation still lacks standardization and tool support, making it error-prone, time consuming, and thus expensive. Siemens AG has started research projects aiming to improve software process-related activities. Part of these efforts has been the development of an architecture for a system that executes instantiation decisions made by humans and automatically restores the correctness of the resulting process.

  20. Advanced information processing system: Input/output network management software

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is given, followed by a more detailed description of the network architecture.

  1. Software Architecture for Simultaneous Process Control and Software Development/Modification

    SciTech Connect

    Lenarduzzi, Roberto; Hileman, Michael S; McMillan, David E; Holmes Jr, William; Blankenship, Mark; Wilder, Terry

    2011-01-01

    A software architecture is described that allows modification of some application code sections while the remainder of the application continues executing. This architecture facilitates long term testing and process control because the overall process need not be stopped and restarted to allow modifications or additions to the software. A working implementation using National Instruments LabVIEW(TM) sub-panel and shared variable features is described as an example. This architecture provides several benefits in both the program development and execution environments. The software is easier to maintain and it is not necessary to recompile the entire program after a modification.

  2. A Peer Review Process for Games and Software

    ERIC Educational Resources Information Center

    Mallon, Bride

    2008-01-01

    A peer-review process for assessing the contribution of artifacts, such as games and software to research, is proposed. Games and software produced as research output by academics tend to be accredited within their institution through discussion of the artifact, rather than directly. An independent judgment by peers confirming an artifact's…

  3. Improving the Software Development Process Using Testability Research

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.

    1991-01-01

    Software testability is the tendency of code to reveal existing faults during random testing. This paper proposes to take software testability predictions into account throughout the development process. These predictions can be made from formal specifications, design documents, and the code itself. The insight provided by software testability is valuable during design, coding, testing, and quality assurance. We further believe that software testability analysis can play a crucial role in quantifying the likelihood that faults are not hiding when testing does not result in any failures for the current version.
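
    The underlying notion can be illustrated with a toy experiment: estimate the fraction of random inputs on which a seeded fault actually produces a wrong output. A small fraction means the fault can easily hide from random testing. The functions and input domain below are invented, and this is only the basic idea, not the paper's actual sensitivity-analysis tooling.

    ```python
    # Toy testability estimate: how often does a seeded fault reveal itself
    # under random testing?  (Illustrative functions and domain.)
    import random

    def correct(x):
        return abs(x)

    def faulty(x):
        # The fault only changes the result for inputs in the narrow slice (-3, 0).
        return x if x > -3 else -x

    def revealing_fraction(trials=100_000, lo=-1000.0, hi=1000.0, seed=0):
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            x = rng.uniform(lo, hi)
            if correct(x) != faulty(x):
                hits += 1
        return hits / trials

    if __name__ == "__main__":
        print(f"estimated testability: {revealing_fraction():.4f}")   # roughly 0.0015
    ```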

  4. A communication channel model of the software process

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1988-01-01

    Beginning research into a noisy-communication-channel analogy of software development productivity, undertaken in order to establish quantifiable behavior and theoretical bounds, is discussed. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. An upper bound to productivity is derived, showing that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of the size and cost of reusable components may reduce this to a finite bound.
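
    One plausible form of the kind of bound the abstract describes is sketched below. This is an illustrative formulation under the channel analogy, not Tausworthe's actual equations: it assumes the product yield Y requires I units of new information from developers whose effective channel capacity is C (information per unit of effort W).

    ```latex
    % Illustrative bound only (assumed symbols, not the paper's derivation):
    % effort must satisfy W \ge I / C, so productivity P = Y / W is bounded by
    \begin{align}
      P \;=\; \frac{Y}{W} \;\le\; \frac{Y\,C}{I}.
    \end{align}
    % With C limited by human capacity and I bounded below for genuinely new
    % content, only raising Y without raising I, i.e. reuse, lets P grow
    % without limit, consistent with the conclusion stated above.
    ```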

  5. Modeling and Developing the Information System for the SuperAGILE Experiment

    NASA Astrophysics Data System (ADS)

    Lazzarotto, F.; Costa, E.; del Monte, E.; Feroci, M.

    2004-07-01

    We present a formal description of the SuperAGILE (SA) detection system data, the relationships among them, and the operations applied to the data, with the aid of tools such as Entity-Relationship (E-R) and UML diagrams. We have implemented functions for the reception, pre-processing, archiving and analysis of SA data, making use of object-oriented and SQL open-source software tools.

  6. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software, on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  7. The Development Process of the LUCIFER Control Software

    NASA Astrophysics Data System (ADS)

    Jütte, M.; Polsterer, K.; Lehmitz, M.

    2004-07-01

    We present the design and development process of the control software for the LBT NIR spectroscopic Utility with Camera and Integral-Field Unit for Extragalactic Research (LUCIFER) which is one of the first-light instruments for the LBT on Mt. Graham, Arizona. The LBT will be equipped with two identical LUCIFER instruments for both mirrors. We give an overview of the software architecture and the current state of the software package and describe the development process by using a virtual LUCIFER instrument.

  8. Seven Processes that Enable NASA Software Engineering Technologies

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software be appraised for the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  9. Agile informatics: application of agile project management to the development of a personal health application.

    PubMed

    Chung, Jeanhee; Pankey, Evan; Norris, Ryan J

    2007-01-01

    We describe the application of the Agile method-- a short iteration cycle, user responsive, measurable software development approach-- to the project management of a modular personal health record, iHealthSpace, to be deployed to the patients and providers of a large academic primary care practice. PMID:18694014

  10. Ten steps to successful software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, R. K.

    2003-01-01

    This paper identifies ten steps for managing change that address organizational and cultural issues. Four of these steps are critical; if they are not done, failure is almost guaranteed. This ten-step program emphasizes the alignment of business goals, change process goals, and the work performed by the employees of an organization.

  11. From Concept to Software: Developing a Framework for Understanding the Process of Software Design.

    ERIC Educational Resources Information Center

    Mishra, Punyashloke; Zhao, Yong; Tan, Sophia

    1999-01-01

    Discussion of technological innovation and the process of design focuses on the design of computer software. Offers a framework for understanding the design process by examining two computer programs: FliPS, a multimedia program for learning complex problems in chemistry; and Tiger, a Web-based program for managing and publishing electronic…

  12. Function-based integration strategy for an agile manufacturing testbed

    NASA Astrophysics Data System (ADS)

    Park, Hisup

    1997-01-01

    This paper describes an integration strategy for plug-and- play software based on functional descriptions of the software modules. The functional descriptions identify explicitly the role of each module with respect to the overall systems. They define the critical dependencies that affect the individual modules and thus affect the behavior of the system. The specified roles, dependencies and behavioral constraints are then incorporated in a group of shared objects that are distributed over a network. These objects may be interchanged with others without disrupting the system so long as the replacements meet the interface and functional requirements. In this paper, we propose a framework for modeling the behavior of plug-and-play software modules that will be used to (1) design and predict the outcome of the integration, (2) generate the interface and functional requirements of individual modules, and (3) form a dynamic foundation for applying interchangeable software modules. I describe this strategy in the context of the development of an agile manufacturing testbed. The testbed represents a collection of production cells for machining operations, supported by a network of software modules or agents for planning, fabrication, and inspection. A process definition layer holds the functional description of the software modules. A network of distributed objects interact with one another over the Internet and comprise the plug-compatible software nodes that execute these functions. This paper will explore the technical and operational ramifications of using the functional description framework to organize and coordinate the distributed object modules.
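
    The functional-description idea above can be sketched as a declared interface plus a registry: a module states which role it fulfils and what operations it provides, and any implementation declaring the same function can be plugged in without disrupting the rest of the system. The role names, module classes, and data below are invented for illustration, not the testbed's actual interfaces.

    ```python
    # Sketch of plug-compatible modules keyed by their declared function.
    from typing import Protocol

    class PlannerModule(Protocol):
        """Functional description: the role and operation a planner must provide."""
        role: str
        def plan(self, part_spec: dict) -> list[str]: ...

    class SimplePlanner:
        role = "process-planning"
        def plan(self, part_spec):
            return [f"mill {part_spec['feature']}", "inspect"]

    REGISTRY: dict[str, object] = {}

    def register(module):
        """Swap in a module for whichever one currently fulfils the same role."""
        REGISTRY[module.role] = module

    if __name__ == "__main__":
        register(SimplePlanner())
        print(REGISTRY["process-planning"].plan({"feature": "slot"}))
    ```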

  13. Earth Observation Services (Image Processing Software)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.

  14. Architecture-Centric Methods and Agile Approaches

    NASA Astrophysics Data System (ADS)

    Babar, Muhammad Ali; Abrahamsson, Pekka

    Agile software development approaches have had significant impact on industrial software development practices. Despite becoming widely popular, there is increasing perplexity about the role and importance of a system’s software architecture in agile approaches [1, 2]. Advocates of the vital role of architecture in achieving the quality goals of large-scale software-intensive systems are skeptical of the scalability of any development approach that does not pay sufficient attention to architectural issues. However, the proponents of agile approaches usually perceive the upfront design and evaluation of architecture as being of less value to the customers of a system. According to them, for example, re-factoring can help fix most of the problems. Many experiences show that large-scale re-factoring often results in significant defects, which are very costly to address later in the development cycle. It is considered that re-factoring is worthwhile as long as the high-level design is good enough to limit the need for large-scale re-factoring [1, 3, 4].

  15. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  16. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  17. The development process of the LUCIFER control software

    NASA Astrophysics Data System (ADS)

    Juette, Marcus; Polsterer, Kai L.; Lehmitz, Michael; Knierim, Volker

    2004-09-01

    In this paper we present the software development process and history of the LUCIFER (LBT NIR spectroscopic Utility with Camera and Integral- Field Unit for Extragalactic Research) multi-mode near-infrared instrument, which is one of the first light instruments of the LBT on Mt. Graham, Arizona. The software is realised as a distributed system in Java using its remote method invocation service (RMI). We describe the current status of the software and give an overview of the planned computer hardware architecture.

  18. FITSH: Software Package for Image Processing

    NASA Astrophysics Data System (ADS)

    Pál, András

    2011-11-01

    FITSH provides a standalone environment for analysis of data acquired by imaging astronomical detectors. The package provides utilities both for the full pipeline of subsequent related data processing steps (including image calibration, astrometry, source identification, photometry, differential analysis, low-level arithmetic operations, multiple image combinations, spatial transformations and interpolations, etc.) and for aiding the interpretation of the (mainly photometric and/or astrometric) results. The package also features a consistent implementation of photometry based on image subtraction, point spread function fitting and aperture photometry, and provides easy-to-use interfaces for comparisons and for picking the most suitable method for a particular problem. The utilities in the package are built on top of the commonly used UNIX/POSIX shells (hence the name of the package); therefore, both frequently used and well-documented tools for such environments can be exploited, and managing massive amounts of data is rather convenient.

  19. Development of Data Processing Software for NBI Spectroscopic Analysis System

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2015-04-01

    A set of data processing software is presented in this paper for processing NBI spectroscopic data. For better and more scientific management and querying of these data, they are managed uniformly by the NBI data server. The data processing software offers the functions of uploading beam spectral original and analytic data to the data server manually and automatically, querying and downloading all the NBI data, as well as dealing with local LZO data. The software set is composed of a server program and a client program. The server software is programmed in C/C++ under a CentOS development environment. The client software is developed under a VC 6.0 platform, which offers convenient operational human interfaces. The network communications between the server and the client are based on TCP. With the help of this software, the NBI spectroscopic analysis system realizes unattended automatic operation, and the clear interface also makes it much more convenient to offer beam intensity distribution data and beam power data to operators for operation decision-making. Supported by National Natural Science Foundation of China (No. 11075183), the Chinese Academy of Sciences Knowledge Innovation

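    Since the abstract describes a TCP-based upload path between a client and a data server, a minimal sketch of that pattern is given below. The real CDPS server is written in C/C++ and its message format is not described, so the length-prefixed JSON protocol, port, and field names here are assumptions for illustration only.

      # Minimal sketch of a TCP upload path: a length-prefixed JSON record sent to a server.
      import json
      import socket
      import struct
      import threading
      import time

      HOST, PORT = "127.0.0.1", 9100  # assumed address

      def serve_once():
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
              srv.bind((HOST, PORT))
              srv.listen(1)
              conn, _ = srv.accept()
              with conn:
                  size = struct.unpack("!I", conn.recv(4))[0]   # 4-byte length prefix
                  payload = b""
                  while len(payload) < size:
                      payload += conn.recv(size - len(payload))
                  record = json.loads(payload.decode())
                  print("stored spectral record for shot", record["shot"])

      def upload(record):
          payload = json.dumps(record).encode()
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
              cli.connect((HOST, PORT))
              cli.sendall(struct.pack("!I", len(payload)) + payload)

      if __name__ == "__main__":
          t = threading.Thread(target=serve_once)
          t.start()
          time.sleep(0.2)  # give the server a moment to start listening
          upload({"shot": 12345, "beam_power_kw": 980.0, "spectrum": [0.1, 0.4, 0.2]})
          t.join()
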
  20. Business Intelligence Applied to the ALMA Software Integration Process

    NASA Astrophysics Data System (ADS)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project there is much valuable information about the process itself that you might be able to collect. One of the ways to receive this input is via an issue tracking system that gathers the problem reports relative to software bugs captured during the testing of the software, during the integration of the different components or, even worse, problems that occur during production. Usually, little time is spent on analyzing these reports, but with some multidimensional processing you can extract valuable information from them, and it might help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and get insights from this information.

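    The extraction, transformation and load step described above amounts to rolling up issue-tracker records along a few dimensions so trends can be inspected. The following hedged sketch shows that idea only; the CSV source and field names are assumptions, not the ALMA schema.

      # Hedged ETL sketch: aggregate issue-tracker records into a small (component, severity, month) cube.
      import csv
      from collections import Counter
      from datetime import datetime

      def load_issues(path):
          with open(path, newline="") as f:
              for row in csv.DictReader(f):
                  yield {
                      "component": row["component"],
                      "severity": row["severity"],
                      "month": datetime.fromisoformat(row["created"]).strftime("%Y-%m"),
                  }

      def cube(issues):
          """Count issues along the (component, severity, month) dimensions."""
          return Counter((i["component"], i["severity"], i["month"]) for i in issues)

      if __name__ == "__main__":
          counts = cube(load_issues("issues_export.csv"))
          # Slice the cube: reports per component, regardless of severity or month.
          per_component = Counter()
          for (component, _severity, _month), n in counts.items():
              per_component[component] += n
          for component, n in per_component.most_common():
              print(f"{component}: {n} reports")
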
  1. I'll Txt U if I Have a Problem: How the Société Canadienne du Cancer in Quebec Applied Behavior-Change Theory, Data Mining and Agile Software Development to Help Young Adults Quit Smoking

    PubMed Central

    van Mierlo, Trevor; Fournier, Rachel; Jean-Charles, Anathalie; Hovington, Jacinthe; Ethier, Isabelle; Selby, Peter

    2014-01-01

    Introduction For many organizations, limited budgets and phased funding restrict the development of digital health tools. This problem is often exacerbated by the ever-increasing sophistication of technology and costs related to programming and maintenance. Traditional development methods tend to be costly, inflexible, and not client-centered. The purpose of this study is to analyze the use of Agile software development and outcomes of a three-phase mHealth program designed to help young adult Quebecers quit smoking. Methods In Phase I, literature reviews, focus groups, interviews, and behavior change theory were used in the adaptation and re-launch of an existing evidence-based mHealth platform. Based on analysis of user comments and utilization data from Phase I, the second phase expanded the service to allow participants to live text-chat with counselors. Phase II evaluation led to the third and current phase, in which algorithms were introduced to target pregnant smokers, substance users, students, full-time workers, and those affected by mood disorders and chronic disease. Results Data collected throughout the three phases indicate that the incremental evolution of the intervention has led to increasing numbers of smokers being enrolled while making functional enhancements. In Phase I (240 days), 182 smokers registered with the service; 51% (n = 94) were male and 61.5% (n = 112) were between the ages of 18–24. In Phase II (300 days), 994 smokers registered with the service; 51% (n = 508) were male and 41% (n = 403) were between the ages of 18–24. At 174 days to date, 873 smokers have registered in the third phase; 44% (n = 388) were male and 24% (n = 212) were between the ages of 18–24. Conclusions Emerging technologies in behavioral science show potential, but do not have defined best practices for application development. In phased-based projects with limited funding, Agile appears to be a viable approach to building and expanding

  2. MOPEX: a software package for astronomical image processing and visualization

    NASA Astrophysics Data System (ADS)

    Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

    2006-06-01

    We present MOPEX, a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control over the image processing and the display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though the package was originally designed for the Spitzer Space Telescope mission, many of its functions are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software

  3. The development process for the space shuttle primary avionics software system

    NASA Technical Reports Server (NTRS)

    Keller, T. W.

    1987-01-01

    Primary avionics software system; software development approach; user support and problem diagnosis; software releases and configuration; quality/productivity programs; and software development/production facilities are addressed. Also examined are the external evaluations of the IBM process.

  4. Future Research in Agile Systems Development: Applying Open Innovation Principles Within the Agile Organisation

    NASA Astrophysics Data System (ADS)

    Conboy, Kieran; Morgan, Lorraine

    A particular strength of agile approaches is that they move away from 'introverted' development and intimately involve the customer in all areas of development, supposedly leading to the development of a more innovative and hence more valuable information system. However, we argue that a single customer representative is too narrow a focus to adopt and that involvement of stakeholders beyond the software development itself is still often quite weak and in some cases non-existent. In response, we argue that current thinking regarding innovation in agile development needs to be extended to include multiple stakeholders outside the business unit. This paper explores the intra-organisational applicability and implications of open innovation in agile systems development. Additionally, it argues for a different perspective of project management that includes collaboration and knowledge-sharing with other business units, customers, partners, and other relevant stakeholders pertinent to the business success of an organisation, thus embracing open innovation principles.

  5. Open environment for image processing and software development

    NASA Astrophysics Data System (ADS)

    Rasure, John R.; Young, Mark

    1992-04-01

    The main goal of the Khoros software project is to create and provide an integrated software development environment for information processing and data visualization. The Khoros software system is now being used as a foundation to improve productivity and promote software reuse in a wide variety of application domains. A powerful feature of the Khoros system is the high-level, abstract visual language that can be employed to significantly boost the productivity of the researcher. Central to the Khoros system is the need for a consistent yet flexible user interface development system that provides cohesiveness to the vast number of programs that make up the Khoros system. Automated tools assist in maintenance as well as development of programs. The software structure that embodies this system provides for extensibility and portability, and allows for easy tailoring to target specific application domains and processing environments. First, an overview of the Khoros software environment is given. Then this paper presents the abstract applications programmer interface (API), the data services that are provided in Khoros to support it, and the Khoros visualization and image file format. The authors contend that Khoros is an excellent environment for the exploration and implementation of imaging standards.

  6. Software release build process and components in ATLAS offline

    NASA Astrophysics Data System (ADS)

    Obreshkov, Emil; ATLAS Collaboration

    2011-12-01

    ATLAS is one of the largest collaborations in the physical sciences and involves 3000 scientists and engineers from 174 institutions in 38 countries. The geographically dispersed developer community has produced a large amount of software which is organized in 10 projects. In this presentation we discuss how the software is built on a variety of compiler and operating system combinations every night. File level and package level parallelism together with multi-core build servers are used to perform fast builds of the different platforms in several branches. We discuss the different tools involved during the software release build process and also the various mechanisms implemented to provide performance gains and error detection and retry mechanisms in order to counteract network and other instabilities that would otherwise degrade the robustness of the system. The goal is to provide high quality software built as fast as possible ready for final validation and deployment.

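    The abstract mentions package-level build parallelism plus retry mechanisms to counteract network instabilities. The sketch below illustrates only that general pattern; the package names, build command, worker count, and retry policy are assumptions, not the ATLAS release tooling.

      # Illustrative sketch: build packages in parallel, retrying to ride out transient failures.
      import subprocess
      import time
      from concurrent.futures import ThreadPoolExecutor

      PACKAGES = ["Reconstruction", "Simulation", "Trigger", "Analysis"]  # assumed names

      def build(package, attempts=3, delay=30):
          """Build one package, retrying on failure (e.g. transient network glitches)."""
          for attempt in range(1, attempts + 1):
              result = subprocess.run(["make", "-C", package], capture_output=True)
              if result.returncode == 0:
                  return package, attempt, True
              time.sleep(delay)
          return package, attempts, False

      if __name__ == "__main__":
          with ThreadPoolExecutor(max_workers=4) as pool:
              for name, tries, ok in pool.map(build, PACKAGES):
                  status = "ok" if ok else "FAILED"
                  print(f"{name}: {status} after {tries} attempt(s)")
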
  7. The Implementation of Computer Data Processing Software for EAST NBI

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Hu, Chundong; Sheng, Peng; Zhao, Yuanzhe; Wu, Deyun; Cui, Qinglong

    2014-10-01

    One of the most important project missions of neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) with high power energy to the plasma of the EAST superconducting tokamak. Correspondingly, it's necessary to construct a high-speed and reliable computer data processing system for processing experimental data, such as data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The set of software is programmed in C language and runs on Linux operating system based on TCP network protocol and multi-threading technology. The hardware mainly includes industrial control computer (IPC), data server, PXI DAQ cards and so on. Now this software has been applied to EAST NBI system, and experimental results show that the CDPS can serve EAST NBI very well.

  8. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  9. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  10. A communication channel model of the software process

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1988-01-01

    Reported here is beginning research into a noisy communication channel analogy of software development process productivity, in order to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound to productivity that shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.

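    The abstract does not reproduce the paper's actual formulas, so the following is only a hedged, illustrative reading of the channel analogy with symbols defined here rather than taken from the source: suppose the developers must supply I bits of new design information over a work effort W through a human "channel" of capacity C bits per unit effort, and the delivered product has size S.

      I \le C\,W \quad\Longrightarrow\quad P = \frac{S}{W} \le \frac{S\,C}{I}

    Under these assumptions, if every unit of product requires newly supplied information then I grows in proportion to S and the productivity P stays bounded by a constant; only reuse, which lets S grow while I does not, removes the bound. This is consistent with the abstract's conclusion that reuse is the only route to unbounded productivity growth.
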
  11. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer aided design has made significant improvement. Many types of specialized design software for environmental performance of the drawings and post artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design and comparing and contrasting traditional hand drawing and drawing with modern technology, this essay further explores how computer technology can play a bigger role in environmental design.

  12. Model for Simulating a Spiral Software-Development Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code

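    As a complement to the description above, here is a minimal sketch of simulating spiral iterations as a sequence of activity events that accumulate cost and schedule. It is not the PATT-based model; the activity list, productivity figures, team size, and rework fraction are illustrative assumptions.

      # Minimal spiral-process simulation sketch: effort and schedule accumulate per iteration.
      import random

      ACTIVITIES = ["risk assessment", "requirements", "design", "code", "test", "evaluate"]

      def simulate_spiral(iterations=4, base_effort=200.0, rework=0.15, seed=1):
          random.seed(seed)
          total_effort, schedule = 0.0, 0.0
          for i in range(1, iterations + 1):
              # Later iterations carry rework from requirement changes discovered earlier.
              scope = base_effort * (1.0 + rework * (i - 1))
              for _activity in ACTIVITIES:
                  effort = scope / len(ACTIVITIES) * random.uniform(0.8, 1.2)
                  total_effort += effort
                  schedule += effort / 5.0      # assume a 5-person team
              print(f"iteration {i}: cumulative effort {total_effort:7.1f} person-days, "
                    f"schedule {schedule:6.1f} days")
          return total_effort, schedule

      if __name__ == "__main__":
          simulate_spiral()
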
  13. Image-Processing Software For A Hypercube Computer

    NASA Technical Reports Server (NTRS)

    Lee, Meemong; Mazer, Alan S.; Groom, Steven L.; Williams, Winifred I.

    1992-01-01

    Concurrent Image Processing Executive (CIPE) is software system intended to develop and use image-processing application programs on concurrent computing environment. Designed to shield programmer from complexities of concurrent-system architecture, it provides interactive image-processing environment for end user. CIPE utilizes architectural characteristics of particular concurrent system to maximize efficiency while preserving architectural independence from user and programmer. CIPE runs on Mark-IIIfp 8-node hypercube computer and associated SUN-4 host computer.

  14. Parallel optimization methods for agile manufacturing

    SciTech Connect

    Meza, J.C.; Moen, C.D.; Plantenga, T.D.; Spence, P.A.; Tong, C.H.; Hendrickson, B.A.; Leland, R.W.; Reese, G.M.

    1997-08-01

    The rapid and optimal design of new goods is essential for meeting national objectives in advanced manufacturing. Currently almost all manufacturing procedures involve the determination of some optimal design parameters. This process is iterative in nature and, because it is usually done manually, it can be expensive and time consuming. This report describes the results of an LDRD, the goal of which was to develop optimization algorithms and software tools that will enable automated design, thereby allowing for agile manufacturing. Although the design processes vary across industries, many of the mathematical characteristics of the problems are the same, including large-scale, noisy, and non-differentiable functions with nonlinear constraints. This report describes the development of a common set of optimization tools using object-oriented programming techniques that can be applied to these types of problems. The authors give examples of several applications that are representative of design problems, including an inverse scattering problem, a vibration isolation problem, a system identification problem for the correlation of finite element models with test data, and the control of a chemical vapor deposition reactor furnace. Because the function evaluations are computationally expensive, they emphasize algorithms that can be adapted to parallel computers.

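    The last point above, expensive function evaluations spread across parallel computers, can be illustrated with a small sketch: evaluate a population of candidate designs in worker processes. The quadratic objective here is only a stand-in for a costly simulation.

      # Hedged sketch of parallel evaluation of an expensive design objective.
      from multiprocessing import Pool

      def objective(design):
          """Placeholder for an expensive simulation (e.g. a finite element run)."""
          x, y = design
          return (x - 3.0) ** 2 + (y + 1.0) ** 2

      def evaluate_population(designs, workers=4):
          with Pool(processes=workers) as pool:
              return pool.map(objective, designs)

      if __name__ == "__main__":
          candidates = [(x * 0.5, y * 0.5) for x in range(-4, 5) for y in range(-4, 5)]
          scores = evaluate_population(candidates)
          best_score, best_design = min(zip(scores, candidates))
          print("best candidate:", best_design, "objective:", best_score)
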
  15. A Software Development Simulation Model of a Spiral Process

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  16. EOS MLS Level 2 Data Processing Software Version 3

    NASA Technical Reports Server (NTRS)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; wang, Shuhui; Manney, Gloria L.; Wu, Dong L.; Wagner, Paul A.; Vuu, Christina; Pumphrey, Hugh C.

    2011-01-01

    This software accepts the EOS MLS calibrated microwave radiance products and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.

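    The design point about sharing one forward model between retrieval and simulation can be sketched as below: a single configuration selects the mode, and both modes use the same forward-model parameters. The toy linear forward model and the configuration keys are assumptions, not the MLS Level 2 code.

      # Illustrative sketch: one config drives both simulation and retrieval with a shared forward model.
      import numpy as np

      def forward_model(state, weights):
          """Toy stand-in for the radiative-transfer forward model."""
          return weights @ state

      def retrieve(radiances, weights):
          """Least-squares estimate of the state vector from measured radiances."""
          state, *_ = np.linalg.lstsq(weights, radiances, rcond=None)
          return state

      def run(config):
          weights = np.array(config["weights"])
          if config["mode"] == "simulate":
              return forward_model(np.array(config["state"]), weights)
          return retrieve(np.array(config["radiances"]), weights)

      if __name__ == "__main__":
          weights = [[1.0, 0.5], [0.2, 2.0], [0.7, 0.7]]
          # Simulation mode produces radiances from a known state...
          radiances = run({"mode": "simulate", "weights": weights, "state": [250.0, 3.0]})
          # ...and retrieval mode inverts them with the very same forward model.
          print(run({"mode": "retrieve", "weights": weights, "radiances": radiances}))
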
  17. New GNSS processing software for EOP service of IAA RAS

    NASA Astrophysics Data System (ADS)

    Suvorkin, Vladimir; Kurdubov, Sergey; Gayazov, Iskander

    2014-05-01

    The GNSS Earth Orientation Parameters Service of the Institute of Applied Astronomy RAS has run since 2000 and provides daily estimates of Xp, Yp, Xp_rate, Yp_rate and LOD to the IERS. The previous software, which processed triple-difference GPS measurements, has been replaced by newly developed software. This software processes daily observation series of 50-70 globally distributed fixed GNSS stations within the IGS network. We process zero-difference ionosphere-free combinations of phase and code measurements and use physical models and calculation strategies in accordance with the IERS Conventions 2010 and IGS recommendations to Analysis Centers. We use segmented least-squares algorithms for adjustment and a highly optimized implementation for fast computing performance. The products of daily processing are not only EOP estimates but also satellite and receiver clock biases, orbital parameters and troposphere parameters. The software can also be used outside the EOP service to process regional and global GNSS networks and to estimate additional parameters of the measurement models.

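    The core adjustment step mentioned above is a (weighted) least-squares solution of a linearized observation system. The sketch below shows that generic step with an invented design matrix; it is not the IAA software's segmented algorithm.

      # Minimal weighted least-squares adjustment sketch: estimate parameters and formal errors.
      import numpy as np

      def weighted_least_squares(A, y, sigma):
          """Solve min ||W(Ax - y)|| with W = diag(1/sigma); return estimate and covariance."""
          W = np.diag(1.0 / sigma)
          Aw, yw = W @ A, W @ y
          N = Aw.T @ Aw                      # normal matrix
          x_hat = np.linalg.solve(N, Aw.T @ yw)
          cov = np.linalg.inv(N)
          return x_hat, cov

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          A = rng.normal(size=(50, 3))       # partial derivatives w.r.t. 3 parameters (invented)
          truth = np.array([0.02, -1.3, 4.0])
          sigma = np.full(50, 0.01)
          y = A @ truth + rng.normal(scale=sigma)
          est, cov = weighted_least_squares(A, y, sigma)
          print("estimate:", est, "formal errors:", np.sqrt(np.diag(cov)))
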
  18. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  19. Policy Process Editor for P3BM Software

    NASA Technical Reports Server (NTRS)

    James, Mark; Chang, Hsin-Ping; Chow, Edward T.; Crichton, Gerald A.

    2010-01-01

    A computer program enables generation, in the form of graphical representations of process flows with embedded natural-language policy statements, input to a suite of policy-, process-, and performance-based management (P3BM) software. This program (1) serves as an interface between users and the Hunter software, which translates the input into machine-readable form; and (2) enables users to initialize and monitor the policy-implementation process. This program provides an intuitive graphical interface for incorporating natural-language policy statements into business-process flow diagrams. Thus, the program enables users who dictate policies to intuitively embed their intended process flows as they state the policies, reducing the likelihood of errors and reducing the time between declaration and execution of policy.

  20. Using Knowledge Management to Revise Software-Testing Processes

    ERIC Educational Resources Information Center

    Nogeste, Kersti; Walker, Derek H. T.

    2006-01-01

    Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…

  1. DSPSR: Digital Signal Processing Software for Pulsar Astronomy

    NASA Astrophysics Data System (ADS)

    van Straten, W.; Bailes, M.

    2010-10-01

    DSPSR, written primarily in C++, is an open-source, object-oriented, digital signal processing software library and application suite for use in radio pulsar astronomy. The library implements an extensive range of modular algorithms for use in coherent dedispersion, filterbank formation, pulse folding, and other tasks. The software is installed and compiled using the standard GNU configure and make system, and is able to read astronomical data in 18 different file formats, including FITS, S2, CPSR, CPSR2, PuMa, PuMa2, WAPP, ASP, and Mark5.

  2. Development of an agility assessment module for preliminary fighter design

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Bauer, Brent; Biezad, Daniel; Hahn, Andrew

    1996-01-01

    A FORTRAN computer program is presented to perform agility analysis on fighter aircraft configurations. This code is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. FORTRAN programs were developed for two specific metrics, CCT (Combat Cycle Time) and PM (Pointing Margin), as part of the agility module. The validity of the code was evaluated by comparison with existing flight test data. Example trade studies using the agility module along with ACSYNT were conducted using Northrop F-20 Tigershark and McDonnell Douglas F/A-18 Hornet aircraft models. The sensitivity of the agility criteria to thrust loading and wing loading was investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements.

  3. Supporting Agile Development of Authorization Rules for SME Applications

    NASA Astrophysics Data System (ADS)

    Bartsch, Steffen; Sohr, Karsten; Bormann, Carsten

    Custom SME applications for collaboration and workflow have become affordable when implemented as Web applications employing Agile methodologies. Security engineering is still difficult with Agile development, though: heavy-weight processes put the improvements of Agile development at risk. We propose Agile security engineering and increased end-user involvement to improve Agile development with respect to authorization policy development. To support the authorization policy development, we introduce a simple and readable authorization rules language implemented in a Ruby on Rails authorization plugin that is employed in a real-world SME collaboration and workflow application. Also, we report on early findings of the language’s use in authorization policy development with domain experts.

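    The plugin described above implements a readable rules language in Ruby; its actual syntax is not reproduced here. As a hedged Python analogue only, the sketch below shows what a small, declarative rule set with per-object conditions and a permission check might look like.

      # Python analogue (not the Ruby plugin's syntax) of a readable authorization rule set.
      RULES = [
          {"role": "employee", "privilege": "read",   "context": "documents"},
          {"role": "employee", "privilege": "update", "context": "documents",
           "if": lambda user, obj: obj.get("owner") == user["name"]},
          {"role": "manager",  "privilege": "delete", "context": "documents"},
      ]

      def permitted(user, privilege, context, obj=None):
          """Grant access if any rule matches the user's roles and its condition holds."""
          for rule in RULES:
              if (rule["role"] in user["roles"]
                      and rule["privilege"] == privilege
                      and rule["context"] == context
                      and rule.get("if", lambda u, o: True)(user, obj or {})):
                  return True
          return False

      if __name__ == "__main__":
          alice = {"name": "alice", "roles": ["employee"]}
          doc = {"owner": "bob"}
          print(permitted(alice, "read", "documents", doc))    # True
          print(permitted(alice, "update", "documents", doc))  # False: not the owner
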
  4. Addressing the need for adaptable decision processes within healthcare software.

    PubMed

    Miseldine, P; Taleb-Bendiab, A; England, D; Randles, M

    2007-03-01

    In the healthcare sector, where the decisions made by software aid in the direct treatment of patients, software requires high levels of assurance to ensure the correct interpretation of the tasks it is automating. This paper argues that introducing adaptable decision processes within eHealthcare initiatives can reduce software-maintenance complexity and, due to the instantaneous, distributed deployment of decision models, allow for quicker updates of current best practice, thereby improving patient care. The paper provides a description of a collection of technologies and tools that can be used to provide the required adaptation in a decision process. These tools are evaluated against two case studies that individually highlight different requirements in eHealthcare: a breast-cancer decision-support system developed in partnership with several of the UK's leading cancer hospitals, and a dental triage system developed in partnership with the Royal Liverpool Hospital. Both show how the complete process flow of software can be abstracted and adapted, and the benefits that arise as a result. PMID:17365643

  5. Woods Hole Image Processing System Software implementation; using NetCDF as a software interface for image processing

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    The Branch of Atlantic Marine Geology has been involved in the collection, processing and digital mosaicking of high-, medium- and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking have been accomplished with a dedicated, shore-based computer system. With the need to process side-scan data in the field, and with the increased power and reduced cost of workstations, a need was identified for an image processing package on a UNIX-based computer system that could be utilized in the field as well as be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.

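    The report's key design choice, using NetCDF as a portable data interface for imagery, can be illustrated with a short sketch. WHIPS itself predates the modern Python bindings, so this uses today's netCDF4 package, and the variable and attribute names are assumptions.

      # Hedged sketch: write and read a side-scan image through a self-describing NetCDF file.
      import numpy as np
      from netCDF4 import Dataset

      def write_image(path, image, title):
          with Dataset(path, "w", format="NETCDF4") as ds:
              ds.title = title
              ds.createDimension("row", image.shape[0])
              ds.createDimension("col", image.shape[1])
              var = ds.createVariable("backscatter", "f4", ("row", "col"))
              var.units = "dB"
              var[:, :] = image

      def read_image(path):
          with Dataset(path, "r") as ds:
              return ds.variables["backscatter"][:, :], ds.title

      if __name__ == "__main__":
          mosaic = np.random.default_rng(0).normal(size=(256, 512)).astype("float32")
          write_image("sidescan.nc", mosaic, "demo side-scan mosaic")
          data, title = read_image("sidescan.nc")
          print(title, data.shape)
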
  6. Achieving agility through parameter space qualification

    SciTech Connect

    Diegert, K.V.; Easterling, R.G.; Ashby, M.R.; Benavides, G.L.; Forsythe, C.; Jones, R.E.; Longcope, D.B.; Parratt, S.W.

    1995-02-01

    The A-primed (Agile Product Realization of Innovative electro-Mechanical Devices) project is defining and proving processes for agile product realization for the Department of Energy complex. Like other agile production efforts reported in the literature, A-primed uses concurrent engineering and information automation technologies to enhance information transfer. A unique aspect of our approach to agility is the qualification during development of a family of related product designs and their production processes, rather than a single design and its attendant processes. Applying engineering principles and statistical design of experiments, economies of test and analytic effort are realized for the qualification of the device family as a whole. Thus the need is minimized for test and analysis to qualify future devices from this family, thereby further reducing the design-to-production cycle time. As a measure of the success of the A-primed approach, the first design took 24 days to produce, and operated correctly on the first attempt. A flow diagram for the qualification process is presented. Guidelines are given for implementation, based on the authors' experiences as members of the A-primed qualification team.

  7. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  8. The Khoros software development environment for image and signal processing.

    PubMed

    Konstantinides, K; Rasure, J R

    1994-01-01

    Data flow visual language systems allow users to graphically create a block diagram of their applications and interactively control input, output, and system variables. Khoros is an integrated software development environment for information processing and visualization. It is particularly attractive for image processing because of its rich collection of tools for image and digital signal processing. This paper presents a general overview of Khoros with emphasis on its image processing and DSP tools. Various examples are presented and the future direction of Khoros is discussed. PMID:18291923

  9. Perspectives on Agile Coaching

    NASA Astrophysics Data System (ADS)

    Fraser, Steven; Lundh, Erik; Davies, Rachel; Eckstein, Jutta; Larsen, Diana; Vilkki, Kati

    There are many perspectives to agile coaching, including: growing coaching expertise, selecting the appropriate coach for your context, and evaluating value. A coach is often an itinerant who may observe, mentor, negotiate, influence, lead, and/or architect everything from team organization to system architecture. With roots in diverse fields ranging from technology to sociology, coaches have differing motivations and experience bases. This panel will bring together coaches to debate and discuss various perspectives on agile coaching. Some of the questions to be addressed will include: What are the skills required for effective coaching? What should be the expectations for teams or individuals being coached? Should coaches be: a corporate resource (internal team of consultants working with multiple internal teams); an integral part of a specific team; or external contractors? How should coaches exercise influence and authority? How should management assess the value of a coaching engagement? Do you have what it takes to be a coach? - This panel will bring together seasoned agile coaches to offer their experience and advice on how to be the best you can be!

  10. The agile transversal filter - A flexible building block for ICNIA

    NASA Astrophysics Data System (ADS)

    Botha, D. G.; Smead, F. W.

    Integrated Communications, Navigation and Identification Avionics (ICNIA) is an advanced development program to demonstrate an integrated systems approach to the implementation of functions normally performed by a collection of independent black boxes. The system design partitions all CNI functions to optimize modular commonality within the ICNIA system. One function required in many parallel channels is the processing of signals with instantaneous bandwidths of 10 MHz or less. A specific implementation is the Narrow Band Agile Transversal Filter (NBATF), which can be implemented in state-of-the-art technology, can process signals with a variety of algorithms selectable under software control, and can be replicated within the system, as required, to perform the total set of functions. The NBATF constitutes a building block module within the ICNIA system.

  11. CT-assisted agile manufacturing

    NASA Astrophysics Data System (ADS)

    Stanley, James H.; Yancey, Robert N.

    1996-11-01

    The next century will witness at least two great revolutions in the way goods are produced. First, workers will use the medium of virtual reality in all aspects of marketing, research, development, prototyping, manufacturing, sales and service. Second, market forces will drive manufacturing towards small-lot production and just-in-time delivery. Already, we can discern the merging of these megatrends into what some are calling agile manufacturing. Under this new paradigm, parts and processes will be designed and engineered within the mind of a computer, tooled and manufactured by the offspring of today's rapid prototyping equipment, and evaluated for performance and reliability by advanced nondestructive evaluation (NDE) techniques and sophisticated computational models. Computed tomography (CT) is the premier example of an NDE method suitable for future agile manufacturing activities. It is the only modality that provides convenient access to the full suite of engineering data that users will need to avail themselves of computer- aided design, computer-aided manufacturing, and computer- aided engineering capabilities, as well as newly emerging reverse engineering, rapid prototyping and solid freeform fabrication technologies. As such, CT is assured a central, utilitarian role in future industrial operations. An overview of this exciting future for industrial CT is presented.

  12. Sculpting in cyberspace: Parallel processing the development of new software

    NASA Technical Reports Server (NTRS)

    Fisher, Rob

    1993-01-01

    Stimulating creativity in problem solving, particularly where software development is involved, is applicable to many disciplines. Metaphorical thinking keeps the problem in focus but in a different light, jarring people out of their mental ruts and sparking fresh insights. It forces the mind to stretch to find patterns between dissimilar concepts, in the hope of discovering unusual ideas in odd associations (Technology Review January 1993, p. 37). With a background in Engineering and Visual Design from MIT, I have for the past 30 years pursued a career as a sculptor of interdisciplinary monumental artworks that bridge the fields of science, engineering and art. Since 1979, I have pioneered the application of computer simulation to solve the complex problems associated with these projects. A recent project for the roof of the Carnegie Science Center in Pittsburgh made particular use of the metaphoric creativity technique described above. The problem-solving process led to the creation of hybrid software combining scientific, architectural and engineering visualization techniques. David Steich, a Doctoral Candidate in Electrical Engineering at Penn State, was commissioned to develop special software that enabled me to create innovative free-form sculpture. This paper explores the process of inventing the software through a detailed analysis of the interaction between an artist and a computer programmer.

  13. Understanding and Predicting the Process of Software Maintenance Releases

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

    One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.

  14. EXACT Software Repository v 1.1

    2007-01-07

    The EXACT Software Repository contains a variety of software packages for describing, controlling, and analyzing computer experiments. The EXACT Python framework provides the experimentalist with convenient software tools to ease and organize the entire experimental process, including the description of factors and levels, the design of experiments, the control of experimental runs, the archiving of results, and analysis of results. The FAST package provides a Framework for Agile Software Testing. FAST manages the distributed execution of EXACT, as well as summaries of test results.

  15. Project-Method Fit: Exploring Factors That Influence Agile Method Use

    ERIC Educational Resources Information Center

    Young, Diana K.

    2013-01-01

    While the productivity and quality implications of agile software development methods (SDMs) have been demonstrated, research concerning the project contexts where their use is most appropriate has yielded less definitive results. Most experts agree that agile SDMs are not suited for all project contexts. Several project and team factors have been…

  16. A review of the Technologies Enabling Agile Manufacturing program

    SciTech Connect

    Gray, W.H.; Neal, R.E.; Cobb, C.K.

    1996-10-01

    Addressing a technical plan developed in consideration with major US manufacturers, software and hardware providers, and government representatives, the Technologies Enabling Agile Manufacturing (TEAM) program is leveraging the expertise and resources of industry, universities, and federal agencies to develop, integrate, and deploy leap-ahead manufacturing technologies. One of the TEAM program's goals is to transition products from design to production faster, more efficiently, and at less cost. TEAM's technology development strategy also provides all participants with early experience in establishing and working within an electronic enterprise that includes access to high-speed networks and high-performance computing and storage systems. The TEAM program uses the cross-cutting tools it collects, develops, and integrates to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, sheet metal forming, and electro-mechanical assembly. This paper reviews the current status of the TEAM program with emphasis upon TEAM's information infrastructure.

  17. Signal processing software for ground penetrating radar, user's manual

    NASA Astrophysics Data System (ADS)

    Liem, Ronnie; Davis, Thomas J.

    1988-03-01

    This is the user's manual for the signal processing software for reducing ground penetrating radar (GPR) data. The manual provides background information and instructions for operating the computer program. The developed program is based on the synthetic aperture focusing technique. Input data to the program consist of digitized sequential GPR scans from a linear survey. The format for the input data is specified in Appendix C. The outputs of the program are two-dimensional plots of the ground profile showing the stations and depths of the objects identified by the program. Features of the program include utilities to determine the velocity of propagation of the GPR signal and the location of the ground surface, as well as semi-automatic and automatic processing of the data. The program is designed to operate on an IBM PC or compatible computer. Other hardware and supporting software requirements for operating the program are specified in Appendix B.

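    The synthetic aperture focusing technique mentioned above is, at its core, a delay-and-sum operation: for each subsurface pixel, samples from every antenna position are summed at the delays given by the two-way travel time. The sketch below illustrates that idea only; the scan geometry, wave velocity, and synthetic data are assumptions, not the manual's program.

      # Illustrative delay-and-sum SAFT sketch for a GPR B-scan.
      import numpy as np

      def saft(scans, positions, dt, velocity, xs, zs):
          """Focus GPR data: scans[i, n] is the A-scan recorded at antenna positions[i]."""
          image = np.zeros((len(zs), len(xs)))
          n_samples = scans.shape[1]
          for ix, x in enumerate(xs):
              for iz, z in enumerate(zs):
                  # Two-way travel time from each antenna position to the pixel (x, z).
                  delays = 2.0 * np.hypot(positions - x, z) / velocity
                  idx = np.round(delays / dt).astype(int)
                  valid = idx < n_samples
                  image[iz, ix] = scans[valid, idx[valid]].sum()
          return image

      if __name__ == "__main__":
          velocity, dt = 1.0e8, 1.0e-9            # assumed: 0.1 m/ns medium, 1 ns sampling
          positions = np.linspace(0.0, 2.0, 41)   # antenna stations along the survey line
          t = np.arange(512) * dt
          # Synthetic point reflector at x = 1.0 m, depth 0.8 m.
          scans = np.array([np.exp(-((t - 2 * np.hypot(p - 1.0, 0.8) / velocity) / 2e-9) ** 2)
                            for p in positions])
          xs, zs = np.linspace(0.0, 2.0, 81), np.linspace(0.1, 1.5, 71)
          img = saft(scans, positions, dt, velocity, xs, zs)
          iz, ix = np.unravel_index(np.argmax(img), img.shape)
          print("focused peak near x =", round(xs[ix], 2), "m, z =", round(zs[iz], 2), "m")
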
  18. Expertise in professional software design: a process study.

    PubMed

    Sonnentag, S

    1998-10-01

    Forty professional software designers participated in a study in which they worked on a software design task and reported strategies for accomplishing that task. High performers were identified by a peer-nomination method and performance on a design. Verbal protocol analysis based on a comparison of 12 high and 12 moderate performers indicated that high performers structured their design process by local planning and showed more feedback processing, whereas moderate performers were more engaged in analyzing requirements and verbalizing task-irrelevant cognitions. High performers more often described problem comprehension and cooperation with colleagues as useful strategies. High and moderate performers did not differ with respect to length of experience. None of the differences between the two performance groups could be explained by length of experience. PMID:9806013

  19. Photonic IC design software and process design kits

    NASA Astrophysics Data System (ADS)

    Korthorst, Twan; Stoffer, Remco; Bakker, Arjen

    2015-04-01

    This review discusses photonic IC design software tools, examines existing design flows for photonics design and how these fit different design styles and describes the activities in collaboration and standardization within the silicon photonics group from Si2 and by members of the PDAFlow Foundation to improve design flows. Moreover, it will address the lowering of access barriers to the technology by providing qualified process design kits (PDKs) and improved integration of photonic integrated circuit simulations, physical simulations, mask layout, and verification.

  20. Parallel-Processing Test Bed For Simulation Software

    NASA Technical Reports Server (NTRS)

    Blech, Richard; Cole, Gary; Townsend, Scott

    1996-01-01

    Second-generation Hypercluster computing system is multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. Built from standard, off-the-shelf hardware readily upgraded as improved technology becomes available. System used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. First-generation Hypercluster system described in "Hypercluster Parallel Processor" (LEW-15283).

  1. FTOOLS: A FITS Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Blackburn, J. K.

    FTOOLS, a highly modular collection of over 110 utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Science Archive Research Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities specific to high energy astrophysics data sets used for the ASCA, ROSAT, GRO, and XTE missions. A core set of FTOOLS providing support for generic FITS data processing, FITS image analysis and timing analysis can easily be split out of the full software package for users not needing the high energy astrophysics mission utilities. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and Fortran to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.

  2. SignalPlant: an open signal processing software platform.

    PubMed

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proves significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats. PMID:27243208

  3. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    NASA Technical Reports Server (NTRS)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

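    A common way to apply SPC to inspection data of the kind discussed above is an individuals (XmR) control chart over per-inspection defect densities, flagging points outside three-sigma limits. The sketch below shows only that generic technique with invented sample data; it is not the collaboration's actual analysis.

      # Hedged SPC sketch: XmR control limits over per-inspection defect densities.
      import numpy as np

      def xmr_limits(values):
          """Return (center, lower, upper) control limits for an individuals chart."""
          values = np.asarray(values, dtype=float)
          moving_range = np.abs(np.diff(values))
          sigma_hat = moving_range.mean() / 1.128      # d2 constant for subgroups of size 2
          center = values.mean()
          return center, center - 3 * sigma_hat, center + 3 * sigma_hat

      if __name__ == "__main__":
          # Defects found per thousand source lines for a series of inspections (invented data).
          densities = [4.2, 3.8, 5.1, 4.6, 4.0, 9.5, 4.4, 3.9, 4.8, 4.3]
          center, lcl, ucl = xmr_limits(densities)
          print(f"center {center:.2f}, limits [{lcl:.2f}, {ucl:.2f}]")
          for i, d in enumerate(densities, 1):
              if d < lcl or d > ucl:
                  print(f"inspection {i} out of control: {d}")
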
  4. A Process Asset Library to Support Software Process Improvement in Small Settings

    NASA Astrophysics Data System (ADS)

    Calvo-Manzano, Jose A.; Cuevas, Gonzalo; San Feliu, Tomás; Serrano, Ariel

    A main factor in the success of any organizational process improvement effort is the implementation of a Process Asset Library that provides a central database accessible by anyone at the organization. This repository includes any process support materials that help process deployment. Those materials are composed of the organization's standard software process, software process related documentation, descriptions of the software life cycles, guidelines, examples, templates, and any artefacts that the organization considers useful to support process improvement. This paper describes the structure and contents of the Web-based Process Asset Library for small businesses and small groups within large organizations. This library is structured using CMMI as the reference model in order to implement the Process Areas described by that model.

  5. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Not only the appropriate establishment but also the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  6. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    ERIC Educational Resources Information Center

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  7. Agile Walking Robot

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.; Waldron, Kenneth J.

    1990-01-01

    Proposed agile walking robot operates over rocky, sandy, and sloping terrain. Offers stability and climbing ability superior to those of other conceptual mobile robots. Equipped with six articulated legs like those of an insect, it continually feels the ground under a leg before applying weight to it. If a leg senses an unexpected object or fails to make contact with the ground at the expected point, it seeks an alternative position within a radius of 20 cm. Failing that, the robot halts, examines the area around the foot in detail with a laser ranging imager, and replans the entire cycle of steps for all legs before proceeding.
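
    A minimal sketch of the foot-placement logic described above, assuming hypothetical probe and search helpers: the leg probes the ground before weight is applied, searches for an alternative foothold within a 20 cm radius when the probe fails, and signals for a full replanning step when no foothold is found. This is an illustration only, not the robot's actual control code.

```python
# Illustrative sketch (not the original robot code) of the foot-placement
# logic described above: probe the ground before loading a leg, search for
# an alternative foothold within a 20 cm radius, and fall back to a full
# replanning step if none is found. All names and helpers are hypothetical.
import math
import random

SEARCH_RADIUS_CM = 20.0

def probe(x, y):
    """Hypothetical ground probe: True if a solid, expected contact is made."""
    return random.random() > 0.3  # stand-in for the real leg/force sensor

def find_alternative_foothold(x, y, tries=12):
    """Search candidate footholds within the allowed radius around (x, y)."""
    for _ in range(tries):
        angle = random.uniform(0.0, 2.0 * math.pi)
        r = random.uniform(0.0, SEARCH_RADIUS_CM)
        cx, cy = x + r * math.cos(angle), y + r * math.sin(angle)
        if probe(cx, cy):
            return cx, cy
    return None

def place_foot(x, y):
    """Return a usable foothold, or None to signal that the gait must be replanned."""
    if probe(x, y):
        return x, y
    return find_alternative_foothold(x, y)

# Usage: if place_foot() returns None, the robot halts, scans the area
# (e.g. with a laser ranging imager) and replans the whole step cycle.
foothold = place_foot(10.0, 5.0)
if foothold is None:
    print("No foothold found: halt and replan all legs")
else:
    print("Loading leg at", foothold)
```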

  8. Frequency agile relativistic magnetrons

    SciTech Connect

    Levine, J.S.; Harteneck, B.D.; Price, H.D.

    1995-11-01

    The authors are developing a family of frequency agile relativistic magnetrons to continuously cover the bands from 1 to 3 GHz. They have achieved tuning ranges of > 33%. The magnetrons have been operated repetitively in burst mode at rates up to 100 pps for 10 sec. Power is extracted from two resonators, and is in the range of 400--600 MW, fairly flat across the tuning bandwidth. They are using a network of phase shifters and 3-dB hybrids to combine the power into a single arm and to provide a continuously adjustable attenuator.

  9. Agile Infrastructure Monitoring

    NASA Astrophysics Data System (ADS)

    Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.

    2014-06-01

    At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.

  10. EOS MLS Level 1B Data Processing Software. Version 3

    NASA Technical Reports Server (NTRS)

    Perun, Vincent S.; Jarnot, Robert F.; Wagner, Paul A.; Cofield, Richard E., IV; Nguyen, Honghanh T.; Vuu, Christina

    2011-01-01

    This software is an improvement on Version 2, which was described in "EOS MLS Level 1B Data Processing, Version 2.2," NASA Tech Briefs, Vol. 33, No. 5 (May 2009), p. 34. It accepts the EOS MLS Level 0 science/engineering data and the EOS Aura spacecraft ephemeris/attitude data, and produces calibrated instrument radiances and associated engineering and diagnostic data. This version makes the code more robust, improves calibration, provides more diagnostic outputs, defines the Galactic core more finely, and fixes the equator crossing. The Level 1 processing software manages several different tasks. It qualifies each data quantity using instrument configuration and checksum data, as well as data transmission quality flags. Statistical tests are applied for data quality and reasonableness. The instrument engineering data (e.g., voltages, currents, temperatures, and encoder angles) are calibrated by the software, and the filter channel space reference measurements are interpolated onto the times of each limb measurement, with the interpolated values being differenced from the measurements. Filter channel calibration target measurements are interpolated onto the times of each limb measurement and are used to compute radiometric gain. The total signal power is determined and analyzed by each digital autocorrelator spectrometer (DACS) during each data integration. The software converts each DACS data integration from an autocorrelation measurement in the time domain into a spectral measurement in the frequency domain, and estimates separately the spectrally smoothly varying and spectrally averaged components of the limb port signal arising from antenna emission and scattering effects. Limb radiances are also calibrated.
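
    The reference-interpolation and gain step described above can be illustrated with a simple two-point radiometric calibration sketch. The variable names, the assumed target and space temperatures, and the linear counts-to-temperature model are simplifications introduced here for illustration; they are not taken from the flight software.

```python
# A minimal, illustrative two-point calibration in the spirit of the step
# described above: space-reference and calibration-target readings are
# interpolated onto the limb-measurement times, the reference is differenced
# from the limb signal, and a gain is derived from the reference/target pair.
# All temperatures and the linear model are assumptions for this sketch.
import numpy as np

def calibrate_limb_counts(t_limb, c_limb, t_space, c_space, t_target, c_target,
                          T_target=300.0, T_space=2.7):
    """Convert raw limb counts to brightness-temperature-like values."""
    # Interpolate the two reference measurements onto the limb times.
    space_at_limb = np.interp(t_limb, t_space, c_space)
    target_at_limb = np.interp(t_limb, t_target, c_target)

    # Radiometric gain: counts per kelvin between the two known references.
    gain = (target_at_limb - space_at_limb) / (T_target - T_space)

    # Difference the limb signal from the space reference and scale by the gain.
    return T_space + (c_limb - space_at_limb) / gain

# Example with synthetic data
t_limb = np.linspace(0.0, 10.0, 50)
c_limb = 1200.0 + 30.0 * np.sin(t_limb)
t_ref = np.linspace(0.0, 10.0, 6)
result = calibrate_limb_counts(t_limb, c_limb,
                               t_ref, np.full(6, 1000.0),   # space reference counts
                               t_ref, np.full(6, 2000.0))   # calibration target counts
print(result[:5])
```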

  11. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, Stephan P.

    1998-01-01

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy.

  12. Frequency agile optical parametric oscillator

    DOEpatents

    Velsko, S.P.

    1998-11-24

    The frequency agile OPO device converts a fixed wavelength pump laser beam to arbitrary wavelengths within a specified range with pulse to pulse agility, at a rate limited only by the repetition rate of the pump laser. Uses of this invention include Laser radar, LIDAR, active remote sensing of effluents/pollutants, environmental monitoring, antisensor lasers, and spectroscopy. 14 figs.

  13. Measuring the software process and product: Lessons learned in the SEL

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1985-01-01

    The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
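
    As an illustration of the goal-driven (goal/question/metric) paradigm mentioned above, a measurement plan can be organized as a small hierarchy of goals, refining questions, and concrete metrics. The goal, questions, and metrics below are invented for the example and are not SEL's actual instruments.

```python
# A minimal illustration (not SEL's actual measurement plan) of how a
# goal/question/metric hierarchy can be structured: each goal is refined
# into questions, and each question into the metrics collected to answer it.
gqm_plan = {
    "goal": "Improve the reliability of delivered software",
    "questions": [
        {
            "question": "Where are defects introduced in the life cycle?",
            "metrics": ["defects found per phase", "phase of defect origin"],
        },
        {
            "question": "How effective are inspections at removing defects?",
            "metrics": ["defects found per inspection hour", "escape rate to test"],
        },
    ],
}

# Walk the hierarchy: analysis under the paradigm starts from the goal and
# works down to the data that must be collected.
for q in gqm_plan["questions"]:
    print(q["question"], "->", ", ".join(q["metrics"]))
```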

  14. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  15. NPOESS Interface Data Processing Segment Architecture and Software

    NASA Astrophysics Data System (ADS)

    Turek, S.; Souza, K. G.; Fox, C. A.; Grant, K. D.

    2004-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system: the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS is an estimated $6.5 billion program replacing the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS). The IDPS processes NPOESS satellite data to provide weather, oceanographic, and environmental data products to NOAA and DoD processing centers and field terminals operated by the United States government. This paper describes Raytheon's high performance computer and software architecture for the NPOESS IDPS. NOAA, the DoD, and NASA selected this architecture after a 2.5-year Program Definition and Risk Reduction (PDRR) competition. The PDRR phase concluded in August of 2002, and has been followed by the NPOESS Preparatory Project (NPP) phase. The NPP satellite, scheduled to launch in late 2006, will provide risk reduction for the future NPOESS satellites, and will enable data continuity between the current EOS missions and NPOESS. Efforts within the PDRR and NPP phases consist of: requirements definition and flowdown from system to segment to subsystem, Object-Oriented (OO) software design, software code development, science to operational code conversion, integration and qualification testing. The NPOESS phase, which supports a constellation of three satellites, will also consist of this same lifecycle during the 2005 through 2009 timeframe, with operations and support

  16. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are the combination of some visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  17. "HIP" new software : the Hydroecological Integrity Assessment Process

    USGS Publications Warehouse

    Henriksen, Jim; Wilson, Juliette T.

    2006-01-01

    Researchers at the U.S. Geological Survey Fort Collins Science Center (FORT) have developed the Hydroecological Integrity Assessment Process (HIP) and a suite of software tools for conducting a hydrologic classification of streams, addressing instream flow needs, and assessing past and proposed hydrologic alterations on streamflow and other ecosystem components. The HIP recognizes that streamflow is strongly related to many critical physiochemical components of rivers, such as dissolved oxygen, channel geomorphology, and habitats. Streamflow is considered a “master variable” that limits the distribution, abundance, and diversity of many aquatic plant and animal species.

  18. Enhanced Software for Scheduling Space-Shuttle Processing

    NASA Technical Reports Server (NTRS)

    Barretta, Joseph A.; Johnson, Earl P.; Bierman, Rocky R.; Blanco, Juan; Boaz, Kathleen; Stotz, Lisa A.; Clark, Michael; Lebovitz, George; Lotti, Kenneth J.; Moody, James M.; Nguyen, Tony K.; Peterson, Kenneth A.; Sargent, Susan; Shaw, Karma; Stoner, Mack D.; Stowell, Deborah S.; Young, Daniel A.; Tulley, James H., Jr.

    2004-01-01

    The Ground Processing Scheduling System (GPSS) computer program is used to develop streamlined schedules for the inspection, repair, and refurbishment of space shuttles at Kennedy Space Center. A scheduling computer program is needed because space-shuttle processing is complex and it is frequently necessary to modify schedules to accommodate unanticipated events, unavailability of specialized personnel, unexpected delays, and the need to repair newly discovered defects. GPSS implements constraint-based scheduling algorithms and provides an interactive scheduling software environment. In response to inputs, GPSS can respond with schedules that are optimized in the sense that they contain minimal violations of constraints while supporting the most effective and efficient utilization of space-shuttle ground processing resources. The present version of GPSS is a product of re-engineering of a prototype version. While the prototype version proved to be valuable and versatile as a scheduling software tool during the first five years, it was characterized by design and algorithmic deficiencies that affected schedule revisions, query capability, task movement, report capability, and overall interface complexity. In addition, the lack of documentation gave rise to difficulties in maintenance and limited both enhanceability and portability. The goal of the GPSS re-engineering project was to upgrade the prototype into a flexible system that supports multiple-flow, multiple-site scheduling and that retains the strengths of the prototype while incorporating improvements in maintainability, enhanceability, and portability.

  19. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.
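
    The slice-per-CPU strategy described above can be sketched with a standard worker pool: the output mosaic is split into horizontal slices, each slice is computed by a separate process, and the results are gathered into the final array. The mosaic shape, slice count, and the trivial per-pixel computation are placeholders for the real camera-model warping.

```python
# Illustrative sketch (not the original flight-ops code) of dividing a mosaic
# into slices, filling each slice in a separate process, and gathering the
# results. The per-pixel "warp" is a trivial stand-in for the real camera-model
# and correlation lookup.
import numpy as np
from multiprocessing import Pool

MOSAIC_SHAPE = (400, 800)   # rows, cols of the output panorama (assumed)
N_SLICES = 4

def fill_slice(args):
    """Compute one horizontal slice of the mosaic (placeholder warping)."""
    row_start, row_stop = args
    rows = np.arange(row_start, row_stop)[:, None]
    cols = np.arange(MOSAIC_SHAPE[1])[None, :]
    # Stand-in for: find the best source pixel(s) for each mosaic pixel.
    return row_start, (rows + cols) % 255

def build_mosaic():
    bounds = np.linspace(0, MOSAIC_SHAPE[0], N_SLICES + 1, dtype=int)
    tasks = list(zip(bounds[:-1], bounds[1:]))
    mosaic = np.zeros(MOSAIC_SHAPE, dtype=np.uint8)
    with Pool(processes=N_SLICES) as pool:
        for row_start, block in pool.map(fill_slice, tasks):
            mosaic[row_start:row_start + block.shape[0], :] = block
    return mosaic

if __name__ == "__main__":
    print(build_mosaic().shape)
```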

  20. Process Simulation of Gas Metal Arc Welding Software

    2005-09-06

    ARCWELDER is a Windows-based application that simulates gas metal arc welding (GMAW) of steel and aluminum. The software simulates the welding process in an accurate and efficient manner, provides menu items for process parameter selection, and includes a graphical user interface with the option to animate the process. The user enters the base and electrode material, open circuit voltage, wire diameter, wire feed speed, welding speed, and standoff distance. The program computes the size and shape of a square-groove or V-groove weld in the flat position. The program also computes the current, arc voltage, arc length, electrode extension, transfer of droplets, heat input, filler metal deposition, base metal dilution, and centerline cooling rate, in English or SI units. The simulation may be used to select welding parameters that lead to desired operation conditions.
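
    One of the derived quantities listed above, heat input, can be illustrated with the conventional arc-welding relation heat input [kJ/mm] = efficiency × voltage × current × 60 / (1000 × travel speed [mm/min]). The relation and the assumed 0.8 arc efficiency are textbook values used for illustration, not parameters taken from the ARCWELDER software.

```python
# A small, illustrative calculation (not the ARCWELDER code) of arc heat input
# per unit weld length from voltage, current, and travel speed. The 0.8 arc
# efficiency is a typical textbook value for GMAW, assumed here for the example.

def heat_input_kj_per_mm(arc_voltage_v, current_a, travel_speed_mm_min, efficiency=0.8):
    """Heat input per unit weld length in kJ/mm."""
    return efficiency * arc_voltage_v * current_a * 60.0 / (1000.0 * travel_speed_mm_min)

# Example: 24 V arc voltage, 200 A current, 400 mm/min travel speed
print(round(heat_input_kj_per_mm(24.0, 200.0, 400.0), 3), "kJ/mm")
```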

  1. Process Orchestration With Modular Software Applications On Intelligent Field Devices

    NASA Astrophysics Data System (ADS)

    Orfgen, Marius; Schmitt, Mathias

    2015-07-01

    The method developed by the DFKI-IFS for extending the functionality of intelligent field devices through the use of reloadable software applications (so-called Apps) is to be further augmented with a methodology and communication concept for process orchestration. The concept allows individual Apps from different manufacturers to decentrally share information. This way of communicating forms the basis for the dynamic orchestration of Apps to complete processes, in that it allows the actions of one App (e.g. detecting a component part with a sensor App) to trigger reactions in other Apps (e.g. triggering the processing of that component part). A holistic methodology and its implementation as a configuration tool allows one to model the information flow between Apps, as well as automatically introduce it into physical production hardware via available interfaces provided by the Field Device Middleware. Consequently, configuring industrial facilities is made simpler, resulting in shorter changeover and shutdown times.
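
    The idea that one App's action can trigger reactions in other Apps can be sketched with a generic publish/subscribe event bus. The topic names, handlers, and in-memory bus below are illustrative placeholders and do not represent the Field Device Middleware interfaces.

```python
# A minimal publish/subscribe sketch of the orchestration idea described above:
# one App publishes an event (e.g. "part detected") and other Apps that have
# subscribed to that topic react to it. All names are hypothetical.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()

# A "processing" App reacts whenever a "sensor" App reports a detected part.
bus.subscribe("part.detected", lambda part: print(f"Processing App: machining {part}"))

# The sensor App publishes its detection; the bus routes it to all reactions.
bus.publish("part.detected", {"id": 42, "type": "bracket"})
```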

  2. Agile manufacturing: The factory of the future

    NASA Technical Reports Server (NTRS)

    Loibl, Joseph M.; Bossieux, Terry A.

    1994-01-01

    The factory of the future will require an operating methodology which effectively utilizes all of the elements of product design, manufacturing and delivery. The process must respond rapidly to changes in product demand, product mix, design changes or changes in the raw materials. To achieve agility in a manufacturing operation, the design and development of the manufacturing processes must focus on customer satisfaction. Achieving the greatest results requires that the manufacturing process be considered from product concept through sales. This provides the best opportunity to build a quality product for the customer at a reasonable rate. The primary elements of a manufacturing system include people, equipment, materials, methods and the environment. The most significant and most agile element in any process is the human resource. Only with a highly trained, knowledgeable work force can the proper methods be applied to efficiently process materials with machinery which is predictable, reliable and flexible. This paper discusses the effect of each element on the development of agile manufacturing systems.

  3. Organizational Agility and Complex Enterprise System Innovations: A Mixed Methods Study of the Effects of Enterprise Systems on Organizational Agility

    ERIC Educational Resources Information Center

    Kharabe, Amol T.

    2012-01-01

    Over the last two decades, firms have operated in "increasingly" accelerated "high-velocity" dynamic markets, which require them to become "agile." During the same time frame, firms have increasingly deployed complex enterprise systems--large-scale packaged software "innovations" that integrate and automate…

  4. Software and Algorithms for Biomedical Image Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Lambert, James; Lam, Raymond

    2004-01-01

    A new software package equipped with novel image processing algorithms and graphical-user-interface (GUI) tools has been designed for automated analysis and processing of large amounts of biomedical image data. The software, called PlaqTrak, has been specifically used for analysis of plaque on teeth of patients. New algorithms have been developed and implemented to segment teeth of interest from surrounding gum, and a real-time image-based morphing procedure is used to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The PlaqTrak system integrates these components into a single software suite with an easy-to-use GUI (see Figure 1) that allows users to do an end-to-end run of a patient's record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image. The automated and accurate processing of the captured images to segment each tooth [see Figure 2(a)] and then detect plaque on a tooth-by-tooth basis is a critical component of the PlaqTrak system to do clinical trials and analysis with minimal human intervention. These features offer distinct advantages over other competing systems that analyze groups of teeth or synthetic teeth. PlaqTrak divides each segmented tooth into eight regions using an advanced graphics morphing procedure [see results on a chipped tooth in Figure 2(b)], and a pattern recognition classifier is then used to locate plaque [red regions in Figure 2(d)] and enamel regions. The morphing allows analysis within regions of teeth, thereby facilitating detailed statistical analysis such as the amount of plaque present on the biting surfaces on teeth. This software system is applicable to a host of biomedical applications, such as cell analysis and life detection, or robotic applications, such

  5. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ...The U.S. Nuclear Regulatory Commission (NRC or the Commission) is issuing for public comment draft regulatory guide (DG), DG-1210, ``Developing Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1210 is proposed Revision 1 of RG 1.173, dated September 1997. This revision endorses, with clarifications, the enhanced consensus......

  6. FTOOLS: A FITS Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Blackburn, J. Kent; Greene, Emily A.; Pence, William

    1993-05-01

    FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. Each utility performs a single simple task such as presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual utilities can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. The collection of utilities provides both generic processing and analysis utilities and utilities common to high energy astrophysics data sets. The FTOOLS software package is designed to be both compatible with IRAF and completely stand alone in a UNIX or VMS environment. The user interface is controlled by standard IRAF parameter files. The package is self documenting through the IRAF help facility and a stand alone help task. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.

  7. Aircraft agility maneuvers

    NASA Technical Reports Server (NTRS)

    Cliff, Eugene M.; Thompson, Brian G.

    1992-01-01

    A new dynamic model for aircraft motions is presented. This model can be viewed as intermediate between a point-mass model, in which the body attitude angles are control-like, and a rigid-body model, in which the body-attitude angles evolve according to Newton's Laws. Specifically, consideration is given to the case of symmetric flight, and a model is constructed in which the body roll-rate and the body pitch-rate are the controls. In terms of this body-rate model a minimum-time heading change maneuver is formulated. When the bounds on the body-rates are large the results are similar to the point-mass model in that the model can very quickly change the applied forces and produce an acceleration to turn the vehicle. With finite bounds on these rates, the forces change in a smooth way. This leads to a measurable effect of agility.

  8. Agile manufacturing concept

    NASA Astrophysics Data System (ADS)

    Goldman, Steven L.

    1994-03-01

    The initial conceptualization of agile manufacturing was the result of a 1991 study -- chaired by Lehigh Professor Roger N. Nagel and California-based entrepreneur Rick Dove, President of Paradigm Shifts, International -- of what it would take for U.S. industry to regain global manufacturing competitiveness by the early twenty-first century. This industry-led study, reviewed by senior management at over 100 companies before its release, concluded that incremental improvement of the current system of manufacturing would not be enough to be competitive in today's global marketplace. Computer-based information and production technologies that were becoming available to industry opened up the possibility of an altogether new system of manufacturing, one that would be characterized by a distinctive integration of people and technologies; of management and labor; of customers, producers, suppliers, and society.

  9. Decreasing costs of ground data processing system development using a software product line

    NASA Technical Reports Server (NTRS)

    Chaffin, Brian

    2005-01-01

    In this paper, I describe software product lines and why a Ground Data Processing System should use one. I also describe how to develop a software product line, using examples from an imaginary Ground Data Processing System.

  10. Compact, Automated, Frequency-Agile Microspectrofluorimeter

    NASA Technical Reports Server (NTRS)

    Fernandez, Salvador M.; Guignon, Ernest F.

    1995-01-01

    Compact, reliable, rugged, automated cell-culture and frequency-agile microspectrofluorimetric apparatus developed to perform experiments involving photometric imaging observations of single live cells. In original application, apparatus operates mostly unattended aboard spacecraft; potential terrestrial applications include automated or semiautomated diagnosis of pathological tissues in clinical laboratories, biomedical instrumentation, monitoring of biological process streams, and portable instrumentation for testing biological conditions in various environments. Offers obvious advantages over present laboratory instrumentation.

  11. A Case Study of Measuring Process Risk for Early Insights into Software Safety

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.

    2011-01-01

    In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of 154 hazardous conditions could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.

  12. Delivering Software Process-Specific Project Courses in Tertiary Education Environment: Challenges and Solution

    ERIC Educational Resources Information Center

    Rong, Guoping; Shao, Dong

    2012-01-01

    The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…

  13. Elements of an Art - Agile Coaching

    NASA Astrophysics Data System (ADS)

    Lundh, Erik

    This tutorial gives you a lead on becoming or redefining yourself as an Agile Coach. Introduction to elements and dimensions of state-of-the-art Agile Coaching. How to position the agile coach to be effective in a larger setting. Making the agile transition - from a single team to thousands of people. How to support multiple teams as a coach. How to build a coaches network in your company. Challenges when the agile coach is a consultant and the organization is large.

  14. Process of videotape making: presentation design, software, and hardware

    NASA Astrophysics Data System (ADS)

    Dickinson, Robert R.; Brady, Dan R.; Bennison, Tim; Burns, Thomas; Pines, Sheldon

    1991-06-01

    The use of technical videotape presentations for communicating abstractions of complex data is now becoming commonplace. While the use of videotapes in the day-to-day work of scientists and engineers is still in its infancy, their use at applications-oriented conferences is now growing rapidly. Despite these advancements, there is still very little written down about the process of making technical videotapes. For printed media, different presentation styles are well known for categories such as results reports, executive summary reports, and technical papers and articles. In this paper, the authors present ideas on the topic of technical videotape presentation design in a format that is worth referring to. They have started to document the ways in which the experience of media specialists, teaching professionals, and character animators can be applied to scientific animation. Software and hardware considerations are also discussed. For this portion, distinctions are drawn between the software and hardware required for computer animation (frame-at-a-time) productions and live recorded interaction with a computer graphics display.

  15. Achieving dependability throughout the development process - A distributed software experiment

    NASA Technical Reports Server (NTRS)

    Kelly, John P. J.; Murphy, Susan C.

    1990-01-01

    Distributed software engineering techniques and methods for improving the specification and testing phases are considered. With multiversion development, multiple implementations allow the use of an automated approach to testing called back-to-back (B/B) testing in which the outputs are compared to detect any discrepancies. However, a specification defect may lead to similar errors in the multiple versions and the underlying fault may not be detected with a B/B testing approach. The use of diverse formal specifications has been proposed as a solution to this problem, since defects in independently written specifications are likely to be different. To examine these issues, an experiment was performed using the design diversity approach in the specification, design, implementation, and testing of distributed software. In the experiment, three diverse formal specifications were used to produce multiple independent implementations of a distributed communication protocol in Ada. The problems encountered in building complex concurrent processing systems in Ada were also studied. Many pitfalls were discovered in mapping the formal specifications into Ada implementations.
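
    Back-to-back (B/B) testing as described above can be sketched by running several independently developed versions on the same inputs and flagging any disagreement among their outputs. The three toy versions below (one with a seeded fault) are placeholders for real multiversion implementations.

```python
# A minimal sketch of back-to-back (B/B) testing: multiple independently
# developed versions are executed on the same inputs and any discrepancy
# among their outputs is reported for investigation. The "versions" here are
# toy stand-ins, and one contains a seeded fault to show a discrepancy.

def version_a(x):
    return x * x

def version_b(x):
    return x ** 2

def version_c(x):
    return x * x if x != 3 else 10   # seeded fault

def back_to_back(versions, inputs):
    discrepancies = []
    for x in inputs:
        outputs = [v(x) for v in versions]
        if len(set(outputs)) > 1:
            discrepancies.append((x, outputs))
    return discrepancies

print(back_to_back([version_a, version_b, version_c], range(6)))
# -> [(3, [9, 9, 10])]
# Note: as the abstract points out, a common specification defect could still
# produce identical wrong outputs in all versions and escape B/B testing.
```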

  16. Digital processing of side-scan sonar data with the Woods Hole image processing system software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high- and low-resolution side-scan sonar data. Recent development of a UNIX-based image-processing software system includes a series of task-specific programs for processing side-scan sonar data. This report describes the steps required to process the collected data and to produce an image that has equal along- and across-track resolution.

  17. Requirement Changes and Project Success: The Moderating Effects of Agile Approaches in System Engineering Projects

    NASA Astrophysics Data System (ADS)

    Maierhofer, Sabine; Stelzmann, Ernst; Kohlbacher, Markus; Fellner, Björn

    This paper reports the findings of an empirical study on the influence agile development methods exert on the success of projects. The goal is to determine whether agile methods are able to mitigate the negative effects requirement changes have on the performance of Systems Engineering projects, i.e. projects where systems consisting of hardware and software are developed. Agile methods have been proven to successfully support development projects in the field of traditional software engineering, but with an ever-expanding market of integrated systems manufacturers, their usability for those complex projects has yet to be examined. This study focuses on 16 specific agile practices and their ability to improve the success of complex hardware and software projects.

  18. Special Software for Planetary Image Processing and Research

    NASA Astrophysics Data System (ADS)

    Zubarev, A. E.; Nadezhdina, I. E.; Kozlova, N. A.; Brusnikin, E. S.; Karachevtseva, I. P.

    2016-06-01

    Special modules for the photogrammetric processing of remote sensing data were developed that make it possible to effectively organize and optimize planetary studies. The commercial software package PHOTOMOD™ is used as the basic application. Special modules were created to perform various types of data processing: calculation of preliminary navigation parameters, calculation of shape parameters of a celestial body, global view image orthorectification, and estimation of Sun illumination and Earth visibility from the planetary surface. For photogrammetric processing, different types of data have been used, including images of the Moon, Mars, Mercury, Phobos, the Galilean satellites and Enceladus obtained by frame or push-broom cameras. We used modern planetary data and images taken over the years from orbital flight paths with various illumination and resolution, as well as images obtained by planetary rovers from the surface. Planetary image data processing is a complex task, and it usually takes from a few months to years. We present our efficient pipeline procedure that makes it possible to obtain different data products and supports the long path from planetary images to celestial body maps. The obtained data - new three-dimensional control point networks, elevation models, orthomosaics - enabled the production of accurate maps: a new Phobos atlas (Karachevtseva et al., 2015) and various thematic maps derived from studies of the planetary surface (Karachevtseva et al., 2016a).

  19. Educational software for illustrating gas-exchange processes in plants

    SciTech Connect

    Wullschleger, S.D.; Hanson, P.J. ); Sage, R.F. )

    1991-05-01

    Simulation models are increasingly being used to describe physiological processes in the plant sciences. These models, while useful for research purposes, also offer tremendous potential for demonstrating a wide array of scientific topics to students. The authors have developed an educational software package, based on currently accepted principles, that illustrates the environmental and biochemical control of plant gas-exchange. Graphic and tabular presentations, coupled with on-screen requests for student input, serve to effectively convey the basic fundamentals of photosynthesis and transpiration, as well as the diurnal patterns of plant gas-exchange in response to fluctuating environmental conditions. More advanced topics focus on the biochemical limitations to photosynthesis imposed by Rubisco activity, electron transport capacity, and the regeneration of inorganic phosphorus. Also included is an exercise that challenges students to call upon the lessons learned in order to optimize carbon assimilation, while minimizing water losses, over a 72-h simulation period.

  20. Agile Data Management with the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Duggan, B.; Aulenbach, S.; Tilmes, C.; Goldstein, J.

    2013-12-01

    We describe experiences applying agile software development techniques to the realm of data management during the development of the Global Change Information System (GCIS), a web service and API for authoritative global change information under development by the US Global Change Research Program. Some of the challenges during system design and implementation have been : (1) balancing the need for a rigorous mechanism for ensuring information quality with the realities of large data sets whose contents are often in flux, (2) utilizing existing data to inform decisions about the scope and nature of new data, and (3) continuously incorporating new knowledge and concepts into a relational data model. The workflow for managing the content of the system has much in common with the development of the system itself. We examine various aspects of agile software development and discuss whether or how we have been able to use them for data curation as well as software development.

  1. The LUCIFER control software

    NASA Astrophysics Data System (ADS)

    Jütte, Marcus; Knierim, Volker; Polsterer, Kai; Lehmitz, Michael; Storz, Clemens; Seifert, Walter; Ageorges, Nancy

    2010-07-01

    The successful roll-out of the control software for a complex NIR imager/spectrograph with MOS calls for flexible development strategies due to changing requirements during different phases of the project. A waterfall strategy used in the beginning has to change to a more iterative and agile process in the later stages. The choice of an appropriate programming language as well as a suitable software layout is crucial. For example, the software has to accommodate the multiple demands of different user groups, including a high level of flexibility for later changes and extensions. Different access levels to the instrument are mandatory to afford direct control mechanisms for lab operations and inspections of the instrument as well as tools to accomplish efficient science observations. Our hierarchical software structure with four layers of increasing abstraction and the use of an object-oriented language ideally support these requirements. Here we describe our software architecture, the software development process, the different access levels and our commissioning experiences with LUCIFER 1.

  2. Data analysis software for the autoradiographic enhancement process. Volumes 1, 2, and 3, and appendix

    NASA Technical Reports Server (NTRS)

    Singh, S. P.

    1979-01-01

    The computer software developed to set up a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and program listings for a modification of the data analysis software are also included.
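
    The abstract does not reproduce the algorithm itself, but a generic one-dimensional Wiener (noise power) spectrum estimate conveys the idea: remove the mean from each scanned density trace, take the squared magnitude of its Fourier transform scaled by the sampling interval, and average over traces. The sketch below is a textbook-style illustration, not the original 1979 code.

```python
# A generic, illustrative Wiener (noise power) spectrum estimate from scanned
# film-density traces. The scaling convention and synthetic data are assumptions
# made for this sketch only.
import numpy as np

def wiener_spectrum(traces, dx):
    """traces: (m, n) array of density samples; dx: sample spacing (e.g. mm)."""
    traces = np.asarray(traces, dtype=float)
    n = traces.shape[1]
    fluct = traces - traces.mean(axis=1, keepdims=True)   # remove mean density
    spectra = (dx / n) * np.abs(np.fft.rfft(fluct, axis=1)) ** 2
    freqs = np.fft.rfftfreq(n, d=dx)                       # cycles per unit length
    return freqs, spectra.mean(axis=0)                     # average over traces

rng = np.random.default_rng(1)
freqs, ws = wiener_spectrum(rng.normal(1.0, 0.05, size=(64, 256)), dx=0.01)
print(freqs[:3], ws[:3])
```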

  3. Agile machining and inspection thrust area team-on-machine probing / compatibility assessment of Parametric Technology Corporation (PTC) pro/CMM DMIS with Zeiss DMISEngine.

    SciTech Connect

    Wade, James Rokwel; Tomlinson, Kurt; Bryce, Edwin Anthony

    2008-09-01

    The charter goal of the Agile Machining and Inspection Thrust Area Team is to identify technical requirements, within the nuclear weapons complex (NWC), for Agile Machining and Inspection capabilities. During FY 2008, the team identified Parametric Technology Corporation (PTC) Pro/CMM as a software tool for use in off-line programming of probing routines--used for measurement--for machining and turning centers. The probing routine would be used for in-process verification of part geometry. The same Pro/CMM program used on the machine tool could also be employed for program validation / part verification using a coordinate measuring machine (CMM). Funding was provided to determine the compatibility of the Pro/CMM probing program with CMM software (Zeiss DMISEngine).

  4. Certification Processes for Safety-Critical and Mission-Critical Aerospace Software

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2003-01-01

    This document is a quick reference guide with an overview of the processes required to certify safety-critical and mission-critical flight software at selected NASA centers and the FAA. Researchers and software developers can use this guide to jumpstart their understanding of how to get new or enhanced software onboard an aircraft or spacecraft. The introduction contains aerospace industry definitions of safety and safety-critical software, as well as the current rationale for certification of safety-critical software. The Standards for Safety-Critical Aerospace Software section lists and describes current standards including NASA standards and RTCA DO-178B. The Mission-Critical versus Safety-Critical Software section explains the difference between two important classes of software: safety-critical software involving the potential for loss of life due to software failure and mission-critical software involving the potential for aborting a mission due to software failure. The DO-178B Safety-Critical Certification Requirements section describes the special processes and methods required to obtain a safety-critical certification for aerospace software flying on vehicles under the auspices of the FAA. The final two sections give an overview of the certification process used at Dryden Flight Research Center and the approval process at the Jet Propulsion Lab (JPL).

  5. A 3-Dimensional display and process software for THz spectrum

    NASA Astrophysics Data System (ADS)

    Jiang, Xiaowen; Zhang, Zhaohui; Zhao, Xiaoyan; Yin, Yixin; Ajito, Katsuhiro; Song, Hojin

    2011-02-01

    This underpinning software package is devoted to THz spectrum analysis and 3-D imaging. The paper describes the software's outline, structure, functions and some design considerations. Users on a LAN (local area network) can access it and perform basic and advanced operations such as file handling, echo cutting, spectrum calculation, baseline cancelling, peak fitting, and qualitative and quantitative measurement of solid-state samples.

  6. Software Package for Preparing and Processing of an Astronomical Observation

    NASA Astrophysics Data System (ADS)

    Vaduvescu, Ovidiu; Birlan, Mirel

    This paper presents an astronomical software package which draws celestial charts. It was conceived taking into account the technical possibilities available to Romanian astronomers and the current trends in observational astronomy. The software package, now in its third version, reduces the time needed to prepare an observation and produces accurate charts for searching and identification.

  7. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  8. NEIGHBOUR-IN: Image processing software for spatial analysis of animal grouping

    PubMed Central

    Caubet, Yves; Richard, Freddie-Jeanne

    2015-01-01

    Animal grouping is a very complex process that occurs in many species, involving many individuals under the influence of different mechanisms. To investigate this process, we have created image-processing software, called NEIGHBOUR-IN, designed to analyse individuals' coordinates belonging to up to three different groups. The software also includes statistical analyses and indexes to discriminate aggregates based on the spatial localisation of individuals and their neighbours. After the description of the software, the indexes computed by the software are illustrated using both artificial patterns and case studies of the spatial distribution of woodlice. The added strengths of this software and these methods are also discussed. PMID:26261448
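
    One common way to quantify aggregation from individuals' coordinates is the Clark-Evans nearest-neighbour ratio: the observed mean nearest-neighbour distance divided by the value expected for a random pattern of the same density. The sketch below is a generic illustration of such an index; it is not claimed to be one of the indexes implemented in NEIGHBOUR-IN.

```python
# Generic aggregation index from point coordinates: the Clark-Evans ratio of
# the observed mean nearest-neighbour distance to the value expected under
# complete spatial randomness. Values < 1 suggest clustering, ~1 randomness,
# > 1 overdispersion. This is an illustration, not the NEIGHBOUR-IN code.
import numpy as np

def clark_evans_ratio(points, area):
    """points: (n, 2) array of coordinates; area: study-area size in the same units squared."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    observed = d.min(axis=1).mean()          # mean distance to the nearest neighbour
    expected = 0.5 / np.sqrt(n / area)       # expectation for a random (Poisson) pattern
    return observed / expected

rng = np.random.default_rng(0)
clustered = rng.normal(loc=5.0, scale=0.5, size=(30, 2))
print(round(clark_evans_ratio(clustered, area=100.0), 2))
```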

  9. Frequency-agile wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Arms, Steven W.; Townsend, Christopher P.; Churchill, David L.; Hamel, Michael J.; Galbreath, Jacob H.; Mundell, Steven W.

    2004-07-01

    Our goal was to demonstrate a wireless communications system capable of simultaneous, high-speed data communications from a variety of sensors. We have previously reported on the design and application of 2 kHz data logging transceiver nodes; however, only one node may stream data at a time, since all nodes on the network use the same communications frequency. To overcome these limitations, second-generation data logging transceivers were developed with software-programmable radio frequency (RF) communications. Each node contains on-board memory (2 Mbytes), sensor excitation, instrumentation amplifiers with programmable gains and offsets, a multiplexer, a 16-bit A/D converter, a microcontroller, and a frequency-agile, bi-directional, frequency-shift-keyed (FSK) RF serial data link. These systems are capable of continuous data transmission from 26 distinct nodes (902-928 MHz band, 75 kbaud). The system was demonstrated in a compelling structural monitoring application. The National Park Service requested a means for continual monitoring and recording of sensor data from the Liberty Bell during a move to a new location (Philadelphia, October 2003). Three distinct, frequency-agile, wireless sensing nodes were used to detect visible crack shear/opening micromotions, triaxial accelerations, and hairline crack tip strains. The wireless sensors proved to be useful in protecting the Liberty Bell.

  10. Piloted simulator assessments of agility

    NASA Technical Reports Server (NTRS)

    Schneider, Edward T.

    1990-01-01

    NASA has utilized piloted simulators for nearly two decades to study high-angle-of-attack flying qualities, agility, and air-to-air combat. These studies have included assessments of an F-16XL aircraft equipped with thrust vectoring, an assessment of the F-18 HARV maneuvering requirements to assist in thrust vectoring control system design, and an agility assessment of the F-18. The F-18 agility assessment was compared with in-flight testing. Open-loop maneuvers such as 180-deg rolls to measure roll rate showed favorable simulator/in-flight comparison. Closed-loop maneuvers such as rolls to 90 deg with precision stops or certain maximum longitudinal pitching maneuvers showed poorer performance due to reduced aggressiveness of pilot inputs in flight to remain within flight envelope limits.

  11. WFF TOPEX Software Documentation Altimeter Instrument File (AIF) Processing, October 1998. Volume 3

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Sensor Data Record (SDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  12. Software Piracy among College Students: A Comprehensive Review of Contributing Factors, Underlying Processes, and Tackling Strategies

    ERIC Educational Resources Information Center

    Liang, Zhili; Yan, Zheng

    2005-01-01

    This article reviewed empirical studies published in the past 30 years that examined software piracy among college students. It focused on three areas of study: (a) major factors that affect college students' intentions, attitudes, and moral intensity regarding software piracy, (b) various decision-making processes that underlie software piracy…

  13. 78 FR 47012 - Developing Software Life Cycle Processes Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ...The U.S. Nuclear Regulatory Commission (NRC) is issuing a revised regulatory guide (RG), revision 1 of RG 1.173, ``Developing Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' This RG endorses the Institute of Electrical and Electronic Engineers (IEEE) Standard (Std.) 1074-2006, ``IEEE Standard for Developing a Software Project Life......

  14. TOPEX Software Document Series. Volume 5; Rev. 1; TOPEX GDR Processing

    NASA Technical Reports Server (NTRS)

    Lee, Jeffrey; Lockwood, Dennis; Hancock, David W., III

    2003-01-01

    This document is a compendium of the WFF TOPEX Software Development Team's knowledge regarding Geophysical Data Record (GDR) Processing. It includes many elements of a requirements document, a software specification document, a software design document, and a user's manual. In the more technical sections, this document assumes the reader is familiar with TOPEX and instrument files.

  15. A Process for Evaluating Student Records Management Software. ERIC/AE Digest.

    ERIC Educational Resources Information Center

    Vecchioli, Lisa

    This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…

  16. The AGILE Data Center at ASDC

    NASA Astrophysics Data System (ADS)

    Pittori, Carlotta; AGILE Collaboration

    2013-01-01

    AGILE is a Scientific Mission of the Italian Space Agency (ASI) with INFN, INAF and CIFS participation, devoted to gamma-ray astrophysics. The satellite has been in orbit since April 23rd, 2007. Thanks to its sky monitoring capability and fast ground segment alert system, AGILE has produced several important scientific results, among which was the unexpected discovery of strong and rapid gamma-ray flares from the Crab Nebula over daily timescales. This discovery won the AGILE PI and the AGILE Team the 2012 Bruno Rossi Prize. The AGILE Data Center, located at ASDC, is in charge of all the science-oriented activities related to the analysis and archiving of AGILE data. I will present the AGILE Data Center's main activities and give an overview of the AGILE scientific highlights after five years of operations.

  17. Global Software Development Patterns for Project Management

    NASA Astrophysics Data System (ADS)

    Välimäki, Antti; Kääriäinen, Jukka; Koskimies, Kai

    Global software development (GSD) with an agile or waterfall development process has been adopted by many companies. GSD offers benefits but also new challenges without known, documented solutions. The goal of this research is to present current best practices for GSD in the form of process patterns for project management, evaluated by using a scenario-based assessment method. The best practices have been collected from a large company operating in process automation. It is expected that the resulting pattern language will help other companies to improve their GSD processes by incorporating the patterns into their own processes.

  18. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may

  19. The AGILE Alert System for Gamma-Ray Transients

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Parmiggiani, N.; Fioretti, V.; Chen, A. W.; Vercellone, S.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Beneventano, D.; Argan, A.; Trois, A.; Scalise, E.; Longo, F.; Pellizzoni, A.; Pucella, G.; Colafrancesco, S.; Conforti, V.; Tempesta, P.; Cerone, M.; Sabatini, P.; Annoni, G.; Valentini, G.; Salotti, L.

    2014-01-01

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind-search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. When flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are cross-checked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.

  20. The agile alert system for gamma-ray transients

    SciTech Connect

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Fioretti, V.; Chen, A. W.; Pittori, C.; Verrecchia, F.; Lucarelli, F.; Santolamazza, P.; Fanari, G.; Giommi, P.; Pellizzoni, A.; and others

    2014-01-20

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) on board the AGILE space mission. AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many γ-ray transients of both galactic and extragalactic origin. This work presents the AGILE innovative approach to fast γ-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind-search identification of transients within a short processing time, (3) the AGILE procedure for γ-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. When flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, via e-mail, and via push notifications from an application for smartphones and tablets. These alerts are cross-checked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in γ-ray astrophysics.

  1. a Matlab Geodetic Software for Processing Airborne LIDAR Bathymetry Data

    NASA Astrophysics Data System (ADS)

    Pepe, M.; Prezioso, G.

    2015-04-01

    The ability to build three-dimensional models through technologies based on satellite navigation systems (GNSS) and the continuous development of new sensors, such as Airborne Laser Scanning Hydrography (ALH), data acquisition methods and 3D multi-resolution representations, have contributed significantly to the digital 3D documentation, mapping, preservation and representation of landscapes and heritage, as well as to the growth of research in these fields. However, GNSS systems provide the ellipsoidal height; to transform this height into an orthometric height it is necessary to know a geoid undulation model. The latest and most accurate global geoid undulation model, available worldwide, is EGM2008, which has been publicly released by the U.S. National Geospatial-Intelligence Agency (NGA) EGM Development Team. Given the availability and accuracy of this geoid model, it can be used in geomatics applications that require the conversion of heights. Using this model, to correct the elevation of a point that does not coincide with any grid node, the undulation values of the adjacent nodes must be interpolated. The purpose of this paper is to produce a Matlab® geodetic software package for processing airborne LIDAR bathymetry data. In particular, we focus on point clouds in the ASPRS LAS format and convert the ellipsoidal heights to orthometric heights. The algorithm, valid over the whole globe and operative for all UTM zones, performs the conversion of ellipsoidal heights using the EGM2008 model. For this model we analyse the slopes that occur, in some critical areas, between the nodes of the undulation grid; we focus our attention on marine areas, verifying the impact that these slopes have on the calculation of the orthometric height and, consequently, on the accuracy of the 3-D point clouds. This experiment is carried out by analysing an ASPRS LAS file containing topographic and bathymetric data collected with LIDAR systems along the coasts of Oregon and Washington (USA).
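
    The core height conversion described here is H = h - N, where h is the ellipsoidal height and N is the geoid undulation interpolated from the EGM2008 grid. A minimal sketch of that step, assuming a regular latitude/longitude undulation grid and bilinear interpolation (the authors' Matlab tool and the real EGM2008 grid handling are more elaborate), is:

```python
# Minimal sketch: orthometric height from ellipsoidal height via a gridded geoid model.
# The regular lat/lon undulation grid and bilinear interpolation are assumptions for
# illustration; they stand in for the EGM2008 grid used by the paper's Matlab tool.
import numpy as np

def interp_undulation(lat, lon, grid, lat0, lon0, step):
    """Bilinearly interpolate geoid undulation N at (lat, lon) from a regular grid."""
    i = (lat - lat0) / step
    j = (lon - lon0) / step
    i0, j0 = int(np.floor(i)), int(np.floor(j))
    di, dj = i - i0, j - j0
    return ((1 - di) * (1 - dj) * grid[i0, j0] + (1 - di) * dj * grid[i0, j0 + 1]
            + di * (1 - dj) * grid[i0 + 1, j0] + di * dj * grid[i0 + 1, j0 + 1])

def orthometric_height(h_ellipsoidal, lat, lon, grid, lat0, lon0, step):
    """H = h - N, with N interpolated from the undulation grid."""
    return h_ellipsoidal - interp_undulation(lat, lon, grid, lat0, lon0, step)

if __name__ == "__main__":
    # Toy 1-degree undulation patch (metres); real EGM2008 grids are far denser.
    grid = np.array([[42.1, 42.3, 42.6],
                     [42.0, 42.2, 42.5],
                     [41.8, 42.0, 42.3]])
    print(orthometric_height(55.0, lat=44.5, lon=-124.5, grid=grid,
                             lat0=44.0, lon0=-125.0, step=1.0))
```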

  2. Some Future Software Engineering Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Boehm, Barry

    This paper provides an update and extension of a 2006 paper, “Some Future Trends and Implications for Systems and Software Engineering Processes,” Systems Engineering, Spring 2006. Some of its challenges and opportunities are similar, such as the need to simultaneously achieve high levels of both agility and assurance. Others have emerged as increasingly important, such as the challenges of dealing with ultralarge volumes of data, with multicore chips, and with software as a service. The paper is organized around eight relatively surprise-free trends and two “wild cards” whose trends and implications are harder to foresee. The eight surprise-free trends are:

  3. Mapping modern software process engineering techniques onto an HEP development environment

    NASA Astrophysics Data System (ADS)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of making it easier to do great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We present the CMS approach to software process improvement in this process R&D effort, describe lessons learned, and mistakes made. We demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.

  4. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  5. Educational software for the visualization of space plasma processes

    NASA Technical Reports Server (NTRS)

    Russell, C. T.; Le, G.; Luhmann, J. G.; Littlefield, B.

    1995-01-01

    The UCLA Space Physics Group has developed educational software composed of a series of modules to assist students with understanding basic concepts of space plasmas and charged particle motion. Present modules cover planetary magnetospheres, charged particle motion, cold plasma waves, collisionless shock waves, and solar wind. The software is designed around the principle that students can learn more by doing rather than by reading or listening. The programs provide a laboratory-like environment in which the student can control, observe, and measure complex behavior. The interactive graphics environment allows the student to visualize the results of his or her experimentation and to try different parameters as desired. The current version of the software runs on UNIX-based operating systems in an X-Windows environment. It has been used in a classroom setting at both UCLA and the University of California at San Diego.
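
    As an illustration of the kind of computation such a charged-particle-motion module animates, the sketch below integrates a proton gyrating in a uniform magnetic field with the standard Boris scheme; the field strength, time step, and initial velocity are arbitrary choices, and this is not the UCLA package's code.

```python
# Minimal sketch of the kind of charged-particle-motion computation such educational
# modules visualize: a proton gyrating in a uniform magnetic field, integrated with
# the Boris scheme.  Field strength, time step and initial velocity are arbitrary.
import numpy as np

Q, M = 1.602e-19, 1.673e-27               # proton charge (C) and mass (kg)
B = np.array([0.0, 0.0, 1.0e-8])          # uniform field along z (T), arbitrary
dt = 0.01 / (Q * np.linalg.norm(B) / M)   # small fraction of a gyroperiod

def boris_step(x, v):
    """Advance position and velocity by one step (electric field set to zero)."""
    t = (Q * dt / (2.0 * M)) * B
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v + np.cross(v, t)
    v_new = v + np.cross(v_prime, s)
    return x + v_new * dt, v_new

x, v = np.zeros(3), np.array([1.0e5, 0.0, 0.0])   # m, m/s
for _ in range(1000):
    x, v = boris_step(x, v)
print("speed change after 1000 steps:", abs(np.linalg.norm(v) - 1.0e5), "m/s")
```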

  6. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS

    PubMed Central

    Rai, Arti K.

    2014-01-01

    The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office (“PTO”) could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software. PMID:25221346

  7. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS.

    PubMed

    Rai, Arti K

    2013-11-24

    The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office ("PTO") could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software. PMID:25221346

  8. A Framework for Process Improvement in Software Product Management

    NASA Astrophysics Data System (ADS)

    Bekkers, Willem; van de Weerd, Inge; Spruit, Marco; Brinkkemper, Sjaak

    This paper presents a comprehensive overview of all the important areas within Software Product Management (SPM). The overview has been created and validated in collaboration with many experts from practice and the scientific community. It provides a list of 68 capabilities a product software organization should implement to reach a full grown SPM maturity. The overview consists of the SPM Competence Model that shows the areas of importance to SPM, and the SPM Maturity Matrix that lists all important activities within those areas in a best practice implementation order. SPM organizations can use this matrix to map and improve their SPM practices incrementally.

  9. Agile data management for curation of genomes to watershed datasets

    NASA Astrophysics Data System (ADS)

    Varadharajan, C.; Agarwal, D.; Faybishenko, B.; Versteeg, R.

    2015-12-01

    A software platform is being developed for data management and assimilation [DMA] as part of the U.S. Department of Energy's Genomes to Watershed Sustainable Systems Science Focus Area 2.0. The DMA components and capabilities are driven by the project science priorities and the development is based on agile development techniques. The goal of the DMA software platform is to enable users to integrate and synthesize diverse and disparate field, laboratory, and simulation datasets, including geological, geochemical, geophysical, microbiological, hydrological, and meteorological data across a range of spatial and temporal scales. The DMA objectives are (a) developing an integrated interface to the datasets, (b) storing field monitoring data and laboratory analytical results of water and sediment samples in a database, (c) providing automated QA/QC analysis of data, and (d) working with data providers to modify high-priority field and laboratory data collection and reporting procedures as needed. The first three objectives are driven by user needs, while the last objective is driven by data management needs. The project needs and priorities are reassessed regularly with the users. After each user session we identify development priorities to match the identified user priorities. For instance, data QA/QC and collection activities have focused on the data and products needed for on-going scientific analyses (e.g. water level and geochemistry). We have also developed, tested and released a broker and portal that integrates diverse datasets from two different databases used for curation of project data. The development of the user interface was based on a user-centered design process involving several user interviews and constant interaction with data providers. The initial version focuses on the most requested feature - i.e. finding the data needed for analyses through an intuitive interface. Once the data is found, the user can immediately plot and download the data.
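
    A minimal sketch of the sort of automated QA/QC check mentioned above (range and spike flags on a water-level time series) is shown below; the column names and thresholds are hypothetical and do not reflect the project's actual rules.

```python
# Illustrative sketch of an automated QA/QC check of the kind the abstract mentions:
# range and spike flags on time-series water-level data.  Column names and thresholds
# are hypothetical and do not reflect the project's actual rules.
import pandas as pd

def qaqc_water_level(df: pd.DataFrame,
                     valid_range=(0.0, 50.0),
                     max_step=0.5) -> pd.DataFrame:
    """Add QA flags: 'flag_range' for out-of-range values, 'flag_spike' for large jumps."""
    out = df.copy()
    out["flag_range"] = ~out["water_level_m"].between(*valid_range)
    out["flag_spike"] = out["water_level_m"].diff().abs() > max_step
    return out

if __name__ == "__main__":
    data = pd.DataFrame({
        "timestamp": pd.to_datetime(["2015-06-01 00:00", "2015-06-01 01:00",
                                     "2015-06-01 02:00", "2015-06-01 03:00",
                                     "2015-06-01 04:00"]),
        "water_level_m": [3.2, 3.3, 9.9, 3.4, -1.0],
    })
    print(qaqc_water_level(data)[["timestamp", "flag_range", "flag_spike"]])
```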

  10. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    NASA Technical Reports Server (NTRS)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  11. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    PubMed

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. PMID:26612079

  12. Effective confidence interval estimation of fault-detection process of software reliability growth models

    NASA Astrophysics Data System (ADS)

    Fang, Chih-Chiang; Yeh, Chun-Wu

    2016-09-01

    The quantitative evaluation of a software reliability growth model is frequently accompanied by a confidence interval for its fault detection. It provides helpful information to software developers and testers when undertaking software development and software quality control. However, the explanation of the variance estimation of software fault detection is not transparent in previous studies, and this affects the derivation of the confidence interval about the mean value function, which the current study addresses. Software engineers in such a case cannot evaluate the potential hazard based on the stochasticity of the mean value function, and this might reduce the practicability of the estimation. Hence, stochastic differential equations are utilised for confidence interval estimation of the software fault-detection process. The proposed model is estimated and validated using real data-sets to show its flexibility.
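
    The abstract turns on the mean value function m(t) of a fault-detection process and a confidence interval around it. As a baseline illustration only, using the Goel-Okumoto form m(t) = a(1 - exp(-bt)) and the simple Poisson-variance approximation rather than the paper's stochastic-differential-equation formulation, a sketch might look like this:

```python
# Hedged sketch of a confidence band around an NHPP mean value function.  It uses the
# Goel-Okumoto form m(t) = a*(1 - exp(-b*t)) and the simple approximation
# Var[N(t)] ~= m(t); this is NOT the paper's stochastic-differential-equation model,
# only a baseline illustration of the idea.
import numpy as np

def mean_value(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative faults detected by time t."""
    return a * (1.0 - np.exp(-b * t))

def confidence_band(t, a, b, z=1.96):
    """Approximate 95% band m(t) +/- z*sqrt(m(t)) under the Poisson-variance assumption."""
    m = mean_value(t, a, b)
    half = z * np.sqrt(m)
    return m - half, m, m + half

if __name__ == "__main__":
    t = np.linspace(0.0, 20.0, 5)          # testing time (e.g., weeks), assumed
    lo, m, hi = confidence_band(t, a=120.0, b=0.15)
    for row in zip(t, lo, m, hi):
        print("t=%5.1f  lower=%6.1f  mean=%6.1f  upper=%6.1f" % row)
```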

  13. A Roadmap for using Agile Development in a Traditional System

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas

    2006-01-01

    I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on Eclipse Rich Client Platform (open source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem for Mars Science Laboratory (APSS); b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.

  14. Geometric simulation analysis of multi-band mosaic imaging from the same orbit by agile satellites

    NASA Astrophysics Data System (ADS)

    Xu, Yue; Chen, Jinwei; Chen, Yueting; Xu, Zhihai; Feng, Huajun; Li, Qi

    2015-08-01

    This paper establishes a geometric model of multi-band mosaic imaging from the same orbit by agile satellites and introduces self-written simulation software. Geometric parameters of each band are calculated based on the attitude control ability of the satellite and the mission requirements. Considering the different ground resolution and imaging angle of each band, two new concepts, Gradient Entropy and Structure Similarity Parameter, are presented. These two values are used to evaluate the change in image quality caused by agility and help to estimate the effect of the mission. By building the geometric model and calculating the agile information with the program, we propose a new approach to the forward analysis of agile imaging, which helps users evaluate the image degradation.

  15. Design and performance test of spacecraft test and operation software

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Cui, Yan; Wang, Shuo; Meng, Xiaofeng

    2011-06-01

    Main test processor (MTP) software is the key element of the Electrical Ground Support Equipment (EGSE) for spacecraft test and operation, and has been used in the Chinese Academy of Space Technology (CAST) for years without innovation. With the increasing demand for more efficient and agile MTP software, a new MTP software system was developed. It adopts a layered, plug-in based software architecture, whose core runtime server provides message queue management, shared memory management and process management services and forms the framework for a configurable and open architecture system. To investigate the MTP software's performance, test cases for network response time, test sequence management capability and data-processing capability were introduced in detail. Test results show that the MTP software is general-purpose and has higher performance than the legacy one.

  16. Utilization of an agility assessment module in analysis and optimization of preliminary fighter configuration

    NASA Technical Reports Server (NTRS)

    Ngan, Angelen; Biezad, Daniel

    1996-01-01

    A study has been conducted to develop and to analyze a FORTRAN computer code for performing agility analysis on fighter aircraft configurations. This program is one of the modules of the NASA Ames ACSYNT (AirCraft SYNThesis) design code. The background of the agility research in the aircraft industry and a survey of a few agility metrics are discussed. The methodology, techniques, and models developed for the code are presented. The validity of the existing code was evaluated by comparison with existing flight test data. A FORTRAN program was developed for a specific metric, PM (Pointing Margin), as part of the agility module. Example trade studies using the agility module along with ACSYNT were conducted using a McDonnell Douglas F/A-18 Hornet aircraft model. The sensitivities of the agility criteria to thrust loading, wing loading, and thrust vectoring were investigated. The module can compare the agility potential between different configurations and has the capability to optimize agility performance in the preliminary design process. This research provides a new and useful design tool for analyzing fighter performance during air combat engagements at the preliminary design stage.

  17. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    NASA Astrophysics Data System (ADS)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices we identified a lack of systematic process/method support and inefficient collaboration between the various involved disciplines, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge to enable a comprehensive view on game development projects. Lessons learned from software engineering practices can help game developers to improve game development processes within a heterogeneous environment. Based on a state of the practice survey in the Austrian games industry, this paper (a) presents first results with a focus on process/method support and (b) suggests a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results (a) showed a trend to highly flexible software processes involving various disciplines and (b) identified the suggested flexible process approach as feasible and useful for project application.

  18. ICESat (GLAS) Science Processing Software Document Series. Volume 2; Science Data Management Plan; 4.0

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Hancock, David W., III

    1999-01-01

    This document provides the Data Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Facility (ISF) Software. This Plan addresses the identification, authority, and description of the interface nodes associated with the GLAS Standard Data Products and the GLAS Ancillary Data.

  19. Agile Data Curation at a State Geological Survey

    NASA Astrophysics Data System (ADS)

    Hills, D. J.

    2015-12-01

    State agencies, including geological surveys, are often the gatekeepers for myriad data products essential for scientific research and economic development. For example, the Geological Survey of Alabama (GSA) is mandated to explore for, characterize, and report Alabama's mineral, energy, water, and biological resources in support of economic development, conservation, management, and public policy for the betterment of Alabama's citizens, communities, and businesses. As part of that mandate, the GSA has increasingly been called upon to make our data more accessible to stakeholders. Even as demand for greater data accessibility grows, budgets for such efforts are often small, meaning that agencies must do more for less. Agile software development has yielded efficient, effective products, most often at lower cost and in shorter time. Taking guidance from the agile software development model, the GSA is working towards more agile data management and curation. To date, the GSA's work has been focused primarily on data rescue. By using workflows that maximize clear communication while encouraging simplicity (e.g., maximizing the amount of work not done or that can be automated), the GSA is bringing decades of dark data into the light. Regular checks by the data rescuer with the data provider (or their proxy) provides quality control without adding an overt burden on either party. Moving forward, these workflows will also allow for more efficient and effective data management.

  20. Image processing software for imaging spectrometry data analysis

    NASA Technical Reports Server (NTRS)

    Mazer, Alan; Martin, Miki; Lee, Meemong; Solomon, Jerry E.

    1988-01-01

    Imaging spectrometers simultaneously collect image data in hundreds of spectral channels, from the near-UV to the IR, and can thereby provide direct surface materials identification by means resembling laboratory reflectance spectroscopy. Attention is presently given to a software system, the Spectral Analysis Manager (SPAM) for the analysis of imaging spectrometer data. SPAM requires only modest computational resources and is composed of one main routine and a set of subroutine libraries. Additions and modifications are relatively easy, and special-purpose algorithms have been incorporated that are tailored to geological applications.

  1. AGILE integration into APC for high mix logic fab

    NASA Astrophysics Data System (ADS)

    Gatefait, M.; Lam, A.; Le Gratiet, B.; Mikolajczak, M.; Morin, V.; Chojnowski, N.; Kocsis, Z.; Smith, I.; Decaunes, J.; Ostrovsky, A.; Monget, C.

    2015-09-01

    For C040 technology and below, photolithographic depth of focus control and dispersion improvement is essential to secure product functionality. Critical 193nm immersion layers present initial focus process windows close to machine control capability. For previous technologies, the standard scanner sensor (Level sensor - LS) was used to map wafer topology and expose the wafer at the right focus. Such optical embedded metrology, based on light reflection, suffers from reading issues that cannot be neglected anymore. Metrology errors are correlated with the inspected product area, for which material types and densities change, and so optical properties are not constant. Various optical phenomena occur across the product field during wafer inspection and have an effect on the quality and position of the reflected light. This can result in incorrect heights being recorded and exposures possibly being done out of focus. Focus inaccuracy associated with aggressive process windows on critical layers will directly impact product realization and therefore functionality and yield. ASML has introduced an air gauge sensor to complement the optical level sensor and lead to optimal topology metrology. The use of this new sensor is managed by the AGILE (Air Gauge Improved process LEveling) application. This measurement, with no optical dependency, corrects for the optical inaccuracy of the level sensor and so improves best-focus dispersion across the product. Because stack complexity grows through the process flow, the optical perturbation of standard level sensor metrology increases and is largest for metallization layers. For these reasons AGILE feature implementation was first considered for contact and all metal layers. Another key point is that standard metrology will be sensitive to layer and reticle/product density. The gain of AGILE is enhanced for masks containing multiple products and for complex System on Chip designs. Into ST context (High

  2. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  3. Differences in the Educational Software Evaluation Process for Experts and Novice Students

    ERIC Educational Resources Information Center

    Tokmak, Hatice Sancar; Incikabi, Lutfi; Yelken, Tugba Yanpar

    2012-01-01

    This comparative case study investigated the educational software evaluation processes of both experts and novices in conjunction with a software evaluation checklist. Twenty novice elementary education students, divided into groups of five, and three experts participated. Each novice group and the three experts evaluated educational software…

  4. Understanding Expertise-Based Training Effects on the Software Evaluation Process of Mathematics Education Teachers

    ERIC Educational Resources Information Center

    Incikabi, Lutfi; Sancar Tokmak, Hatice

    2012-01-01

    This case study examined the educational software evaluation processes of pre-service teachers who attended either expertise-based training (XBT) or traditional training in conjunction with a Software-Evaluation checklist. Forty-three mathematics teacher candidates and three experts participated in the study. All participants evaluated educational…

  5. Software Development Processes Applied to Computational Icing Simulation

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  6. MNE software for processing MEG and EEG data

    PubMed Central

    Gramfort, A.; Luessi, M.; Larson, E.; Engemann, D.; Strohmeier, D.; Brodbeck, C.; Parkkonen, L.; Hämäläinen, M.

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals originating from neural currents in the brain. Using these signals to characterize and locate brain activity is a challenging task, as evidenced by several decades of methodological contributions. MNE, whose name stems from its capability to compute cortically-constrained minimum-norm current estimates from M/EEG data, is a software package that provides comprehensive analysis tools and workflows including preprocessing, source estimation, time–frequency analysis, statistical analysis, and several methods to estimate functional connectivity between distributed brain regions. The present paper gives detailed information about the MNE package and describes typical use cases while also warning about potential caveats in analysis. The MNE package is a collaborative effort of multiple institutes striving to implement and share best methods and to facilitate distribution of analysis pipelines to advance reproducibility of research. Full documentation is available at http://martinos.org/mne. PMID:24161808
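
    As a rough illustration of the preprocessing-to-evoked part of the workflow the abstract describes, a minimal MNE-Python session might look like the sketch below; it uses MNE's bundled sample dataset (which is downloaded on first use), the event code shown is the sample dataset's auditory/left condition, and the source-estimation, connectivity, and statistics stages are omitted.

```python
# Minimal sketch of an MNE-Python preprocessing/epoching workflow of the kind the
# abstract describes.  Uses MNE's sample dataset (downloaded on first call); the
# source estimation, connectivity, and statistics steps are omitted for brevity.
import mne
from mne.datasets import sample

data_path = sample.data_path()                       # fetches the sample dataset
raw_file = str(data_path) + "/MEG/sample/sample_audvis_raw.fif"

raw = mne.io.read_raw_fif(raw_file, preload=True)
raw.filter(l_freq=1.0, h_freq=40.0)                  # band-pass filter

events = mne.find_events(raw, stim_channel="STI 014")
epochs = mne.Epochs(raw, events, event_id={"auditory/left": 1},
                    tmin=-0.2, tmax=0.5, baseline=(None, 0), preload=True)

evoked = epochs.average()                            # averaged evoked response
print(evoked)
```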

  7. The AGILE gamma-ray astronomy mission

    NASA Astrophysics Data System (ADS)

    Mereghetti, S.; Tavani, M.; Argan, A.; Barbiellini, G.; Caraveo, P.; Chen, A.; Cocco, V.; Costa, E.; Di Cocco, G.; Feroci, M.; Labanti, C.; Lapshov, I.; Lipari, P.; Longo, F.; Morselli, A.; Perotti, F.; Picozza, P.; Pittori, C.; Prest, M.; Rubini, A.; Soffitta, P.; Vallazza, E.; Vercellone, S.; Zanello, D.

    2001-09-01

    We describe the AGILE satellite: a unique tool for high-energy astrophysics in the 30 MeV - 50 GeV range before GLAST. The scientific performances of AGILE are comparable to those of EGRET, despite the much smaller weight and dimensions. The AGILE mission will be optimized for the imaging capabilities above 30 MeV and for the study of transient phenomena, complemented by simultaneous monitoring in the hard X-ray band (10 - 40 keV).

  8. Assessment of proposed fighter agility metrics

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.; Downing, David R.

    1990-01-01

    This paper presents the results of an analysis of proposed metrics to assess fighter aircraft agility. A novel framework for classifying these metrics is developed and applied. A set of transient metrics intended to quantify the axial and pitch agility of fighter aircraft is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed, and the sensitivity to pilot-introduced errors during flight testing is investigated. Results indicate that the power onset and power loss parameters are promising candidates for quantifying axial agility, while maximum pitch-up and pitch-down rates are promising candidates for quantifying pitch agility.

  9. Computer Center: Evaluating Biological Software for Process Skills--Part II. Evaluation Process: Catalog Review--The First Step.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Bell, Nancy B., Ed.

    1989-01-01

    Describes procedures for choosing process skill materials for evaluation. Steps include selecting software titles, reading descriptions, and skimming for key phrases associated with process skills, examining courseware, considering cost versus time, and the selecting process. Sample forms for the evaluation process are included. (RT)

  10. Archiving and Data Processing Software for Solar Radio Observations

    NASA Astrophysics Data System (ADS)

    Abramov-Maksimov, V. E.; Bogod, V. M.; Korzhavin, A. N.; Opeikina, L. V.; Shatilov, V. A.

    1997-03-01

    Regular daily observations of the Sun have been made over long time intervals since 1975 with the RATAN--600 radio telescope in the centimeter and decimeter wavelength ranges with high spatial resolution. A huge archive of observational data has been accumulated. The archive contains valuable astrophysical information covering a period that exceeds the duration of a solar activity cycle. However, all these data were stored on different data media (magnetic tapes, diskettes, streamer's cartridges, etc.) and in different formats. The main purpose of the work presented here is to produce a homogeneous database with our experimental data. We plan to create an electronic journal of observations using a relational database. Presently the journal of observations exists only in the form of a single paper copy and information about observations must be searched for manually. The electronic journal of observations will contain all the information about the observations and allow an easy extraction of such information and a simple selection of the observational data for any specific astrophysical problem. We plan to develop a software package for primary data reduction of observations of the Sun on RATAN-600, stored in various primary formats, and to create a homogeneous archive in the FITS format. A binary table extension to FITS will be used. We also plan to record our archive on a CD-ROM.
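
    A minimal sketch of writing observational records into a FITS binary table extension, as planned above, is shown below. It uses astropy for convenience, which postdates this 1997 work, and the column layout is invented for illustration.

```python
# Illustrative sketch of writing observational records into a FITS binary table
# extension.  astropy is assumed for convenience (the 1997 archive software predates
# it), and the column layout and header values here are invented for illustration.
import numpy as np
from astropy.io import fits

time_s = np.array([0.0, 0.5, 1.0])                # seconds from scan start (assumed)
antenna_temp = np.array([120.3, 118.7, 121.9])    # antenna temperature, K (assumed)

cols = fits.ColDefs([
    fits.Column(name="TIME", format="D", unit="s", array=time_s),
    fits.Column(name="TANT", format="E", unit="K", array=antenna_temp),
])
table_hdu = fits.BinTableHDU.from_columns(cols, name="SCAN")
table_hdu.header["TELESCOP"] = "RATAN-600"
table_hdu.header["DATE-OBS"] = "1996-07-01"

hdul = fits.HDUList([fits.PrimaryHDU(), table_hdu])
hdul.writeto("ratan_scan.fits", overwrite=True)
```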

  11. Development of GENOA Progressive Failure Parallel Processing Software Systems

    NASA Technical Reports Server (NTRS)

    Abdi, Frank; Minnetyan, Levon

    1999-01-01

    A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structures by integration of hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation, (2) simulation of effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths, (3) including material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating), (4) simulating crack initiation and growth to failure under static, cyclic, creep, and impact loads, (5) progressive fracture analysis to determine durability and damage tolerance, (6) identifying the percent contribution of various possible composite failure modes involved in critical damage events, and (7) determining sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural responses of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.

  12. Agile robotic edge finishing

    SciTech Connect

    Powell, M.

    1996-08-01

    Edge finishing processes have seemed like ideal candidates for automation. Most edge finishing processes are unpleasant, dangerous, tedious, expensive, not repeatable, and labor intensive. Estimates place the cost of manual edge finishing processes at 12% of the total cost of fabricating precision parts. For small, high-precision parts, the cost of hand finishing may be as high as 30% of the total part cost. Up to 50% of this cost could be saved through automation. This cost estimate includes the direct costs of edge finishing: the machining hours required and the 30% scrap and rework rate after manual finishing. Not included in these estimates are the indirect costs resulting from cumulative trauma disorders and retraining costs caused by the high turnover rate for finishing jobs. Despite the apparent economic advantages, edge finishing has proven difficult to automate except in low-precision and/or high-volume production environments. Finishing automation systems have not been deployed successfully in Department of Energy defense programs (DOE/DP) production. A few systems have been attempted but have been subsequently abandoned for traditional edge finishing approaches: scraping, grinding, and filing the edges using modified dental tools and hand-held power tools. Edge finishing automation has been an elusive but potentially lucrative production enhancement. The amount of time required to reconfigure workcells for new parts, the time required to reprogram the workcells to finish new parts, and the inability of automation equipment to respond to fixturing errors and part tolerances are the most common reasons cited for eliminating automation as an option for DOE/DP edge finishing applications. Existing automated finishing systems have proven to be economically viable only where setup and reprogramming costs are a negligible fraction of overall production costs.

  13. IDP: Image and data processing (software) in C++

    SciTech Connect

    Lehman, S.

    1994-11-15

    IDP++ (Image and Data Processing in C++) is a compiled, multidimensional, multi-data-type signal processing environment written in C++. It is being developed within the Radar Ocean Imaging group and is intended as a partial replacement for View. IDP++ takes advantage of the latest object-oriented compiler technology to provide "information hiding." Users need only know C, not C++. Signals are treated like any other variable with a defined set of operators and functions in an intuitive manner. IDP++ is being designed for a real-time environment where interpreted signal processing packages are less efficient.

  14. Platform-independent software for medical image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Mancuso, Michael E.; Pathak, Sayan D.; Kim, Yongmin

    1997-05-01

    We have developed a software tool for image processing over the Internet. The tool is a general-purpose, easy to use, flexible, platform-independent image processing software package with functions most commonly used in medical image processing. It provides for processing of medical images located either remotely on the Internet or locally. The software was written in Java - the new programming language developed by Sun Microsystems. It was compiled and tested using Microsoft's Visual Java 1.0 and Microsoft's Just in Time Compiler 1.00.6211. The software is simple and easy to use. In order to use the tool, the user needs to download the software from our site before he/she runs it using any Java interpreter, such as those supplied by Sun, Symantec, Borland or Microsoft. Future versions of the operating systems supplied by Sun, Microsoft, Apple, IBM, and others will include Java interpreters. The software is then able to access and process any image on the Internet or on the local computer. Using a 512 X 512 X 8-bit image, a 3 X 3 convolution took 0.88 seconds on an Intel Pentium Pro PC running at 200 MHz with 64 Mbytes of memory. A window/level operation took 0.38 seconds while a 3 X 3 median filter took 0.71 seconds. These performance numbers demonstrate the feasibility of using this software interactively on desktop computers. Our software tool supports various image processing techniques commonly used in medical image processing and can run without the need for any specialized hardware. It can become an easily accessible resource over the Internet to promote the learning and understanding of image processing algorithms. Also, it could facilitate sharing of medical image databases and collaboration amongst researchers and clinicians, regardless of location.

  15. A satellite data processing and analysis software system for earth's atmosphere and surface research

    NASA Technical Reports Server (NTRS)

    Dealy, B.; Gautier, C.; Frouin, R.; Bates, J.; Lingner, D.

    1988-01-01

    The OASIS (Oceanic and Atmospheric Satellite Imaging System) is a satellite data processing and analysis software system being developed by the California Space Institute (Cal Space) for support of interdisciplinary and integrated earth sciences research programs. The system's software applications are integrated under a common executive, NASA's Transportable Application Executive (TAE). In this paper, TAE and the system software and hardware are described, and specific techniques used for ingesting, processing, analyzing, and graphically displaying data from many of the sensors presently being flown are presented. Scientific uses of these capabilities that are, or will shortly be, running under TAE at Cal Space are described.

  16. Software for biomedical engineering signal processing laboratory experiments.

    PubMed

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990's we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any other instrument except a personal computer. There are many digital filtering and pattern recognition algorithms used in processing biomedical signals. In general, students have very limited opportunity to have hands-on access to the mechanisms of digital signal processing. In a typical course, the filters are designed non-interactively, which does not provide the student with significant understanding of the design constraints of such filters nor their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new version, based on MATLAB, works with signals, including the filter design tool that is the programming interface between UW DigiScope and processing algorithms. PMID:19964035
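
    As a rough illustration of the kind of filter-design exercise such a laboratory supports (here with Python/scipy rather than the MATLAB-based UW DigiScope), the sketch below designs a band-pass filter with a passband typical for ECG and applies it with zero-phase filtering; the sampling rate, cutoffs, and toy signal are assumptions.

```python
# Sketch of a digital band-pass filter of the kind students might design in a DSP
# laboratory.  Implemented with scipy rather than the MATLAB-based UW DigiScope;
# the 360 Hz sampling rate and 0.5-40 Hz passband are assumptions typical for ECG.
import numpy as np
from scipy import signal

fs = 360.0                                   # sampling rate, Hz (assumed)
b, a = signal.butter(N=4, Wn=[0.5, 40.0], btype="bandpass", fs=fs)

t = np.arange(0, 5, 1 / fs)
ecg_like = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 60 * t)  # toy signal
filtered = signal.filtfilt(b, a, ecg_like)   # zero-phase filtering

# The 60 Hz interference component should be strongly attenuated after filtering.
spectrum = np.abs(np.fft.rfft(filtered)) / len(t)
print("residual 60 Hz amplitude:", spectrum[int(60 * 5)])
```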

  17. Parallel-Processing Software for Correlating Stereo Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; Mcauley, Michael; DeJong, Eric

    2007-01-01

    A computer program implements parallel-processing algorithms for correlating images of terrain acquired by stereoscopic pairs of digital stereo cameras on an exploratory robotic vehicle (e.g., a Mars rover). Such correlations are used to create three-dimensional computational models of the terrain for navigation. In this program, the scene viewed by the cameras is segmented into subimages. Each subimage is assigned to one of a number of central processing units (CPUs) operating simultaneously.
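
    The decomposition pattern described, segmenting the scene into subimages and assigning each to a CPU, can be sketched as below with Python's multiprocessing module; the per-tile "correlation" is a trivial stand-in for the real disparity correlator, and the tile size is arbitrary.

```python
# Sketch of the decomposition pattern the abstract describes: split a stereo scene
# into subimages and let multiple CPUs work on them concurrently.  The per-tile
# "correlation" here is a trivial stand-in; the real disparity correlator is far
# more involved.
import numpy as np
from multiprocessing import Pool

def correlate_tile(args):
    """Toy per-tile job: return the tile index and a mean absolute difference score."""
    idx, left_tile, right_tile = args
    return idx, float(np.mean(np.abs(left_tile - right_tile)))

def split_tiles(left, right, tile=128):
    """Yield (index, left subimage, right subimage) for each tile of the scene."""
    for i in range(0, left.shape[0], tile):
        for j in range(0, left.shape[1], tile):
            yield (i, j), left[i:i + tile, j:j + tile], right[i:i + tile, j:j + tile]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.random((512, 512))
    right = left + 0.01 * rng.random((512, 512))
    with Pool(processes=4) as pool:
        results = pool.map(correlate_tile, split_tiles(left, right))
    print(results[:3])
```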

  18. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    SciTech Connect

    Ward, Richard C; Allgood, Glenn O; Knox, John R

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the

  19. Three 'C's of Agile Practice: Collaboration, Co-ordination and Communication

    NASA Astrophysics Data System (ADS)

    Sharp, Helen; Robinson, Hugh

    The importance of collaboration, co-ordination and communication in agile teams is often discussed and rarely disputed. These activities are supported through various practices including pairing, customer collaboration, stand-ups and the planning game. However the mechanisms used to support these activities are sometimes more difficult to pin down. We have been studying agile teams for over a decade, and have found that story cards and the Wall are central to an agile team's activity, and the information they hold and convey is crucial for supporting the team's collaboration and co-ordination activity. However the information captured by these usually physical artefacts pertains mainly to progress rather than to functional dependencies. This latter information is fundamental to any software development, and in a non-agile environment is usually contained in detailed documentation not generally produced in an agile team. Instead, this information resides in their communication and social practices. In this chapter we discuss these three ‘C's of agile development and what we know about how they are supported through story cards and the Wall.

  20. AGILE/GRID Science Alert Monitoring System: The Workflow and the Crab Flare Case

    NASA Astrophysics Data System (ADS)

    Bulgarelli, A.; Trifoglio, M.; Gianotti, F.; Tavani, M.; Conforti, V.; Parmiggiani, N.

    2013-10-01

    During the first five years of the AGILE mission we have observed many gamma-ray transients of Galactic and extragalactic origin. A fast reaction to unexpected transient events is a crucial part of the AGILE monitoring program, because the follow-up of astrophysical transients is a key point for this space mission. We present the workflow and the software developed by the AGILE Team to perform the automatic analysis for the detection of gamma-ray transients. In addition, an App for iPhone will be released enabling the Team to access the monitoring system through mobile phones. In 2010 September the science alert monitoring system presented in this paper recorded a transient phenomenon from the Crab Nebula, generating an automated alert sent via email and SMS two hours after the end of an AGILE satellite orbit, i.e. two hours after the Crab flare itself: for this discovery AGILE won the 2012 Bruno Rossi prize. The design of this alert system is optimized for maximum speed, and in this, as in many other cases, AGILE has demonstrated that the reaction speed of the monitoring system is crucial for the scientific return of the mission.

  1. Network configuration management : paving the way to network agility.

    SciTech Connect

    Maestas, Joseph H.

    2007-08-01

    Sandia networks consist of nearly nine hundred routers and switches and nearly one million lines of command code, and each line ideally contributes to the capabilities of the network to convey information from one location to another. Sandia's Cyber Infrastructure Development and Deployment organizations recognize that it is therefore essential to standardize network configurations and enforce conformance to industry best business practices and documented internal configuration standards to provide a network that is agile, adaptable, and highly available. This is especially important in times of constrained budgets as members of the workforce are called upon to improve efficiency, effectiveness, and customer focus. Best business practices recommend using the standardized configurations in the enforcement process so that when root cause analysis results in recommended configuration changes, subsequent configuration auditing will improve compliance to the standard. Ultimately, this minimizes mean time to repair, maintains the network security posture, improves network availability, and enables efficient transition to new technologies. Network standardization brings improved network agility, which in turn enables enterprise agility, because the network touches all facets of corporate business. Improved network agility improves the business enterprise as a whole.

  2. Agile manufacturing and constraints management: a strategic perspective

    NASA Astrophysics Data System (ADS)

    Stratton, Roy; Yusuf, Yahaya Y.

    2000-10-01

    The definition of the agile paradigm has proved elusive and is often viewed as a panacea, in contention with more traditional approaches to operations strategy development and lacking its own methodology and tools. The Theory of Constraints (TOC) is also poorly understood, as it is commonly associated solely with production planning and control systems and bottleneck management. This paper will demonstrate the synergy between these two approaches together with the Theory of Inventive Problem Solving (TRIZ), and establish how the systematic elimination of trade-offs can support the agile paradigm. Whereas agility is often seen as a trade-off free destination, both TOC and TRIZ may be considered to be route finders, as they comprise methodologies that focus on the identification and elimination of the trade-offs that constrain the purposeful improvement of a system, be it organizational or mechanical. This paper will also show how the TOC thinking process may be combined with the TRIZ knowledge-based approach and used in breaking contradictions within agile logistics.

  3. Modified Polar-Format Software for Processing SAR Data

    NASA Technical Reports Server (NTRS)

    Chen, Curtis

    2003-01-01

    HMPF is a computer program that implements a modified polar-format algorithm for processing data from spaceborne synthetic-aperture radar (SAR) systems. Unlike prior polar-format processing algorithms, this algorithm is based on the assumption that the radar signal wavefronts are spherical rather than planar. The algorithm provides for resampling of SAR pulse data from slant range to radial distance from the center of a reference sphere that is nominally the local Earth surface. Then, invoking the projection-slice theorem, the resampled pulse data are Fourier-transformed over radial distance, arranged in the wavenumber domain according to the acquisition geometry, resampled to a Cartesian grid, and inverse-Fourier-transformed. The result of this process is the focused SAR image. HMPF, and perhaps other programs that implement variants of the algorithm, may give better accuracy than do prior algorithms for processing strip-map SAR data from high altitudes and may give better phase preservation relative to prior polar-format algorithms for processing spotlight-mode SAR data.
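
    As a rough illustration of the wavenumber-domain resampling step common to polar-format processors, the sketch below builds a synthetic phase history for one point scatterer, regrids the polar samples onto a Cartesian wavenumber grid, and inverse-Fourier-transforms to form an image. It follows the textbook planar-wavefront formulation, not HMPF's modified spherical-wavefront algorithm, and all geometry values are invented.

```python
# Highly simplified sketch of the resample-to-Cartesian-wavenumber step at the heart
# of polar-format processing (planar-wavefront textbook variant, not the modified
# spherical-wavefront HMPF algorithm).  Geometry and data here are synthetic.
import numpy as np
from scipy.interpolate import griddata

n_pulses, n_bins = 64, 128
angles = np.linspace(-0.05, 0.05, n_pulses)        # aspect angles (rad), invented
k_r = np.linspace(200.0, 220.0, n_bins)            # radial wavenumbers, invented

# Synthetic phase history of one point scatterer at (x0, y0) under the planar-
# wavefront assumption: s(k) = exp(-j * (kx*x0 + ky*y0)).
x0, y0 = 3.0, -2.0
kx = k_r[None, :] * np.cos(angles)[:, None]
ky = k_r[None, :] * np.sin(angles)[:, None]
data = np.exp(-1j * (kx * x0 + ky * y0))

# Resample the polar samples onto a Cartesian wavenumber grid, then inverse-FFT.
KX, KY = np.meshgrid(np.linspace(kx.min(), kx.max(), n_bins),
                     np.linspace(ky.min(), ky.max(), n_pulses))
pts = (kx.ravel(), ky.ravel())
cart = (griddata(pts, data.real.ravel(), (KX, KY), method="linear", fill_value=0.0)
        + 1j * griddata(pts, data.imag.ravel(), (KX, KY), method="linear", fill_value=0.0))
image = np.fft.fftshift(np.fft.ifft2(cart))
print("peak image magnitude:", np.abs(image).max())
```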

  4. The Influence of Software Complexity on the Maintenance Effort: Case Study on Software Developed within Educational Process

    ERIC Educational Resources Information Center

    Radulescu, Iulian Ionut

    2006-01-01

    Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most of the software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…

  5. Multiply-agile encryption in high speed communication networks

    SciTech Connect

    Pierson, L.G.; Witzke, E.L.

    1997-05-01

    Different applications have different security requirements for data privacy, data integrity, and authentication. Encryption is one technique that addresses these requirements. Encryption hardware, designed for use in high-speed communications networks, can satisfy a wide variety of security requirements if that hardware is key-agile, robustness-agile and algorithm-agile. Hence, multiply-agile encryption provides enhanced solutions to the secrecy, interoperability and quality of service issues in high-speed networks. This paper defines these three types of agile encryption. Next, implementation issues are discussed. While single-algorithm, key-agile encryptors exist, robustness-agile and algorithm-agile encryptors are still research topics.

  6. Obtaining Valid Safety Data for Software Safety Measurement and Process Improvement

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Zelkowitz, Marvin V.; Layman, Lucas; Dangle, Kathleen; Diep, Madeline

    2010-01-01

    We report on a preliminary case study to examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Our goal is to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. Our purpose was two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to identify potential risks due to incorrect application of the safety process, deficiencies in the safety process, or the lack of a defined process. One early outcome of this work was to show that there are structural deficiencies in collecting valid safety data that make software safety different from hardware safety. In our conclusions we present some of these deficiencies.

  7. Software quality and process improvement in scientific simulation codes

    SciTech Connect

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  8. The Introduction of Agility into Albania.

    ERIC Educational Resources Information Center

    Smith-Stevens, Eileen J.; Shkurti, Drita

    1998-01-01

    Describes a plan to introduce and achieve a national awareness of agility (and easy entry into the world market) for Albania through the relatively stable higher-education order. Agility's four strategic principles are enriching the customer, cooperating to enhance competitiveness, organizing to master change and uncertainty, and leveraging the…

  9. Transitioning from Distributed and Traditional to Distributed and Agile: An Experience Report

    NASA Astrophysics Data System (ADS)

    Wildt, Daniel; Prikladnicki, Rafael

    Global companies that experienced extensive waterfall phased plans are trying to improve their existing processes to expedite team engagement. Agile methodologies have become an acceptable path to follow because it comprises project management as part of its practices. Agile practices have been used with the objective of simplifying project control through simple processes, easy to update documentation and higher team iteration over exhaustive documentation, focusing rather on team continuous improvement and aiming to add value to business processes. The purpose of this chapter is to describe the experience of a global multinational company on transitioning from distributed and traditional to distributed and agile. This company has development centers across North America, South America and Asia. This chapter covers challenges faced by the project teams of two pilot projects, including strengths of using agile practices in a globally distributed environment and practical recommendations for similar endeavors.

  10. Process control of large-scale finite element simulation software

    SciTech Connect

    Spence, P.A.; Weingarten, L.I.; Schroder, K.; Tung, D.M.; Sheaffer, D.A.

    1996-02-01

    We have developed a methodology for coupling large-scale numerical codes with process control algorithms. Closed-loop simulations were demonstrated using the Sandia-developed finite element thermal code TACO and the commercially available finite element thermal-mechanical code ABAQUS. This new capability enables us to use computational simulations for designing and prototyping advanced process-control systems. By testing control algorithms on simulators before building and testing hardware, enormous time and cost savings can be realized. The need for a closed-loop simulation capability was demonstrated in a detailed design study of a rapid-thermal-processing reactor under development by CVC Products Inc. Using a thermal model of the RTP system as a surrogate for the actual hardware, we were able to generate response data needed for controller design. We then evaluated the performance of both the controller design and the hardware design by using the controller to drive the finite element model. The controlled simulations provided data on wafer temperature uniformity as a function of ramp rate, temperature sensor locations, and controller gain. This information, which is critical to reactor design, cannot be obtained from typical open-loop simulations.

  11. Agile robotic edge finishing system research

    SciTech Connect

    Powell, M.A.

    1995-07-01

    This paper describes a new project undertaken by Sandia National Laboratories to develop an agile, automated, high-precision edge finishing system. The project has a two-year duration and was initiated in October 1994. This project involves re-designing and adding capabilities to an existing finishing workcell at Sandia, and developing intelligent methods for automating process definition and for controlling finishing processes. The resulting system will serve as a prototype for systems that will be deployed into highly flexible automated production lines. The production systems will be used to produce a wide variety of products with limited production quantities and quick turnaround requirements. The prototype system is designed to allow programming, process definition, fixture re-configuration, and process verification to be performed off-line for new products. CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) models of the part will be used to assist with the automated process development and process control tasks. To achieve Sandia's performance goals, the system will employ advanced path planning, burr prediction expert systems, automated process definition, statistical process models in a process database, and a two-level control scheme using hybrid position-force control and fuzzy logic control. In this paper, we discuss the progress and the planned system development under this project.

  12. Fighter agility metrics, research, and test

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.; Valasek, John; Eggold, David P.

    1990-01-01

    Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation provided by the NASA Dryden Flight Research Center. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available. Simulation documentation and user instructions are provided in an appendix.

  13. Integrating a distributed, agile, virtual enterprise in the TEAM program

    NASA Astrophysics Data System (ADS)

    Cobb, C. K.; Gray, W. Harvey; Hewgley, Robert E.; Klages, Edward J.; Neal, Richard E.

    1997-01-01

    The technologies enabling agile manufacturing (TEAM) program enhances industrial capability by advancing and deploying manufacturing technologies that promote agility. TEAM has developed a product realization process that features the integration of product design and manufacturing groups. TEAM uses the tools it collects, develops, and integrates in support of the product realization process to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, forming, and electromechanical assembly. In order to provide a proof-of-principle, the material removal process has been addressed first and has been successfully demonstrated in an 'interconnected' mode. An internet-accessible intersite file manager (IFM) application has been deployed to allow geographically distributed TEAM participants to share and distribute information as the product realization process is executed. An automated inspection planning application has been demonstrated, importing a solid model from the IFM, generating an inspection plan and a part program to be used in the inspection process, and then distributing the part program to the inspection site via the IFM. TEAM seeks to demonstrate the material removal process in an integrated mode in June 1997, complete with an object-oriented framework and infrastructure. The current status and future plans for this project are presented here.

  14. Selecting Software.

    ERIC Educational Resources Information Center

    Pereus, Steven C.

    2002-01-01

    Describes a comprehensive computer software selection and evaluation process, including documenting district needs, evaluating software packages, weighing the alternatives, and making the purchase. (PKP)

  15. Implementation of New Process Models for Tailored Polymer Composite Structures into Processing Software Packages

    SciTech Connect

    Nguyen, Ba Nghiep; Jin, Xiaoshi; Wang, Jin; Phelps, Jay; Tucker III, Charles L.; Kunc, Vlastimil; Bapanapalli, Satish K.; Smith, Mark T.

    2010-02-23

    This report describes the work conducted under the Cooperative Research and Development Agreement (CRADA) (Nr. 260) between the Pacific Northwest National Laboratory (PNNL) and Autodesk, Inc. to develop and implement process models for injection-molded long-fiber thermoplastics (LFTs) in processing software packages. The structure of this report is organized as follows. After the Introduction Section (Section 1), Section 2 summarizes the current fiber orientation models developed for injection-molded short-fiber thermoplastics (SFTs). Section 3 provides an assessment of these models to determine their capabilities and limitations, and the developments needed for injection-molded LFTs. Section 4 then focuses on the development of a new fiber orientation model for LFTs. This model is termed the anisotropic rotary diffusion - reduced strain closure (ARD-RSC) model as it explores the concept of anisotropic rotary diffusion to capture the fiber-fiber interaction in long-fiber suspensions and uses the reduced strain closure method of Wang et al. to slow down the orientation kinetics in concentrated suspensions. In contrast to fiber orientation modeling, before this project, no standard model was developed to predict the fiber length distribution in molded fiber composites. Section 5 is therefore devoted to the development of a fiber length attrition model in the mold. Sections 6 and 7 address the implementations of the models in AMI, and the conclusions drawn from this work are presented in Section 8.

  16. Cordic based algorithms for software defined radio (SDR) baseband processing

    NASA Astrophysics Data System (ADS)

    Heyne, B.; Götze, J.

    2006-09-01

    This paper presents two Cordic based algorithms which may be used for digital baseband processing in OFDM and/or CDMA based communication systems. The first one is a linear least squares based multiuser detector for CDMA incorporating descrambling and despreading. The second algorithm is a pure Cordic based FFT implementation. Both algorithms can be implemented using solely Cordic based architectures (e.g. coprocessors or ASIPs). The algorithms exactly fit the needs of a multistandard terminal as they both are freely parameterizable. This applies to the accuracy of the results as well as to the parameters of the performed function (e.g. the size of the FFT).
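
    To make the CORDIC idea concrete, the following is a minimal floating-point Python sketch of the rotation-mode iteration that such a shift-and-add datapath implements. It is an illustration only, not the authors' fixed-point implementation; the function name and iteration count are our own choices.

```python
import math

def cordic_rotate(x, y, angle, iterations=16):
    """Rotate the vector (x, y) by `angle` radians using CORDIC (rotation mode).

    Each iteration uses only shifts (multiplication by 2**-i) and adds; the
    built-in CORDIC gain is compensated at the end.
    """
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))

    z = angle
    for i in range(iterations):
        d = 1.0 if z >= 0.0 else -1.0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return x * gain, y * gain

# An FFT twiddle-factor multiply W_N^k * (re + j*im) is just a rotation by
# -2*pi*k/N, so it can reuse the same datapath as the detector stages.
re, im = cordic_rotate(1.0, 0.0, math.pi / 4)   # approximately (0.707, 0.707)
```

    This reuse of one rotation datapath for both the FFT and the despreading/descrambling stages is what makes a Cordic-only architecture attractive for a multistandard terminal.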

  17. Towards an Improvement of Software Development Processes through Standard Business Rules

    NASA Astrophysics Data System (ADS)

    Martínez-Fernández, José L.; Martínez, Paloma; González-Cristóbal, José C.

    The automation of software development processes is a desirable goal of current software companies which would lead to a cost reduction in software production. This automation is the backbone of approaches such as Model Driven Architecture (MDA) or Software Factories. This paper proposes the use of standard Business Rules (using Rules Interchange Format, RIF) to specify application functionality along with a platform to produce automatic implementations for them. The novelty of this proposal is to introduce Business Rules at all levels of MDA architecture in a software development process, providing a supporting tool where production Business Rules are considered at every abstraction level. Production Business Rules are represented through standard languages, rule engine vendor independence is assured via automatic transformation between rule languages, and Business Rules reuse is made possible. The objective is to get the development of production Business Rules closer to non-technical people involved in the software development process through the use of natural language processing approaches, automatic transformations among models and semantic web languages such as Ontology Web Language (OWL).

  18. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  19. MULTIPROCESSOR AND DISTRIBUTED PROCESSING BIBLIOGRAPHIC DATA BASE SOFTWARE SYSTEM

    NASA Technical Reports Server (NTRS)

    Miya, E. N.

    1994-01-01

    Multiprocessors and distributed processing are undergoing increased scientific scrutiny for many reasons. It is more and more difficult to keep track of the existing research in these fields. This package consists of a large machine-readable bibliographic data base which, in addition to the usual keyword searches, can be used for producing citations, indexes, and cross-references. The data base is compiled from smaller existing multiprocessing bibliographies, and tables of contents from journals and significant conferences. There are approximately 4,000 entries covering topics such as parallel and vector processing, networks, supercomputers, fault-tolerant computers, and cellular automata. Each entry is represented by 21 fields including keywords, author, referencing book or journal title, volume and page number, and date and city of publication. The data base contains UNIX 'refer' formatted ASCII data and can be implemented on any computer running under the UNIX operating system. The data base requires approximately one megabyte of secondary storage. The documentation for this program is included with the distribution tape, although it can be purchased for the price below. This bibliography was compiled in 1985 and updated in 1988.
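
    As an illustration of the record structure described above, the sketch below parses a hypothetical entry in UNIX 'refer' format (percent-letter field codes such as %A for author and %T for title are standard refer tags); the citation itself is invented and the parser is ours, not part of the distributed package.

```python
# A hypothetical bibliography entry in UNIX 'refer' format.
entry = """\
%A J. Doe
%T A Survey of Fault-Tolerant Multiprocessor Architectures
%J Journal of Parallel Processing
%V 12
%P 101-118
%D 1984
%C New York
%K multiprocessor fault-tolerance survey
"""

def parse_refer(record):
    """Collect a refer record into a {field_code: [values]} dictionary."""
    fields = {}
    for line in record.splitlines():
        if line.startswith("%") and len(line) > 2:
            fields.setdefault(line[1], []).append(line[3:].strip())
    return fields

print(parse_refer(entry)["T"][0])   # keyword-style lookup by field code
```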

  20. On the nature of bias and defects in the software specification process

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1992-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. This paper describes the problem of bias. Additionally, this paper presents a model of the specification and design processes describing individual subprocesses in terms of precision/detail diagrams and a model of bias in multi-attribute software specifications. While studying how bias is introduced into a specification we realized that software defects and bias are dual problems of a single phenomenon. This was used to explain the large proportion of faults found during the coding phase at the Software Engineering Laboratory at NASA/GSFC.

  1. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    SciTech Connect

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix.

  2. The power and efficiency of advanced software and parallel processing

    NASA Technical Reports Server (NTRS)

    Singh, Ramen P.; Taylor, Lawrence W., Jr.

    1989-01-01

    Real-time simulation of flexible and articulating systems is difficult because of the computational burden of the time-varying calculations. Because the mobile servicing system of the NASA Space Station Freedom will handle heavy payloads by local arm manipulations and by translating along the spine of the Station, it is crucial to have real-time simulation available. To enable such a simulation to be of high fidelity and to be hosted on a modest computer, special care must be taken in formulating the structural dynamics. Frontal solution algorithms save considerable time in performing these calculations. In addition, it is necessary to take advantage of parallel processing, and the formulation must be compatible with it to gain the full benefit of both. An approach is offered which will result in high-fidelity, real-time simulation for flexible, articulating systems such as the Space Station remote servicing system.

  3. Natural language processing-based COTS software and related technologies survey.

    SciTech Connect

    Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.

    2003-09-01

    Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.

  4. ESO C Library for an Image Processing Software Environment (eclipse)

    NASA Astrophysics Data System (ADS)

    Devillard, N.

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2 GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems. Running on all Unix-like platforms, eclipse is portable. A high-level interface to Python is foreseen that would allow programmers to prototype their applications much faster than through C programs.

  5. Eclipse: ESO C Library for an Image Processing Software Environment

    NASA Astrophysics Data System (ADS)

    Devillard, Nicolas

    2011-12-01

    Written in ANSI C, eclipse is a library offering numerous services related to astronomical image processing: FITS data access, various image and cube loading methods, binary image handling and filtering (including convolution and morphological filters), 2-D cross-correlation, connected components, cube and image arithmetic, dead pixel detection and correction, object detection, data extraction, flat-fielding with robust fit, image generation, statistics, photometry, image-space resampling, image combination, and cube stacking. It also contains support for mathematical tools like random number generation, FFT, curve fitting, matrices, fast median computation, and point-pattern matching. The main feature of this library is its ability to handle large amounts of input data (up to 2GB in the current version) regardless of the amount of memory and swap available on the local machine. Another feature is the very high speed allowed by optimized C, making it an ideal base tool for programming efficient number-crunching applications, e.g., on parallel (Beowulf) systems.

  6. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    NASA Technical Reports Server (NTRS)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must include a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) composed of commercial-equivalent radiation-hardened/tolerant single-board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  7. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    ERIC Educational Resources Information Center

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  8. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    SciTech Connect

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B., Jr.; Penaflor, B.G.

    1999-06-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson "raw" data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters.

  9. Specific Requirements of Physiotherapists on the Practical Use of Software in the Therapeutical Process.

    PubMed

    Messer-Misak, Karin; Egger, Rudolf

    2016-01-01

    The current healthcare system requires more effective management. New media and technology are supposed to support the demands of the current healthcare system. Using physiotherapy as an example, the primary objective of this study was to define the specific requirements of therapists on the practical use of software covering the administration, documentation and evaluation of the entire therapy process, including a database with pictures/videos of exercises which can be adapted individually by the therapists. Another objective was to show which conditions have to be fulfilled for a successful implementation of advanced applications during the entire treatment process. The approach of mixed-methods designs was chosen. In the first part a two-stage qualitative study was carried out, followed by a quantitative survey. The results show that use of the software in the therapy-related part depends on how adaptable the software is to the special needs of the therapists, on whether the whole treatment process is mapped in the software, and on additional training being provided during professional practice in order to deploy the software successfully in the therapeutic process. PMID:27139399

  10. Social Protocols for Agile Virtual Teams

    NASA Astrophysics Data System (ADS)

    Picard, Willy

    Despite many works on collaborative networked organizations (CNOs), CSCW, groupware, workflow systems and social networks, computer support for virtual teams is still insufficient, especially support for agility, i.e. the capability of virtual team members to rapidly and cost-efficiently adapt the way they interact to changes. In this paper, requirements for computer support for agile virtual teams are presented. Next, an extension of the concept of social protocol is proposed as a novel model supporting agile interactions within virtual teams. The extended concept of social protocol consists of an extended social network and a workflow model.

  11. Web-based interactive 2D/3D medical image processing and visualization software.

    PubMed

    Mahmoudi, Seyyed Ehsan; Akhondi-Asl, Alireza; Rahmani, Roohollah; Faghih-Roohi, Shahrooz; Taimouri, Vahid; Sabouri, Ahmad; Soltanian-Zadeh, Hamid

    2010-05-01

    There are many medical image processing software tools available for research and diagnosis purposes. However, most of these tools are available only as local applications. This limits the accessibility of the software to a specific machine, and thus the data and processing power of that application are not available to other workstations. Further, there are operating system and processing power limitations which prevent such applications from running on every type of workstation. By developing web-based tools, it is possible for users to access the medical image processing functionalities wherever the internet is available. In this paper, we introduce a pure web-based, interactive, extendable, 2D and 3D medical image processing and visualization application that requires no client installation. Our software uses a four-layered design consisting of an algorithm layer, web-user-interface layer, server communication layer, and wrapper layer. To match the extendibility of current local medical image processing software, each layer is highly independent of the other layers. A wide range of medical image preprocessing, registration, and segmentation methods are implemented using open source libraries. Desktop-like user interaction is provided by using AJAX technology in the web-user-interface. For the visualization functionality of the software, the VRML standard is used to provide 3D features over the web. Integration of these technologies has allowed implementation of our purely web-based software with high functionality without requiring powerful computational resources on the client side. The user-interface is designed such that the users can select appropriate parameters for practical research and clinical studies. PMID:20022133
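
    The sketch below illustrates the kind of layer separation described above using only the Python standard library; the endpoint, function names, and toy segmentation are hypothetical and are not taken from the paper's software.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# --- algorithm layer (knows nothing about HTTP) ----------------------------
def threshold_segmentation(pixels, level):
    """Toy stand-in for a segmentation routine in the algorithm layer."""
    return [[1 if v >= level else 0 for v in row] for row in pixels]

# --- server communication layer (marshals parameters and results) ----------
class SegmentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        mask = threshold_segmentation(body["pixels"], body["level"])
        payload = json.dumps({"mask": mask}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# The web-user-interface layer would call this endpoint via AJAX and render
# the returned mask in the browser.
# HTTPServer(("", 8080), SegmentHandler).serve_forever()
```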

  12. Wired Widgets: Agile Visualization for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Gerschefske, K.; Witmer, J.

    2012-09-01

    Continued advancements in sensors and analysis techniques have resulted in a wealth of Space Situational Awareness (SSA) data, made available via tools and Service Oriented Architectures (SOA) such as those in the Joint Space Operations Center Mission Systems (JMS) environment. Current visualization software cannot quickly adapt to rapidly changing missions and data, preventing operators and analysts from performing their jobs effectively. The value of this wealth of SSA data is not fully realized, as the operators' existing software is not built with the flexibility to consume new or changing sources of data or to rapidly customize their visualization as the mission evolves. While tools like the JMS user-defined operational picture (UDOP) have begun to fill this gap, this paper presents a further evolution, leveraging Web 2.0 technologies for maximum agility. We demonstrate a flexible Web widget framework with inter-widget data sharing, publish-subscribe eventing, and an API providing the basis for consumption of new data sources and adaptable visualization. Wired Widgets offers cross-portal widgets along with a widget communication framework and development toolkit for rapid new widget development, giving operators the ability to answer relevant questions as the mission evolves. Wired Widgets has been applied in a number of dynamic mission domains including disaster response, combat operations, and noncombatant evacuation scenarios. The variety of applications demonstrates that Wired Widgets provides a flexible, data-driven solution for visualization in changing environments. In this paper, we show how, deployed in the Ozone Widget Framework portal environment, Wired Widgets can provide an agile, web-based visualization to support the SSA mission. Furthermore, we discuss how the tenets of agile visualization can generally be applied to the SSA problem space to provide operators flexibility, potentially informing future acquisition and system development.
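
    A publish-subscribe event bus of the kind described above can be sketched in a few lines; the following Python illustration is ours and does not reflect the actual Ozone Widget Framework or Wired Widgets API.

```python
# Minimal publish-subscribe bus: one widget publishes on a channel, any
# number of other widgets react without knowing about each other.
class EventBus:
    def __init__(self):
        self._subs = {}

    def subscribe(self, channel, callback):
        self._subs.setdefault(channel, []).append(callback)

    def publish(self, channel, payload):
        for callback in self._subs.get(channel, []):
            callback(payload)

bus = EventBus()
bus.subscribe("track.selected", lambda sat: print("plot widget now shows", sat))
bus.publish("track.selected", "NORAD 25544")   # one widget drives another
```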

  13. An image-processing software package: UU and Fig for optical metrology applications

    NASA Astrophysics Data System (ADS)

    Chen, Lujie

    2013-06-01

    Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier Transform [2], digital image correlation [3], camera calibration [4], etc., in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively special area of optical metrology. This paper introduces an image-processing software package: UU (data processing) and Fig (data rendering), which incorporates many useful functions to process optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, they have been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, fitting of data, 3D image processing, vector image processing, precision device control (rotary stage, PZT stage, etc), point cloud to surface reconstruction, volume rendering, batch processing, etc. The software package is currently used in a number of universities for teaching and research.

  14. CDApps: integrated software for experimental planning and data processing at beamline B23, Diamond Light Source

    PubMed Central

    Hussain, Rohanah; Benning, Kristian; Javorfi, Tamas; Longo, Edoardo; Rudd, Timothy R.; Pulford, Bill; Siligardi, Giuliano

    2015-01-01

    The B23 Circular Dichroism beamline at Diamond Light Source has been operational since 2009 and has seen visits from more than 200 user groups, who have generated large amounts of data. Based on the experience of overseeing the users’ progress at B23, four key areas requiring the most assistance are identified: planning of experiments and note-keeping; designing titration experiments; processing and analysis of the collected data; and production of experimental reports. To streamline these processes an integrated software package has been developed and made available for the users. This article summarizes the main features of the software. PMID:25723950

  15. High-Speed Time-Series CCD Photometry with Agile

    NASA Astrophysics Data System (ADS)

    Mukadam, Anjum S.; Owen, R.; Mannery, E.; MacDonald, N.; Williams, B.; Stauffer, F.; Miller, C.

    2011-12-01

    We have assembled a high-speed time-series CCD photometer named Agile for the 3.5 m telescope at Apache Point Observatory, based on the design of a photometer called Argos at McDonald Observatory. Instead of a mechanical shutter, we use the frame-transfer operation of the CCD to end an exposure and initiate the subsequent new exposure. The frame-transfer operation is triggered by the negative edge of a GPS pulse; the instrument timing is controlled directly by hardware, without any software intervention or delays. This is the central pillar in the design of Argos that we have also used in Agile; this feature makes the accuracy of instrument timing better than a millisecond. Agile is based on a Princeton Instruments Acton VersArray camera with a frame-transfer CCD, which has 1K × 1K active pixels, each of size 13 μm × 13 μm. Using a focal reducer at the Nasmyth focus of the 3.5 m telescope at Apache Point Observatory, we obtain a field of view of 2.2 × 2.2 arcmin² with an unbinned plate scale of 0.13″ pixel⁻¹. The CCD is back-illuminated and thinned for improved blue sensitivity and provides a quantum efficiency ≥80% in the wavelength range of 4500-7500 Å. The unbinned full-frame readout time can be as fast as 1.1 s; this is achieved using a low-noise amplifier operating at 1 MHz with an average read noise of the order of 6.6 e rms. At the slow read rate of 100 kHz, to be used for exposure times longer than a few seconds, we determine an average read noise of the order of 3.7 e rms. Agile is optimized to observe variability at short timescales from one-third of a second to several hundred seconds. The variable astronomical sources routinely observed with Agile include pulsating white dwarfs, cataclysmic variables, flare stars, planetary transits, and planetary satellite occultations.
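
    As a quick consistency check of the quoted optics figures (this is our own arithmetic, not code from the instrument), the field of view follows directly from the detector format and plate scale:

```python
# 1024 unbinned pixels at 0.13 arcsec per pixel should span about 2.2 arcmin.
pixels = 1024
plate_scale_arcsec = 0.13
side_arcmin = pixels * plate_scale_arcsec / 60.0
print(f"{side_arcmin:.2f} arcmin per side")   # ~2.22, matching the quoted 2.2 x 2.2 field
```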

  16. ICESat (GLAS) Science Processing Software Document Series. Volume 3; GLAS Science Software Requirements Document; Ver 2.1

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Lee, Jeffrey; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    This document addresses the software requirements of the Geoscience Laser Altimeter System (GLAS) Standard Data Software (SDS) supporting the GLAS instrument on the EOS ICESat Spacecraft. This Software Requirements Document represents the initial collection of the technical engineering information for the GLAS SDS. This information is detailed within the second of four main volumes of the Standard documentation, the Product Specification volume. This document is a "roll-out" from the governing volume outline containing the Concept and Requirements sections.

  17. Agile manufacturing concepts and opportunities in ceramics

    SciTech Connect

    Booth, C.L.; Harmer, M.P.

    1995-08-01

    In 1991 Lehigh University facilitated seminars over a period of 8 months to define manufacturing needs for the 21st century. They concluded that the future will be characterized by rapid changes in technology advances, customer demands, and shifts in market dynamics, and coined the term "Agile Manufacturing". Agile manufacturing refers to the ability to thrive in an environment of constant unpredictable change. Market opportunities are attacked by partnering to form virtual firms to dynamically obtain the required skills for each product opportunity. This paper will describe and compare agile vs. traditional concepts of organization and structure, management policy and ethics, employee environment, product focus, information, and paradigm shift. Examples of agile manufacturing applied to ceramic materials will be presented.

  18. Agility Following the Application of Cold Therapy

    PubMed Central

    Evans, Todd A.; Ingersoll, Christopher; Knight, Kenneth L.; Worrell, Teddy

    1995-01-01

    Cold application is commonly used before strenuous exercise due to its hypalgesic effects. Some have questioned this procedure because of reports that cold may reduce isokinetic torque. However, there have been no investigations of actual physical performance following cold application. The purpose of this study was to determine if a 20-minute ice immersion treatment to the foot and ankle affected the performance of three agility tests: the carioca maneuver, the cocontraction test, and the shuttle run. Twenty-four male athletic subjects were tested during two different treatment sessions following an orientation session. Subjects were tested following a 20-minute 1°C ice immersion treatment to the dominant foot and ankle and 20 minutes of rest. Following each treatment, subjects performed three trials of each agility test, with 30 seconds rest between each trial, and 1 minute between each different agility test. The order in which each subject performed the agility tests was determined by a balanced Latin square. A MANOVA with repeated measures was used to determine if there was an overall significant difference in the agility times recorded between the cold and control treatments and if the order of the treatment sessions affected the scores. Although the mean agility time scores were slightly slower following the cold treatment, cooling the foot and ankle caused no difference in agility times. Also, there was no difference resulting from the treatment orders. We felt that the slightly slower scores may have been a result of tissue stiffness and/or subject's apprehension immediately following the cold treatment. Cold application to the foot and ankle can be used before strenuous exercise without altering agility. PMID:16558341

  19. gLAB-A Fully Software Tool to Generate, Process and Analyze GNSS Signals

    NASA Astrophysics Data System (ADS)

    Dionisio, Cesare; Citterico, Dario; Pirazzi, Gabriele; De Quattro, Nicola; Marracci, Riccardo; Cucchi, Luca; Valdambrini, Nicola; Formaioni, Irene

    2010-08-01

    In this paper the concept of Software Defined Radio (SDR) and its use in modern GNSS receivers is highlighted, demonstrating how software receivers are important in many situations, especially for verification and validation. After a brief introduction of gLAB, a fully software-based, highly modular tool to generate, process and analyze current and future GNSS signals, the different software modules are described. Demonstrating the wide range of uses of gLAB, different practical examples are briefly overviewed: from the analysis of real data over the experimental GIOVE-B satellite, to antenna group delay determination or CN0 estimation under a wide dynamic range, etc. gLAB is the result of different projects led by Intecs in GNSS SW Radio: the signal generator is the result of the SWAN (Sistemi softWare per Applicazioni di Navigazione) project under Italian Space Agency (ASI) contract, while the analyzer and the processing module have been developed for ESA to verify and validate the IOV (In Orbit Validation) Galileo phase. In this case the GNSS SW RX works in parallel with Test User Receivers (TUR) in order to validate the Signal In Space (SiS). It is remarkable that gLAB is the result of over three years of development and approximately one year of test and validation under ESA (European Space Agency) supervision.

  20. SuperAGILE Services at ASDC

    SciTech Connect

    Preger, B.; Verrecchia, F.; Pittori, C.; Antonelli, L. A.; Giommi, P.; Lazzarotto, F.; Evangelista, Y.

    2008-05-22

    The Italian Space Agency Science Data Center (ASDC) is a facility with several responsibilities, including support for all the ASI scientific missions in the management and archiving of data, acting as the interface between ASI and the scientific community, and providing on-line access to the hosted data. In this poster we describe the services that ASDC provides for SuperAGILE, in particular the ASDC public web pages devoted to the dissemination of SuperAGILE scientific results. SuperAGILE is the X-ray imager onboard the AGILE mission, and provides the scientific community with orbit-by-orbit information on the observed sources. Crucial source information, including position and flux in chosen energy bands, will be reported in the SuperAGILE public web page at ASDC. Given their particular interest, another web page will be dedicated entirely to GRBs and other transients, where new event alerts will be notified and where users will find all the available information on the GRBs detected by SuperAGILE.

  1. Making Visible the Coding Process: Using Qualitative Data Software in a Post-Structural Study

    ERIC Educational Resources Information Center

    Ryan, Mary

    2009-01-01

    Qualitative research methods require transparency to ensure the "trustworthiness" of the data analysis. The intricate processes of organising, coding and analysing the data are often rendered invisible in the presentation of the research findings, which requires a "leap of faith" for the reader. Computer assisted data analysis software can be used…

  2. Effects of Reflective Thinking in the Process of Designing Software on Students' Learning Performances

    ERIC Educational Resources Information Center

    Hsieh, Pei-Hsuan; Chen, Nian-Shing

    2012-01-01

    The purpose of this study is to examine the effects of reflective thinking in the process of designing software on students' learning performances. The study contends that reflective thinking is a useful teaching strategy to improve learning performance among lower achieving students. Participants were students from two groups: Higher…

  3. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
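
    The mapping steps (2)-(5) can be pictured as successive table lookups; the following Python sketch uses invented profile, criteria, and product names purely to illustrate the idea and is not taken from the cited method.

```python
# Hypothetical quality-needs profile and mapping tables.
quality_needs = ["reliability", "maintainability"]

needs_to_criteria = {
    "reliability": ["fault tolerance", "test coverage"],
    "maintainability": ["modularity", "documentation completeness"],
}
criteria_to_products = {
    "fault tolerance": ["FMEA report"],
    "test coverage": ["unit test plan", "coverage report"],
    "modularity": ["design/interface specification"],
    "documentation completeness": ["maintenance manual"],
}

criteria = [c for need in quality_needs for c in needs_to_criteria[need]]
products = sorted({p for c in criteria for p in criteria_to_products[c]})
print(products)   # the tailored subset of life-cycle information products
```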

  4. Toward Understanding the Cognitive Processes of Software Design in Novice Programmers

    ERIC Educational Resources Information Center

    Yeh, Kuo-Chuan

    2009-01-01

    This study provides insights with regard to the types of cognitive processes that are involved in the formation of mental models and the way those models change over the course of a semester in novice programmers doing a design task. Eight novice programmers participated in this study for three distinct software design sessions, using the same…

  5. Connecting the Library's Patron Database to Campus Administrative Software: Simplifying the Library's Accounts Receivable Process

    ERIC Educational Resources Information Center

    Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth

    2010-01-01

    This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…

  6. Year 2000 compliance concerns with the ISA Thermoluminescent Dosimetry Data Processing (TL-DP) software system

    SciTech Connect

    Saviz, K.

    1998-05-26

    The year 2000 is rapidly approaching, and there is a good chance that computer systems that utilize two-digit year dates will experience problems in retrieving date information. The ISA Thermoluminescent Dosimetry Data Processing (TL-DP) software and computer system has been reviewed for Year 2000 compliance issues.
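
    The underlying issue is easy to illustrate; the windowing rule below is a generic, hypothetical fix for two-digit years, not the remedy adopted for the TL-DP system.

```python
# Two-digit years are ambiguous: "05" could mean 1905 or 2005. A common
# workaround is a pivot window (here, years below 70 map to 20xx).
def expand_two_digit_year(yy):
    return 2000 + yy if yy < 70 else 1900 + yy

print(expand_two_digit_year(98), expand_two_digit_year(5))   # 1998 2005
```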

  7. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products, to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  8. Complex software dedicated for design and simulation of LPCE process for heavy water detritiation

    SciTech Connect

    Bornea, A.; Petrutiu, C.; Zamfirache, M.

    2015-03-15

    The main purpose of this paper is to present a comprehensive software package, SICA, designed to be used for the water-hydrogen liquid phase catalytic exchange (LPCE) process. The software calculates the water-gas catalytic isotopic exchange process, following the transfer of any H, D or T isotope from water to gas and vice versa. This software is useful for both design and laboratory-based research; the type of catalytic filling (ordered or random) can be defined for either of these two cases, the isotopic calculation being specific to the package type. For laboratory-based research, the performance of a catalytic packing can be determined by knowing the type and by using experimental results. Performance of the mixed catalytic packing is defined by mass transfer constants for each catalytic and hydrophilic package in that specific arrangement, and also for the isotope whose transfer from one phase to another is studied. A link has also been established between these constants and the commonly used packing-performance parameter HETP (Height Equivalent of a Theoretical Plate). To demonstrate the performance of the software, we present a comparative analysis of water-gas catalytic isotopic exchange on a column equipped with three types of filling: successive layers, random, and structured (ordered packing filled with catalyst). The program can be used for LPCE process calculations, a process used at detritiation facilities for CANDU reactors or fusion reactors. (authors)
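
    For readers unfamiliar with the packing-performance parameter mentioned above, HETP is simply the packed column height divided by the number of theoretical separation stages it achieves; the minimal sketch below is ours and is not part of SICA.

```python
def hetp(column_height_m, n_theoretical_plates):
    """Height Equivalent of a Theoretical Plate for a packed column."""
    return column_height_m / n_theoretical_plates

print(hetp(2.0, 25))   # a 2 m column achieving 25 stages -> HETP = 0.08 m
```

    A smaller HETP therefore indicates a better-performing packing, which is why it is a convenient yardstick for comparing the three filling types studied in the paper.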

  9. List processing software for the LeCroy 1821 Segment Manager Interface

    SciTech Connect

    Dorries, T.; Moore, C.; Pordes, R.

    1987-05-01

    Many experiments at Fermilab now include some FASTBUS electronics in their data readout. The software reported in this paper provides general support for the LeCroy 1821 interface. The list processing device drivers allow FASTBUS data to be read out efficiently into the Fermilab Computing Department supported data acquisition systems.

  10. The Effects of Beacons, Comments, and Tasks on Program Comprehension Process in Software Maintenance

    ERIC Educational Resources Information Center

    Fan, Quyin

    2010-01-01

    Program comprehension is the most important and frequent process in software maintenance. Extensive research has found that individual characteristics of programmers, differences of computer programs, and differences of task-driven motivations are the major factors that affect the program comprehension results. There is no study specifically…

  11. ProcessGene-Connect: SOA Integration between Business Process Models and Enactment Transactions of Enterprise Software Systems

    NASA Astrophysics Data System (ADS)

    Wasser, Avi; Lincoln, Maya

    In recent years, both practitioners and applied researchers have become increasingly interested in methods for integrating business process models and enterprise software systems through the deployment of enabling middleware. Integrative BPM research has been mainly focusing on the conversion of workflow notations into enacted application procedures, and less effort has been invested in enhancing the connectivity between design level, non-workflow business process models and related enactment systems such as: ERP, SCM and CRM. This type of integration is useful at several stages of an IT system lifecycle, from design and implementation through change management, upgrades and rollout. The paper presents an integration method that utilizes SOA for connecting business process models with corresponding enterprise software systems. The method is then demonstrated through an Oracle E-Business Suite procurement process and its ERP transactions.

  12. Cassini's Maneuver Automation Software (MAS) Process: How to Successfully Command 200 Navigation Maneuvers

    NASA Technical Reports Server (NTRS)

    Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.

    2008-01-01

    To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.

  13. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  14. Reducing the complexity of the software design process with object-oriented design

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Designing software is a complex process. How object-oriented design (OOD), coupled with formalized documentation and tailored object diagraming techniques, can reduce the complexity of the software design process is described and illustrated. The described OOD methodology uses a hierarchical decomposition approach in which parent objects are decomposed into layers of lower level child objects. A method of tracking the assignment of requirements to design components is also included. Increases in the reusability, portability, and maintainability of the resulting products are also discussed. This method was built on a combination of existing technology, teaching experience, consulting experience, and feedback from design method users. The discussed concepts are applicable to hierarchical OOD processes in general. Emphasis is placed on improving the design process by documenting the details of the procedures involved and incorporating improvements into those procedures as they are developed.

  15. Spatial data software integration - Merging CAD/CAM/mapping with GIS and image processing

    NASA Technical Reports Server (NTRS)

    Logan, Thomas L.; Bryant, Nevin A.

    1987-01-01

    The integration of CAD/CAM/mapping with image processing using geographic information systems (GISs) as the interface is examined. Particular emphasis is given to the development of software interfaces between JPL's Video Image Communication and Retrieval (VICAR)/Imaged Based Information System (IBIS) raster-based GIS and the CAD/CAM/mapping system. The design and functions of the VICAR and IBIS are described. Vector data capture and editing are studied. Various software programs for interfacing between the VICAR/IBIS and CAD/CAM/mapping are presented and analyzed.

  16. Waste receiving and processing facility module 1 data management system software project management plan

    SciTech Connect

    Clark, R.E.

    1994-11-02

    This document provides the software development plan for the Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store, and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  17. Waste Receiving and Processing Facility Module 1 Data Management System Software Requirements Specification

    SciTech Connect

    Brann, E.C. II

    1994-09-09

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  18. Waste Receiving and Processing Facility Module 1 Data Management System software requirements specification

    SciTech Connect

    Rosnick, C.K.

    1996-04-19

    This document provides the software requirements for Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-0126). The DMS will collect, store and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal.

  19. GenomeTools: a comprehensive software library for efficient processing of structured genome annotations.

    PubMed

    Gremme, Gordon; Steinbiss, Sascha; Kurtz, Stefan

    2013-01-01

    Genome annotations are often published as plain text files describing genomic features and their subcomponents by an implicit annotation graph. In this paper, we present the GenomeTools, a convenient and efficient software library and associated software tools for developing bioinformatics software intended to create, process or convert annotation graphs. The GenomeTools strictly follow the annotation graph approach, offering a unified graph-based representation. This gives the developer intuitive and immediate access to genomic features and tools for their manipulation. To process large annotation sets with low memory overhead, we have designed and implemented an efficient pull-based approach for sequential processing of annotations. This makes it possible to handle even the largest annotation sets, such as a complete catalogue of human variations. Our object-oriented C-based software library enables developers to conveniently implement their own functionality on annotation graphs and to integrate it into larger workflows, simultaneously accessing compressed sequence data if required. The careful C implementation of the GenomeTools not only ensures a lightweight memory footprint while allowing full sequential as well as random access to the annotation graph, but also facilitates the creation of bindings to a variety of scripting languages (like Python and Ruby) sharing the same interface. PMID:24091398
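
    The pull-based idea can be sketched with a Python generator that yields one feature at a time; this illustration is ours and does not mirror the GenomeTools C API or its language bindings.

```python
def pull_features(gff3_path):
    """Lazily yield (seqid, type, start, end) tuples from a GFF3 file."""
    with open(gff3_path) as handle:
        for line in handle:
            if line.startswith("#") or not line.strip():
                continue
            cols = line.rstrip("\n").split("\t")
            if len(cols) < 5:
                continue                      # skip malformed lines in this sketch
            yield cols[0], cols[2], int(cols[3]), int(cols[4])

# Downstream stages consume the stream without materialising it, e.g.:
# gene_count = sum(1 for _, ftype, _, _ in pull_features("annotation.gff3")
#                  if ftype == "gene")
```

    Because each feature is discarded as soon as the consumer is done with it, memory use stays flat no matter how large the annotation file is, which is the point of the pull-based design.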

  20. Polarization information processing and software system design for simultaneously imaging polarimetry

    NASA Astrophysics Data System (ADS)

    Wang, Yahui; Liu, Jing; Jin, Weiqi; Wen, Renjie

    2015-08-01

    Simultaneous imaging polarimetry can realize real-time polarization imaging of a dynamic scene, which has wide application prospects. This paper first briefly illustrates the design of the double separate Wollaston prism simultaneous imaging polarimeter, and then emphasis is placed on the polarization information processing methods and software system design for the designed polarimeter. The polarization information processing methods consist of adaptive image segmentation, high-accuracy image registration, and instrument matrix calibration. Morphological image processing (dilation) was used for image segmentation; the accuracy of image registration can reach 0.1 pixel based on spatial- and frequency-domain cross-correlation; instrument matrix calibration adopted a four-point calibration method. The software system was implemented under the Windows environment in the C++ programming language, and it realizes synchronous acquisition and storage of polarization images, image processing, and polarization information extraction and display. Polarization data obtained with the designed polarimeter show that the polarization information processing methods and the software system effectively perform real-time measurement of the four Stokes parameters of a scene. The polarization information processing methods effectively improved the polarization detection accuracy.
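
    As a concrete illustration of the Stokes-parameter step, the sketch below computes the linear Stokes images and the degree and angle of linear polarization from four co-registered intensity images; the analyzer angles (0, 45, 90, 135 degrees) are an assumption made here for illustration and are not taken from the paper.

        import numpy as np

        def linear_stokes(i0, i45, i90, i135):
            """Textbook linear Stokes images from four analyzer orientations."""
            s0 = 0.5 * (i0 + i45 + i90 + i135)      # total intensity
            s1 = i0 - i90                           # 0 deg minus 90 deg
            s2 = i45 - i135                         # +45 deg minus -45 deg
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)  # degree of linear polarization
            aolp = 0.5 * np.arctan2(s2, s1)                        # angle of linear polarization
            return s0, s1, s2, dolp, aolp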

  1. A Software Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Donald J.; Martin, Richard E.; Seebo, Jeff P.; Trinh, Long B.; Walker, James L.; Winfree, William P.

    2007-01-01

    Ultrasonic, microwave, and terahertz nondestructive evaluation imaging systems generally require the acquisition of waveforms at each scan point to form an image. For such systems, signal and image processing methods are commonly needed to extract information from the waves and improve resolution of, and highlight, defects in the image. Since all waveform-based NDE methods share some similarity, a common software platform containing multiple signal and image processing techniques makes sense where multiple techniques, scientists, engineers, and organizations are involved. This presentation describes NASA Glenn Research Center's approach to developing a common software platform for processing waveform-based NDE signals and images. This platform is currently in use at NASA Glenn and at Lockheed Martin Michoud Assembly Facility for processing of pulsed terahertz and ultrasonic data. Highlights of the software operation will be given. A case study will be shown for use with terahertz data. The authors also invite scientists and engineers who are interested in sharing customized signal and image processing algorithms to contribute to this effort by letting the authors code up and include these algorithms in future releases.
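
    As a sketch of the kind of per-point waveform reduction such a platform performs, the snippet below turns a stack of scan-point waveforms into a simple amplitude image using a time gate; the array layout and gate values are assumptions, not the platform's actual processing chain.

        import numpy as np

        def c_scan(waveforms, dt, gate=(1.0e-6, 3.0e-6)):
            """Reduce a (ny, nx, nsamples) array of waveforms to a 2-D image of
            peak amplitude inside a time gate (gate in seconds)."""
            t = np.arange(waveforms.shape[-1]) * dt
            mask = (t >= gate[0]) & (t <= gate[1])
            return np.abs(waveforms[..., mask]).max(axis=-1)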

  2. An agile mask data preparation and writer dispatching approach

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-tung; Chen, Y. S.; Hsin, S. C.; Tuo, Laurent C.; Schulze, Steffen F.

    2004-08-01

    An agile mask data preparation (MDP) approach is proposed to cut the re-fracture cycle time incurred by mask writer dispatching policy changes. Shorter re-fracture cycle time increases the flexibility of mask writer dispatching; as a result, mask writer capacity can be utilized to its optimum. Preliminary results demonstrate promising benefits in MDP cycle time reduction and writer dispatching flexibility improvement. The agile MDP can save up to 40% of re-fracture cycle time. OASIS (Open Artwork System Interchange Standard) was proposed to address the GDSII file size explosion problem. However, OASIS has yet to gain wide acceptance in the mask industry. The authors envision OASIS adoption by the mask industry as a three-phase process and identify key issues of each phase from the mask manufacturer's perspective. As a long-term MDP flow reengineering project, an agile MDP and writer dispatching approach based on OASIS is proposed. The paper describes the results of an extensive evaluation of OASIS performance compared to that of GDSII, for both original GDSII and post-OPC GDSII files. For eighty percent of the original GDSII files, the file size is more than ten times that of the OASIS counterpart.

  3. The (mis)use of subjective process measures in software engineering

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Condon, Steven E.

    1993-01-01

    A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing subjective measure data typically involves developing some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed, e.g., high experience versus low experience; high, medium, or low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
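
    A toy sketch of turning raw facts into such an ordinal scale is shown below; the scoring formula and thresholds are invented purely for illustration and are not the SEL's definition of team experience.

        def experience_level(years, prior_similar_projects):
            """Map two raw facts to a high/medium/low rating (illustrative thresholds)."""
            score = min(years, 10) + 2 * min(prior_similar_projects, 5)
            if score >= 12:
                return "high"
            if score >= 6:
                return "medium"
            return "low"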

  4. An Examination of an Information Security Framework Implementation Based on Agile Values to Achieve Health Insurance Portability and Accountability Act Security Rule Compliance in an Academic Medical Center: The Thomas Jefferson University Case Study

    ERIC Educational Resources Information Center

    Reis, David W.

    2012-01-01

    Agile project management is most often examined in relation to software development, while information security frameworks are often examined with respect to certain risk management capabilities rather than in terms of successful implementation approaches. This dissertation extended the study of both Agile project management and information…

  5. Analyses of requirements for computer control and data processing experiment subsystems: Image data processing system (IDAPS) software description (7094 version), volume 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A description of each of the software modules of the Image Data Processing System (IDAPS) is presented. The changes in the software modules are the result of additions to the application software of the system and an upgrade of the IBM 7094 Mod(1) computer to a 1301 disk storage configuration. Necessary information about IDAPS software is supplied to the computer programmer who desires to make changes in the software system or who desires to use portions of the software outside of the IDAPS system. Each software module is documented with: module name, purpose, usage, common block(s) description, method (algorithm of subroutine), flow diagram (if needed), subroutines called, and storage requirements.

  6. Production planning tools and techniques for agile manufacturing

    SciTech Connect

    Kjeldgaard, E.A.; Jones, D.A.; List, G.F.; Turnquist, M.A.

    1996-10-01

    Effective use of resources shared among multiple products or processes is critical for agile manufacturing. This paper describes development and implementation of a computerized model to support production planning in a complex manufacturing system at Pantex Plant. The model integrates two different production processes (nuclear weapon dismantlement and stockpile evaluation) which use common facilities and personnel, and reflects the interactions of scheduling constraints, material flow constraints, and resource availability. These two processes reflect characteristics of flow-shop and job-shop operations in a single facility. Operational results from using the model are also discussed.

  7. Software Process Improvement Initiatives Based on Quality Assurance Strategies: A QATAM Pilot Application

    NASA Astrophysics Data System (ADS)

    Winkler, Dietmar; Elberzhager, Frank; Biffl, Stefan; Eschbach, Robert

    Quality Assurance (QA) strategies, i.e., bundles of verification and validation approaches embedded within a balanced software process, can support project and quality managers in systematically planning and implementing improvement initiatives. New and modified processes and methods that seem to be promising candidates for improvement come up frequently. Nevertheless, the impact of processes and methods strongly depends on individual project contexts. A major challenge is how to systematically select and implement "best practices" for product construction, verification, and validation. In this paper we present the Quality Assurance Tradeoff Analysis Method (QATAM), which supports engineers in (a) systematically identifying candidate QA strategies and (b) evaluating QA strategy variants in a given project context. We evaluate feasibility and usefulness in a pilot application in a medium-size software engineering organization. The main results were that QATAM was considered useful for identifying and evaluating various improvement initiatives, applicable to large organizations as well as to small and medium enterprises.

  8. BioSig: The Free and Open Source Software Library for Biomedical Signal Processing

    PubMed Central

    Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals. PMID:21437227
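
    As a generic example of the kind of processing step such a library supports (this uses SciPy for illustration and is not the BioSig API), the snippet below band-pass filters a signal and computes a simple band-power feature; the sampling rate and band edges are arbitrary assumptions.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def bandpass(signal, fs, low=8.0, high=30.0, order=4):
            """Zero-phase band-pass filter, e.g. to isolate one EEG band."""
            b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
            return filtfilt(b, a, signal)

        fs = 256.0                                    # assumed sampling rate in Hz
        eeg = np.random.randn(int(10 * fs))           # stand-in for a recorded channel
        band_power = np.mean(bandpass(eeg, fs) ** 2)  # a simple feature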

  9. The frequency-agile radar: A multifunctional approach to remote sensing of the ionosphere

    NASA Astrophysics Data System (ADS)

    Tsunoda, R. T.; Livingston, R. C.; Buonocore, J. J.; McKinley, A. V.

    1995-09-01

    We introduce a new kind of diagnostic sensor that combines multifunctional measurement capabilities for ionospheric research. Multifunctionality is realized through agility in frequency selection over an extended band (1.5 to 50 MHz), system modularity, complete system control by software written in C, and a user-friendly computer interface. This sensor, which we call the frequency-agile radar (FAR), incorporates dual radar channels and an arbitrary waveform synthesizer that allows creative design of sophisticated waveforms as a means of increasing its sensitivity to weak signals while minimizing loss in radar resolution. The sensitivity of the FAR is determined by two sets of power amplifier modules: four 4-kW solid-state broadband amplifiers, and four 30-kW vacuum tube amplifiers. FAR control is by an AT-bus personal computer with on-line processing by a programmable array processor. The FAR does not simply house the separate functions of most radio sensors in use today, it provides convenient and flexible access to those functions as elements to be used in any combination. Some of the first new results obtained with the FAR during recent field campaigns are presented to illustrate its versatility. These include (1) the first detection of anomalous high-frequency (HF) reflections from a barium ion cloud, (2) the first evidence of unexpectedly large drifts and a shear north of the equatorial electrojet, (3) the first HF radar signature of a developing equatorial plasma bubble, and (4) the first measurements by a portable radar of altitude-extended, quasi-periodic backscatter from midlatitude sporadic E. We also mention the potential of the FAR for atmospheric remote sensing.

  10. Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application

    NASA Astrophysics Data System (ADS)

    Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.

    2013-12-01

    The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility and cost of Cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and the cost of running Giovanni on the Amazon Cloud were compared to similar parameters for the GES DISC local operational system. A Giovanni Time-Series analysis of aerosol absorption optical depth (388nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the Cloud and local system to avoid data transfer delays. Time series of 3, 6, 12, and 24 months were analyzed on both the Cloud and the local system, and the processing times for the analyses were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple Cloud platforms. The cost of using a Cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. The Cloud computing cost is calculated based on the hourly rate, and the storage cost is calculated based on the rate of Gigabytes per month. Incoming data transfer is free; for data transfer out, the cost is based on a per-Gigabyte rate. The costs for a local server system consist of buying hardware/software, system maintenance/updating, and operating costs. The results showed that the Cloud platform had a 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
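
    The cost model sketched below mirrors the structure described above (hourly compute, per-Gigabyte-per-month storage, and per-Gigabyte egress versus amortized hardware plus operations); all rates and quantities are hypothetical placeholders, not the figures from this study.

        def monthly_cloud_cost(cpu_hours, storage_gb, egress_gb,
                               hourly_rate=0.10, storage_rate=0.03, egress_rate=0.09):
            """Cloud cost = compute + storage + data transfer out (incoming is free)."""
            return cpu_hours * hourly_rate + storage_gb * storage_rate + egress_gb * egress_rate

        def monthly_local_cost(hardware_price, lifetime_months, admin_cost, power_cost):
            """Local cost = amortized hardware/software + maintenance + operations."""
            return hardware_price / lifetime_months + admin_cost + power_cost

        cloud = monthly_cloud_cost(cpu_hours=400, storage_gb=500, egress_gb=50)
        local = monthly_local_cost(hardware_price=12000, lifetime_months=48,
                                   admin_cost=300, power_cost=60)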

  11. Generalized Software Architecture Applied to the Continuous Lunar Water Separation Process and the Lunar Greenhouse Amplifier

    NASA Technical Reports Server (NTRS)

    Perusich, Stephen; Moos, Thomas; Muscatello, Anthony

    2011-01-01

    This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not
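
    A minimal sketch of the configuration-driven monitoring-and-logging idea is given below; the channel names, simulated readings, and file name are hypothetical and do not reflect the actual CLWSP/LGA code.

        import csv, random, time

        SENSORS = {                          # hypothetical channel configuration
            "TC_inlet": "thermocouple",
            "PT_vacuum": "pressure",
            "RH_cabinet": "relative humidity",
        }

        def read_sensor(name):
            """Stand-in for a hardware read; returns a simulated value."""
            return random.uniform(0.0, 100.0)

        with open("run_log.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["time", "note", *SENSORS])
            writer.writeheader()
            row = {"time": time.time(), "note": "operator comment goes here"}
            row.update({name: read_sensor(name) for name in SENSORS})
            writer.writerow(row)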

  12. Processing strategies and software solutions for data-independent acquisition in mass spectrometry.

    PubMed

    Bilbao, Aivett; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Hopfgartner, Gérard; Müller, Markus; Lisacek, Frédérique

    2015-03-01

    Data-independent acquisition (DIA) offers several advantages over data-dependent acquisition (DDA) schemes for characterizing complex protein digests analyzed by LC-MS/MS. In contrast to the sequential detection, selection, and analysis of individual ions during DDA, DIA systematically parallelizes the fragmentation of all detectable ions within a wide m/z range regardless of intensity, thereby providing broader dynamic range of detected signals, improved reproducibility for identification, better sensitivity, and accuracy for quantification, and, potentially, enhanced proteome coverage. To fully exploit these advantages, composite or multiplexed fragment ion spectra generated by DIA require more elaborate processing algorithms compared to DDA. This review examines different DIA schemes and, in particular, discusses the concepts applied to and related to data processing. Available software implementations for identification and quantification are presented as comprehensively as possible and examples of software usage are cited. Processing workflows, including complete proprietary frameworks or combinations of modules from different open source data processing packages are described and compared in terms of software availability and usability, programming language, operating system support, input/output data formats, as well as the main principles employed in the algorithms used for identification and quantification. This comparative study concludes with further discussion of current limitations and expectable improvements in the short- and midterm future. PMID:25430050

  13. Fighter agility metrics. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Liefer, Randall K.

    1990-01-01

    Fighter flying qualities and combat capabilities are currently measured and compared in terms relating to vehicle energy, angular rates and sustained acceleration. Criteria based on these measurable quantities have evolved over the past several decades and are routinely used to design aircraft structures, aerodynamics, propulsion and control systems. While these criteria, or metrics, have the advantage of being well understood, easily verified and repeatable during test, they tend to measure the steady state capability of the aircraft and not its ability to transition quickly from one state to another. Proposed new metrics to assess fighter aircraft agility are collected and analyzed. A framework for classification of these new agility metrics is developed and applied. A complete set of transient agility metrics is evaluated with a high fidelity, nonlinear F-18 simulation. Test techniques and data reduction methods are proposed. A method of providing cuing information to the pilot during flight test is discussed. The sensitivity of longitudinal and lateral agility metrics to deviations from the pilot cues is studied in detail. The metrics are shown to be largely insensitive to reasonable deviations from the nominal test pilot commands. Instrumentation required to quantify agility via flight test is also considered. With one exception, each of the proposed new metrics may be measured with instrumentation currently available.

  14. Gamma-ray Astrophysics with AGILE

    SciTech Connect

    Longo, Francesco; Tavani, M.; Barbiellini, G.; Argan, A.; Basset, M.; Boffelli, F.; Bulgarelli, A.; Caraveo, P.; Cattaneo, P.; Chen, A.; Costa, E.; Del Monte, E.; Di Cocco, G.; Di Persio, G.; Donnarumma, I.; Feroci, M.; Fiorini, M.; Foggetta, L.; Froysland, T.; Frutti, M.

    2007-07-12

    AGILE will explore the gamma-ray Universe with a very innovative instrument combining for the first time a gamma-ray imager and a hard X-ray imager. AGILE will be operational in spring 2007 and it will provide crucial data for the study of Active Galactic Nuclei, Gamma-Ray Bursts, unidentified gamma-ray sources, Galactic compact objects, supernova remnants, TeV sources, and fundamental physics by microsecond timing. The AGILE instrument is designed to simultaneously detect and image photons in the 30 MeV - 50 GeV and 15 - 45 keV energy bands with excellent imaging and timing capabilities, and a large field of view covering approximately 1/5 of the entire sky at energies above 30 MeV. A CsI calorimeter is capable of GRB triggering in the energy band 0.3-50 MeV. AGILE is now (March 2007) undergoing launcher integration and testing. The PSLV launch is planned in spring 2007. AGILE is then foreseen to be fully operational during the summer of 2007.

  15. Enterprise Technologies Deployment for Agile Manufacturing

    SciTech Connect

    Neal, R.E.

    1992-11-01

    This report is intended for high-level technical planners who are responsible for planning future developments for their company or Department of Energy/Defense Programs (DOE/DP) facilities. On one hand, the information may be too detailed or contain too much manufacturing technology jargon for a high-level, nontechnical executive, while at the same time an expert in any of the four infrastructure fields (Product Definition/Order Entry, Planning and Scheduling, Shop Floor Management, and Intelligent Manufacturing Systems) will know more than is conveyed here. The purpose is to describe a vision of technology deployment for an agile manufacturing enterprise. According to the 21st Century Manufacturing Enterprise Strategy, the root philosophy of agile manufacturing is that "competitive advantage in the new systems will belong to agile manufacturing enterprises, capable of responding rapidly to demand for high-quality, highly customized products." Such agility will be based on flexible technologies, skilled workers, and flexible management structures which collectively will foster cooperative initiatives in and among companies. The remainder of this report is dedicated to sharpening our vision and to establishing a framework for defining specific project or pre-competitive project goals which will demonstrate agility through technology deployment.

  16. MMX-I: data-processing software for multimodal X-ray imaging and tomography

    PubMed Central

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-01-01

    A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors’ knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159

  17. A comprehensive software system for image processing and programming. Final report

    SciTech Connect

    Rasure, J.; Hallett, S.; Jordan, R.

    1994-12-31

    XVision is an example of a comprehensive software system dedicated to the processing of multidimensional scientific data. Because it is comprehensive it is necessarily complex. This design complexity is dealt with by considering XVision as nine overlapping software systems, their components and the required standards. The complexity seen by a user of XVision is minimized by the different interfaces providing access to the image processing routines as well as an interface to ease the incorporation of new routines. The XVision project has stressed the importance of having: (1) interfaces to accommodate users with differing preferences and backgrounds and (2) tools to support the programmer and the scientist. The result is a system that provides a framework for building a powerful research, education and development tool.

  18. MMX-I: data-processing software for multimodal X-ray imaging and tomography.

    PubMed

    Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea

    2016-05-01

    A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors' knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159

  19. How can usability measurement affect the re-engineering process of clinical software procedures?

    PubMed

    Terazzi, A; Giordano, A; Minuco, G

    1998-01-01

    As a consequence of the dramatic improvements achieved in information technology standards in terms of single hardware and software components, efforts in the evaluation processes have been focused on the assessment of critical human factors, such as work-flow organisation, man-machine interaction and, in general, quality of use, or usability. This trend is particularly valid when applied to medical informatics, since the human component is the basis of the information processing system in the health care context. With the aim of establishing an action-research project on the evaluation and assessment of the clinical software procedures that constitute an integrated hospital information system, the authors adopted this strategy and made the measurement of perceived usability one of the main goals of the project itself; the paper reports the results of this experience. PMID:9848419

  1. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; and (3) the development of a stream-classification tool and a hydrologic assessment tool. Four computer software tools have been developed.
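
    For a flavor of what computing hydrologic indices from a gage record involves, the snippet below derives a few simple statistics from a daily discharge series; these are generic textbook examples, not the 171 indices used by the assessment process.

        import numpy as np

        def simple_flow_indices(daily_flow):
            """A few illustrative indices from a 1-D array of daily discharge."""
            q = np.asarray(daily_flow, dtype=float)
            threshold = 3.0 * np.median(q)                      # arbitrary high-flow threshold
            return {
                "mean_flow": q.mean(),
                "7day_min": np.convolve(q, np.ones(7) / 7, mode="valid").min(),
                "high_pulse_count": int(np.sum((q[1:] > threshold) & (q[:-1] <= threshold))),
            }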

  2. Laser scanner data processing and 3D modeling using a free and open source software

    SciTech Connect

    Gabriele, Fatuzzo; Michele, Mangiameli Giuseppe, Mussumeci; Salvatore, Zito

    2015-03-10

    Laser scanning is a technology that makes it possible to survey the geometry of objects quickly, with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object being surveyed, the radiation is reflected. The purpose is to build a three-dimensional digital model that reconstructs the object and supports studies regarding its design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy, together with radiometric and RGB values. This set of measured points is called a "point cloud" and allows the reconstruction of the Digital Surface Model. Although post-processing is usually performed with closed source software, whose copyright restricts free use, free and open source software can deliver comparable or better performance. Indeed, the latter can be used freely and makes it possible to inspect and even customize the source code. The work started at the Faculty of Engineering in Catania is aimed at assessing a valuable free and open source tool, MeshLab (Italian software for data processing), against a reference closed source package for data processing, RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.
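
    One simple, software-neutral way to quantify how two processed point clouds differ is a cloud-to-cloud nearest-neighbour distance, sketched below; the exported file names are hypothetical and the metric is a generic illustration rather than the comparison procedure used in the paper.

        import numpy as np
        from scipy.spatial import cKDTree

        def cloud_to_cloud_distance(cloud_a, cloud_b):
            """Mean and max nearest-neighbour distance from cloud_a to cloud_b,
            where each input is an (N, 3) array of XYZ points."""
            dist, _ = cKDTree(cloud_b).query(cloud_a, k=1)
            return dist.mean(), dist.max()

        a = np.loadtxt("statue_meshlab.xyz")      # hypothetical exported clouds
        b = np.loadtxt("statue_rapidform.xyz")
        mean_d, max_d = cloud_to_cloud_distance(a, b)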

  3. Modifications of alpha processing software to improve calculation of limits for qualitative detection

    SciTech Connect

    Kirkpatrick, J.R.

    1997-01-01

    The work described in this report was done for the Bioassay Counting Laboratory (BCL) of the Center of Excellence for Bioassay of the Analytical Services Organization at the Oak Ridge Y-12 Plant. BCL takes urine and fecal samples and tests for alpha radiation. An automated system, supplied by Canberra Industries, counts the activities in the samples and processes the results. The Canberra system includes hardware and software. The managers of BCL want to improve the accuracy of the results they report to their final customers. The desired improvements are of particular interest to the managers of BCL because the levels of alpha-emitting radionuclides in samples measured at BCL are usually so low that a significant fraction of the measured signal is due to background and to the reagent material used to extract the radioactive nuclides from the samples. Also, the background and reagent signals show a significant level of random variation. The customers at BCL requested four major modifications of the software. The requested software changes have been made and tested. The present report is in two parts. The first part describes what the modifications were supposed to accomplish. The second part describes the changes on a line-by-line basis. The second part includes listings of the changed software and discusses possible steps to correct a particular error condition. Last, the second part describes the effect of truncation errors on the standard deviations calculated from samples whose signals are very nearly the same.

  4. WHIPPET: a collaborative software environment for medical image processing and analysis

    NASA Astrophysics Data System (ADS)

    Hu, Yangqiu; Haynor, David R.; Maravilla, Kenneth R.

    2007-03-01

    While there are many publicly available software packages for medical image processing, making them available to end users in clinical and research labs remains non-trivial. An even more challenging task is to mix these packages to form pipelines that meet specific needs seamlessly, because each piece of software usually has its own input/output formats, parameter sets, and so on. To address these issues, we are building WHIPPET (Washington Heterogeneous Image Processing Pipeline EnvironmenT), a collaborative platform for integrating image analysis tools from different sources. The central idea is to develop a set of Python scripts which glue the different packages together and make it possible to connect them in processing pipelines. To achieve this, an analysis is carried out for each candidate package for WHIPPET, describing its input/output formats, parameters, ROI description methods, and scripting and extensibility, and classifying its compatibility with other WHIPPET components as image file level, scripting level, function extension level, or source code level. We then identify components that can be connected in a pipeline directly via image format conversion. We set up a TWiki server for web-based collaboration so that component analysis and task requests can be performed online, as well as project tracking, knowledge base management, and technical support. Currently WHIPPET includes the FSL, MIPAV, FreeSurfer, BrainSuite, Measure, DTIQuery, and 3D Slicer software packages, and is expanding. Users have identified several needed task modules and we report on their implementation.
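
    The glue-script idea can be sketched as follows; the tool names and flags are placeholders invented for illustration, not WHIPPET components or real package commands.

        import os
        import subprocess

        def run(cmd):
            """Run one external tool and fail loudly if it returns an error."""
            subprocess.run(cmd, check=True)

        def two_stage_pipeline(t1_image, out_dir):
            """Chain two hypothetical packages via intermediate files on disk."""
            os.makedirs(out_dir, exist_ok=True)
            stripped = os.path.join(out_dir, "stripped.nii.gz")
            segmented = os.path.join(out_dir, "segmented.nii.gz")
            run(["toolA_skullstrip", t1_image, stripped])                  # package 1
            run(["toolB_segment", "--in", stripped, "--out", segmented])   # package 2
            return segmented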

  5. An investigation of fighter aircraft agility

    NASA Technical Reports Server (NTRS)

    Valasek, John; Downing, David R.

    1993-01-01

    This report attempts to unify in a single document the results of a series of studies on fighter aircraft agility funded by the NASA Ames Research Center, Dryden Flight Research Facility and conducted at the University of Kansas Flight Research Laboratory during the period January 1989 through December 1993. New metrics proposed by pilots and the research community to assess fighter aircraft agility are collected and analyzed. The report develops a framework for understanding the context into which the various proposed fighter agility metrics fit in terms of application and testing. Since new metrics continue to be proposed, this report does not claim to contain every proposed fighter agility metric. Flight test procedures, test constraints, and related criteria are developed. Instrumentation required to quantify agility via flight test is considered, as is the sensitivity of the candidate metrics to deviations from nominal pilot command inputs, which is studied in detail. Instead of supplying specific, detailed conclusions about the relevance or utility of one candidate metric versus another, the authors have attempted to provide sufficient data and analyses for readers to formulate their own conclusions. Readers are therefore ultimately responsible for judging exactly which metrics are 'best' for their particular needs. Additionally, it is not the intent of the authors to suggest combat tactics or other actual operational uses of the results and data in this report. This has been left up to the user community. Twenty of the candidate agility metrics were selected for evaluation with high fidelity, nonlinear, non real-time flight simulation computer programs of the F-5A Freedom Fighter, F-16A Fighting Falcon, F-18A Hornet, and X-29A. The information and data presented on the 20 candidate metrics which were evaluated will assist interested readers in conducting their own extensive investigations. The report provides a definition and analysis of each metric; details

  6. Gamma-ray astrophysics with AGILE

    NASA Astrophysics Data System (ADS)

    Tavani, M.

    2003-09-01

    Gamma-ray astrophysics above 30 MeV will soon be revitalized by a new generation of high-energy detectors in space. We discuss here the AGILE Mission that will be dedicated to gamma-ray astrophysics above 30 MeV during the period 2005-2006. The main characteristics of AGILE are: (1) excellent imaging and monitoring capabilities both in the γ-ray (30 MeV - 30 GeV) and hard X-ray (10-40 keV) energy ranges (reaching an arcminute source positioning), (2) very good timing (improving by three orders of magnitude the instrumental deadtime for γ-ray detection compared to previous instruments), and (3) excellent imaging and triggering capability for Gamma-Ray Bursts. The AGILE scientific program will emphasize a quick response to gamma-ray transients and multiwavelength studies of gamma-ray sources.

  7. SuperAGILE and Gamma Ray Bursts

    SciTech Connect

    Pacciani, Luigi; Costa, Enrico; Del Monte, Ettore; Donnarumma, Immacolata; Evangelista, Yuri; Feroci, Marco; Frutti, Massimo; Lazzarotto, Francesco; Lapshov, Igor; Rubini, Alda; Soffitta, Paolo; Tavani, Marco; Barbiellini, Guido; Mastropietro, Marcello; Morelli, Ennio; Rapisarda, Massimo

    2006-05-19

    The solid-state hard X-ray imager of the AGILE gamma-ray mission -- SuperAGILE -- has a six arcmin on-axis angular resolution in the 15-45 keV range and a field of view in excess of 1 steradian. The instrument is very light: only 5 kg. It is equipped with on-board self-triggering logic and image deconvolution, and it is able to transmit the coordinates of a GRB to the ground in real time through the ORBCOMM constellation of satellites. Photon-by-photon scientific data are sent to the Malindi ground station at every contact. In this paper we review the performance of the SuperAGILE experiment (scheduled for launch in the middle of 2006) after its first on-ground calibrations, and show the prospects for Gamma Ray Bursts.

  8. Hardware and software platform for real-time processing and visualization of echographic radiofrequency signals.

    PubMed

    Scabia, Marco; Biagi, Elena; Masotti, Leonardo

    2002-10-01

    In this paper the architecture of a hardware and software platform for ultrasonic investigation is presented. The platform, used in conjunction with analog front-end hardware for driving the ultrasonic transducers of any commercial echograph that provides access to the radiofrequency echo signal, makes available a powerful echographic system for experimenting with any processing technique, including in clinical environments where real-time operation is an essential prerequisite. The platform transforms any echograph into a test system for evaluating the diagnostic effectiveness of new investigation techniques. A dedicated user interface was designed to allow real-time, simultaneous visualization of the results produced in the different stages of the chosen processing procedure, with the aim of better optimizing the processing algorithm. The most important aspect of the platform, which also constitutes the basic differentiation with respect to similar systems, is the direct processing of the radiofrequency echo signal, which is essential for a complete analysis of the particular ultrasound-media interaction phenomenon. The platform is fully integrated with the architecture of a personal computer (PC), which brings several benefits, such as the quick technological evolution in the PC field and an extreme degree of programmability for different applications. The PC also constitutes the user interface, as a flexible and intuitive visualization support, and performs some software signal processing, using custom algorithms and commercial libraries. The close synergy realized between hardware and software allows the acquisition and real-time processing of the echographic radiofrequency (RF) signal with fast data representation. PMID:12403146
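
    As an example of the kind of RF-level processing such a platform enables (a generic illustration, not this platform's actual processing chain), the snippet below performs envelope detection and log compression of one RF A-line, the classic first step from raw RF data toward a B-mode image.

        import numpy as np
        from scipy.signal import hilbert

        def envelope_db(rf_line, dynamic_range_db=60.0):
            """Analytic-signal envelope followed by log compression (dB)."""
            env = np.abs(hilbert(rf_line))
            env = env / (env.max() + 1e-12)
            img = 20.0 * np.log10(env + 1e-12)
            return np.clip(img, -dynamic_range_db, 0.0)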

  9. Intelligent systems/software engineering methodology - A process to manage cost and risk

    NASA Technical Reports Server (NTRS)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  10. Software for Processing of Digitized Astronegatives from Archives and Databases of Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Protsyuk, Yu. I.; Andruk, V. N.; Kazantseva, L. V.

    The paper discusses and illustrates the steps of basic processing of digitized images of astronegatives. Software for obtaining rectangular coordinates and photometric values of objects on photographic plates was created in the LINUX / MIDAS / ROMAFOT environment. The program can automatically process the specified number of files in FITS format with sizes up to 20000 x 20000 pixels. Other programs were written in FORTRAN and PASCAL and can run under LINUX or WINDOWS. They were used for: identification of stars, separation and exclusion of diffraction satellites and double and triple exposures, elimination of image defects, and reduction to the equatorial coordinates and magnitudes of reference catalogs.
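
    The reduction to equatorial coordinates typically starts from a linear plate solution; the sketch below fits the six plate constants mapping measured (x, y) to standard coordinates (xi, eta) from reference stars by least squares, a textbook simplification rather than the pipeline's actual code.

        import numpy as np

        def plate_constants(xy, std_coords):
            """Fit xi = a*x + b*y + c and eta = d*x + e*y + f from reference stars.
            xy and std_coords are (N, 2) arrays."""
            x, y = xy[:, 0], xy[:, 1]
            A = np.column_stack([x, y, np.ones_like(x)])
            (a, b, c), *_ = np.linalg.lstsq(A, std_coords[:, 0], rcond=None)
            (d, e, f), *_ = np.linalg.lstsq(A, std_coords[:, 1], rcond=None)
            return (a, b, c), (d, e, f)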

  11. Impact of a process improvement program in a production software environment: Are we any better?

    NASA Technical Reports Server (NTRS)

    Heller, Gerard H.; Page, Gerald T.

    1990-01-01

    For the past 15 years, Computer Sciences Corporation (CSC) has participated in a process improvement program as a member of the Software Engineering Laboratory (SEL), which is sponsored by GSFC. The benefits CSC has derived from involvement in this program are analyzed. In the environment studied, it shows that improvements were indeed achieved, as evidenced by a decrease in error rates and costs over a period in which both the size and the complexity of the developed systems increased substantially. The principles and mechanics of the process improvement program, the lessons CSC has learned, and how CSC has capitalized on these lessons are also discussed.

  12. Machine platform and software environment for rapid optics assembly process development

    NASA Astrophysics Data System (ADS)

    Sauer, Sebastian; Müller, Tobias; Haag, Sebastian; Zontar, Daniel

    2016-03-01

    The assembly of optical components for laser systems is proprietary knowledge and is typically done by well-trained personnel in a clean room environment, as it has a major impact on the overall laser performance. Rising numbers of laser systems drive laser production toward industrial-level automation solutions that allow high volumes while ensuring stable quality, many variants, and low cost. Therefore, an easily programmable, expandable, and reconfigurable machine with an intuitive and flexible software environment for process configuration is required. With Fraunhofer IPT's expertise in optical assembly processes, the next step towards industrializing the production of optical systems is made.

  13. Agile enterprise development framework utilizing services principles for building pervasive security

    NASA Astrophysics Data System (ADS)

    Farroha, Deborah; Farroha, Bassam

    2011-06-01

    We are in an environment of continuously changing mission requirements, and therefore our Information Systems must adapt to accomplish new tasks more quickly and proficiently. Agility is the only way we will be able to keep up with this change. But there are subtleties that must be considered as we adopt various agile methods: secure, protect, control, and authenticate are all elements needed to posture our Information Technology systems to counteract the real and perceived threats in today's environment. Many systems have been tasked to ingest, process, and analyze data sets different from those they were originally designed for, and they have to interact with multiple new systems that were unaccounted for at design time. Leveraging the tenets of security, we have devised a new framework that takes agility into a new realm where the product is built to work in a service-based environment but is developed using agile processes. Even though these two criteria promise to hone the development effort, they contradict each other in philosophy: services require stable interfaces, while Agile focuses on being flexible and tolerating changes up to much later stages of development. This framework is focused on enabling successful product development that capitalizes on both philosophies.

  14. Exploring the Process of Adult Computer Software Training Using Andragogy, Situated Cognition, and a Minimalist Approach

    ERIC Educational Resources Information Center

    Hurt, Andrew C.

    2007-01-01

    With technology advances, computer software becomes increasingly difficult to learn. Adults often rely on software training to keep abreast of these changes. Instructor-led software training is frequently used to teach adults new software skills; however there is limited research regarding the best practices in adult computer software training.…

  15. Vobi One: a data processing software package for functional optical imaging

    PubMed Central

    Takerkart, Sylvain; Katz, Philippe; Garcia, Flavien; Roux, Sébastien; Reynaud, Alexandre; Chavane, Frédéric

    2014-01-01

    Optical imaging is the only technique that allows recording of the activity of a neuronal population at the mesoscopic scale. A large region of the cortex (10–20 mm diameter) is directly imaged with a CCD camera while the animal performs a behavioral task, producing spatio-temporal data with an unprecedented combination of spatial and temporal resolutions (respectively, tens of micrometers and milliseconds). However, researchers who have developed and used this technique have relied on heterogeneous software and methods to analyze their data. In this paper, we introduce Vobi One, a software package entirely dedicated to the processing of functional optical imaging data. It has been designed to facilitate the processing of data and the comparison of different analysis methods. Moreover, it should help bring good analysis practices to the community because it relies on a database and a standard format for data handling and it provides tools for producing reproducible research. Vobi One is an extension of the BrainVISA software platform, entirely written with the Python programming language, open source and freely available for download at https://trac.int.univ-amu.fr/vobi_one. PMID:24478623

  16. Vobi One: a data processing software package for functional optical imaging.

    PubMed

    Takerkart, Sylvain; Katz, Philippe; Garcia, Flavien; Roux, Sébastien; Reynaud, Alexandre; Chavane, Frédéric

    2014-01-01

    Optical imaging is the only technique that allows recording of the activity of a neuronal population at the mesoscopic scale. A large region of the cortex (10-20 mm diameter) is directly imaged with a CCD camera while the animal performs a behavioral task, producing spatio-temporal data with an unprecedented combination of spatial and temporal resolutions (respectively, tens of micrometers and milliseconds). However, researchers who have developed and used this technique have relied on heterogeneous software and methods to analyze their data. In this paper, we introduce Vobi One, a software package entirely dedicated to the processing of functional optical imaging data. It has been designed to facilitate the processing of data and the comparison of different analysis methods. Moreover, it should help bring good analysis practices to the community because it relies on a database and a standard format for data handling and it provides tools for producing reproducible research. Vobi One is an extension of the BrainVISA software platform, entirely written with the Python programming language, open source and freely available for download at https://trac.int.univ-amu.fr/vobi_one. PMID:24478623

  17. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    NASA Technical Reports Server (NTRS)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected systems accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real world vehicle/station configurations such as used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
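
    The geometric core of multilateration, position estimation from ranges to known stations, can be sketched as a small nonlinear least-squares problem; the station layout and units below are a toy example, not GEOS-3 data or the MICRODOT implementation.

        import numpy as np
        from scipy.optimize import least_squares

        def locate(stations, ranges, x0=(0.0, 0.0, 0.0)):
            """Estimate a 3-D position from measured ranges to known stations."""
            stations = np.asarray(stations, dtype=float)
            ranges = np.asarray(ranges, dtype=float)

            def residuals(p):
                return np.linalg.norm(stations - p, axis=1) - ranges

            return least_squares(residuals, x0).x

        stations = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
        truth = np.array([3.0, 4.0, 5.0])
        ranges = np.linalg.norm(np.asarray(stations) - truth, axis=1)
        estimate = locate(stations, ranges)   # approximately [3, 4, 5]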

  18. Toward an Efficient Icing CFD Process Using an Interactive Software Toolkit: Smagglce 2D

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Choo, Yung K.; Schilling, Herbert W.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.

    2001-01-01

    Two-dimensional CFD analysis for iced airfoils can be a labor-intensive task. The software toolkit SmaggIce 2D is being developed to help streamline the CFD process and provide the unique features needed for icing. When complete, it will include a combination of partially automated and fully interactive tools for all aspects of the tasks leading up to the flow analysis: geometry preparation, domain decomposition, block boundary demarcation, gridding, and linking with a flow solver. It also includes tools to perform ice shape characterization, an important aid in determining the relationship between ice characteristics and their effects on aerodynamic performance. Completed tools, work-in-progress, and planned features of the software toolkit are presented here.

  19. Capturing a failure of an ASIC in-situ, using infrared radiometry and image processing software

    NASA Technical Reports Server (NTRS)

    Ruiz, Ronald P.

    2003-01-01

    Failures in electronic devices can sometimes be tricky to locate, especially if they are buried inside radiation-shielded containers designed to work in outer space. Such was the case with a malfunctioning ASIC (Application Specific Integrated Circuit) that was drawing excessive power at a specific temperature during temperature cycle testing. To analyze the failure, infrared radiometry (thermography) was used in combination with image processing software to locate precisely where the power was being dissipated at the moment the failure took place. The IR imaging software was used to make the image of the target and background appear as unity. As testing proceeded and the failure mode was reached, temperature changes revealed the precise location of the fault. The results gave the design engineers the information they needed to fix the problem. This paper describes the techniques and equipment used to accomplish this failure analysis.

  20. Frequency Agile Transceiver for Advanced Vehicle Data Links

    NASA Technical Reports Server (NTRS)

    Freudinger, Lawrence C.; Macias, Filiberto; Cornelius, Harold

    2009-01-01

    Emerging and next-generation test instrumentation increasingly relies on network communication to manage complex and dynamic test scenarios, particularly for uninhabited autonomous systems. Adapting wireless communication infrastructure to accommodate challenging testing needs can benefit from reconfigurable radio technology. Frequency agility is one characteristic of reconfigurable radios that to date has seen only limited progress toward programmability. This paper overviews an ongoing project to validate a promising chipset that performs conversion of RF signals directly into digital data for the wireless receiver and, for the transmitter, converts digital data into RF signals. The Software Configurable Multichannel Transceiver (SCMT) enables four transmitters and four receivers in a single unit, programmable for any frequency band between 1 MHz and 6 GHz.

  1. Enhanced detection of terrestrial gamma-ray flashes by AGILE

    NASA Astrophysics Data System (ADS)

    Marisaldi, M.; Argan, A.; Ursi, A.; Gjesteland, T.; Fuschino, F.; Labanti, C.; Galli, M.; Tavani, M.; Pittori, C.; Verrecchia, F.; D'Amico, F.; Østgaard, N.; Mereghetti, S.; Campana, R.; Cattaneo, P. W.; Bulgarelli, A.; Colafrancesco, S.; Dietrich, S.; Longo, F.; Gianotti, F.; Giommi, P.; Rappoldi, A.; Trifoglio, M.; Trois, A.

    2015-11-01

    At the end of March 2015 the onboard software configuration of the Astrorivelatore Gamma a Immagini Leggero (AGILE) satellite was modified in order to disable the veto signal of the anticoincidence shield for the minicalorimeter instrument. The motivation for such a change was the understanding that the dead time induced by the anticoincidence prevented the detection of a large fraction of Terrestrial Gamma-Ray Flashes (TGFs). The configuration change was highly successful resulting in an increase of one order of magnitude in TGF detection rate. As expected, the largest fraction of the new events has short duration (<100 μs), and part of them has simultaneous association with lightning sferics detected by the World Wide Lightning Location Network. The new configuration provides the largest TGF detection rate surface density (TGFs/km2/yr) to date, opening prospects for improved correlation studies with lightning and atmospheric parameters on short spatial and temporal scales along the equatorial region.

  2. CEval: All-in-one software for data processing and statistical evaluations in affinity capillary electrophoresis.

    PubMed

    Dubský, Pavel; Ördögová, Magda; Malý, Michal; Riesová, Martina

    2016-05-01

    We introduce CEval software (downloadable for free at echmet.natur.cuni.cz) that was developed for quicker and easier electrophoregram evaluation and further data processing in (affinity) capillary electrophoresis. This software allows for automatic peak detection and evaluation of common peak parameters, such as migration time, area, and width. Additionally, the software includes a nonlinear regression engine that performs peak fitting with the Haarhoff-van der Linde (HVL) function, including an automated initial guess of the HVL function parameters. HVL is a fundamental peak-shape function in electrophoresis, based on which the correct effective mobility of the analyte represented by the peak is evaluated. Effective mobilities of an analyte at various concentrations of a selector can be further stored and plotted in an affinity CE mode. Consequently, the mobility of the free analyte, μA, the mobility of the analyte-selector complex, μAS, and the apparent complexation constant, K('), are first guessed automatically from the linearized data plots and subsequently estimated by means of nonlinear regression. An option that allows two complexation dependencies to be fitted at once is especially convenient for enantioseparations. Statistical processing of these data is also included, which allowed us to: i) express 95% confidence intervals for the μA, μAS, and K(') least-squares estimates, and ii) perform hypothesis testing on the estimated parameters for the first time. We demonstrate the benefits of the CEval software by inspecting the complexation of tryptophan methyl ester with two cyclodextrins, neutral heptakis(2,6-di-O-methyl)-β-CD and charged heptakis(6-O-sulfo)-β-CD. PMID:27062723
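
    For a 1:1 analyte-selector equilibrium, the effective mobility follows μeff([S]) = (μA + μAS·K'·[S]) / (1 + K'·[S]); the sketch below shows a nonlinear least-squares estimate of μA, μAS, and K' in that spirit. This is an illustration using SciPy with made-up data, not CEval's own regression engine.

        import numpy as np
        from scipy.optimize import curve_fit

        def mu_eff(c_selector, mu_A, mu_AS, K):
            """Effective mobility for 1:1 analyte-selector complexation."""
            return (mu_A + mu_AS * K * c_selector) / (1.0 + K * c_selector)

        # Hypothetical data: selector concentration (mM) vs effective mobility
        c = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
        mu = np.array([19.8, 17.1, 15.0, 12.3, 9.0, 7.2])

        popt, pcov = curve_fit(mu_eff, c, mu, p0=[20.0, 5.0, 0.5])
        stderr = np.sqrt(np.diag(pcov))   # basis for 95% confidence intervals
        print("mu_A, mu_AS, K' =", popt, "+/-", 1.96 * stderr)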

  3. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  4. An agile enterprise regulation architecture for health information security management.

    PubMed

    Chen, Ying-Pei; Hsieh, Sung-Huai; Cheng, Po-Hsun; Chien, Tsan-Nan; Chen, Heng-Shuen; Luh, Jer-Junn; Lai, Jin-Shin; Lai, Feipei; Chen, Sao-Jie

    2010-09-01

    Information security management for healthcare enterprises is complex as well as mission critical. Information technology requests from clinical users are of such urgency that the information office should do its best to achieve as many user requests as possible at a high service level using swift security policies. This research proposes the Agile Enterprise Regulation Architecture (AERA) of information security management for healthcare enterprises to implement as part of the electronic health record process. Survey outcomes and evidential experiences from a sample of medical center users proved that AERA encourages the information officials and enterprise administrators to overcome the challenges faced within an electronically equipped hospital. PMID:20815748

  5. Digital mapping of side-scan sonar data with the Woods Hole Image Processing System software

    USGS Publications Warehouse

    Paskevich, Valerie F.

    1992-01-01

    Since 1985, the Branch of Atlantic Marine Geology has been involved in collecting, processing and digitally mosaicking high and low resolution sidescan sonar data. In the past, processing and digital mosaicking has been accomplished with a dedicated, shore-based computer system. A recently developed UNIX-based image-processing software system includes a series of task-specific programs for pre-processing sidescan sonar data. To extend the capabilities of the UNIX-based programs, digital mapping techniques have been developed. This report describes the initial development of an automated digital mapping procedure. Included is a description of the programs and steps required to complete the digital mosaicking on a UNIX-based computer system, and a comparison of techniques that the user may wish to select.

  6. Real-time SHVC software decoding with multi-threaded parallel processing

    NASA Astrophysics Data System (ADS)

    Gudumasu, Srinivas; He, Yuwen; Ye, Yan; He, Yong; Ryu, Eun-Seok; Dong, Jie; Xiu, Xiaoyu

    2014-09-01

    This paper proposes a parallel decoding framework for scalable HEVC (SHVC). Various optimization technologies are implemented on the basis of SHVC reference software SHM-2.0 to achieve real-time decoding speed for the two-layer spatial scalability configuration. SHVC decoder complexity is analyzed with profiling information. The decoding process at each layer and the up-sampling process are designed in parallel and scheduled by a high-level application task manager. Within each layer, multi-threaded decoding is applied to accelerate the layer decoding speed. Entropy decoding, reconstruction, and in-loop processing are pipeline designed with multiple threads based on groups of coding tree units (CTU). A group of CTUs is treated as a processing unit in each pipeline stage to achieve a better trade-off between parallelism and synchronization. Motion compensation, inverse quantization, and inverse transform modules are further optimized with SSE4 SIMD instructions. Simulations on a desktop with an Intel i7 processor 2600 running at 3.4 GHz show that the parallel SHVC software decoder is able to decode 1080p spatial 2x at up to 60 fps (frames per second) and 1080p spatial 1.5x at up to 50 fps for those bitstreams generated with SHVC common test conditions in the JCT-VC standardization group. The decoding performance at various bitrates with different optimization technologies and different numbers of threads is compared in terms of decoding speed and resource usage, including processor and memory.
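
    The sketch below is a toy illustration of the CTU-group pipeline idea described above (it is not the SHM-based decoder): groups of CTUs pass through entropy-decoding, reconstruction, and in-loop-filtering stages, each stage running in its own thread and handing finished groups to the next stage through a queue. The stage bodies are placeholders.

        import threading, queue

        def stage(work, src, dst):
            """Generic pipeline stage: pull a CTU group, process it, pass it on."""
            while True:
                group = src.get()
                if group is None:              # poison pill shuts the stage down
                    if dst is not None:
                        dst.put(None)
                    break
                work(group)
                if dst is not None:
                    dst.put(group)

        def entropy_decode(g): g["bins"] = True      # placeholder work
        def reconstruct(g):    g["pixels"] = True
        def loop_filter(g):    g["filtered"] = True

        q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
        threads = [threading.Thread(target=stage, args=(entropy_decode, q0, q1)),
                   threading.Thread(target=stage, args=(reconstruct, q1, q2)),
                   threading.Thread(target=stage, args=(loop_filter, q2, None))]
        for t in threads: t.start()
        for i in range(8):                     # eight CTU groups of one frame
            q0.put({"id": i})
        q0.put(None)
        for t in threads: t.join()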

  7. Lean and Agile: An Epistemological Reflection

    ERIC Educational Resources Information Center

    Browaeys, Marie-Joelle; Fisser, Sandra

    2012-01-01

    Purpose: The aim of the paper is to contribute to the discussion of treating the concepts of lean and agile in isolation or combination by presenting an alternative view from complexity thinking on these concepts, considering an epistemological approach to this topic. Design/methodology/approach: The paper adopts an epistemological approach, using…

  8. Space Flight Software Development Software for Intelligent System Health Management

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  9. Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

    2014-01-01

    The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable, space-qualified SDR operating in the Ka-Band frequency range. Providing exceptionally higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable digital signal processing space platform, trademarked as AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-Band software defined radio for other applications. For NASA's next-generation flight Ka-Band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment has special considerations different from terrestrial applications that must be considered for any system operated in space. Each space mission has unique requirements that can make these systems one of a kind, resulting in products that are expensive and limited in reuse. Space systems put a premium on size, weight, and power. A key trade is the amount of reconfigurability in a space system. The more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, and this reduces the amount of non-recurring engineering costs. However, the more reconfigurable platforms often use more spacecraft resources. Software has similar considerations

  10. The Frequency Agile Solar Radiotelescope (FASR)

    NASA Astrophysics Data System (ADS)

    White, S. M.; Gary, D. E.; Bastian, T. S.; Hurford, G. J.; Lanzerotti, L. J.

    2003-04-01

    The Frequency Agile Solar Radiotelescope (FASR) is a radio interferometer designed to make high spatial resolution images of the Sun across a broad range of radio wavelengths simultaneously, allowing the technique of imaging spectroscopy to be exploited on a routine basis. The telescope will cover the frequency range 0.1-30 GHz using several sets of receiving elements that provide full-disk imaging, with of order 100 antennas at the highest frequency range. FASR will be optimized for solar radio phenomena and will be the most powerful and versatile radioheliograph ever built, providing an improvement of orders of magnitude in image quality over existing instruments. FASR recently received the top ranking amongst all small projects considered by the decadal survey of the National Academy of Sciences Committee on Solar and Space Physics. FASR will probe all phenomena in the solar atmosphere from the mid-chromosphere outwards. In particular, FASR will provide direct measurement of coronal magnetic field strengths, will image the nonthermal solar atmosphere and show directly the locations of electrons accelerated by solar flares, will provide images of coronal mass ejections travelling outwards through the solar corona, and will supply extensive data products for forecasting and synoptic studies. A major emphasis in the project is to make FASR data as widely and easily used as possible, i.e., providing the general user with processed, fully-calibrated, high-quality images whose interpretation does not require particular knowledge of radio astronomy. This paper will describe the telescope and its science goals, and summarize its current status.

  11. Multiple-Instruction, Multiple-Data Path Computers: Parallel Processing Impact on Flight Simulation Software. Final Report.

    ERIC Educational Resources Information Center

    Lord, Robert E.; And Others

    The purpose of this study was to evaluate the parallel processing impact of multiple-instruction multiple-data path (MIMD) computers on flight simulation software. Basic mathematical functions and arithmetic expressions from typical flight simulation software were selected and run on an MIMD computer to evaluate the improvement in execution time…

  12. Health care professional workstation: software system construction using DSSA scenario-based engineering process.

    PubMed

    Hufnagel, S; Harbison, K; Silva, J; Mettala, E

    1994-01-01

    This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal specifications of components' behavior. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' views of the system. Our goal is a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communications; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution; and (iii) integration of SEP and the health care DSSA into computer-aided software engineering (CASE) environments, which should support rapid construction and certification of individualized systems from reuse libraries. PMID:8125652

  13. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. They support the development of alternative energy sources such as solar cells, wind turbines, and fuel cells by converting dc voltage to ac voltage. Inverters have been built with a variety of hardware and software combinations, such as purely analog circuits and various types of microcontrollers as controllers. With a purely analog circuit, modification is difficult because it requires changing the entire hardware. In a microcontroller-based (software) design, the calculations that generate the AC modulation are performed in the microcontroller; this increases programming complexity and the amount of code downloaded to the microcontroller chip (whose flash memory capacity is limited). This paper discusses the design of a single-phase inverter in which the unipolar modulation of the sine wave and triangular wave is computed outside the microcontroller using a data processing software application (Microsoft Excel). The results show that programming complexity was reduced and that the sampling resolution strongly influences THD; a sampling resolution of half a degree was needed to obtain the best THD (15.8%).
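
    A short sketch, under assumed frequencies and a 0.5° sampling step, of the off-line calculation the paper moves out of the microcontroller: compare a reference sine with a triangular carrier to produce a unipolar switching table that can then be exported to the firmware. Parameter values are illustrative only.

        import numpy as np

        f_ref, f_carrier, m_index = 50.0, 2000.0, 0.8    # Hz, Hz, modulation index
        step_deg = 0.5                                   # angular sampling resolution
        theta = np.deg2rad(np.arange(0.0, 360.0, step_deg))
        t = theta / (2 * np.pi * f_ref)

        reference = m_index * np.sin(theta)
        carrier = 2.0 * np.abs(2.0 * ((t * f_carrier) % 1.0) - 1.0) - 1.0  # triangle, -1..1

        # Unipolar modulation: the two legs are driven by +reference and -reference
        leg_a = (reference > carrier).astype(int)
        leg_b = (-reference > carrier).astype(int)

        table = np.column_stack([np.rad2deg(theta), leg_a, leg_b])
        np.savetxt("pwm_table.csv", table, delimiter=",",
                   header="angle_deg,leg_a,leg_b", comments="")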

  14. Development of EarthCube Governance: An Agile Approach

    NASA Astrophysics Data System (ADS)

    Pearthree, G.; Allison, M. L.; Patten, K.

    2013-12-01

    Governance of geosciences cyberinfrastructure is a complex and essential undertaking, critical in enabling distributed knowledge communities to collaborate and communicate across disciplines, distances, and cultures. Advancing science with respect to "grand challenges," such as global climate change, weather prediction, and core fundamental science, depends not just on technical cyber systems, but also on social systems for strategic planning, decision-making, project management, learning, teaching, and building a community of practice. Simply put, a robust, agile technical system depends on an equally robust and agile social system. Cyberinfrastructure development is wrapped in social, organizational and governance challenges, which may significantly impede progress. An agile development process is underway for governance of transformative investments in geosciences cyberinfrastructure through the NSF EarthCube initiative. Agile development is iterative and incremental, and promotes adaptive planning and rapid and flexible response. Such iterative deployment across a variety of EarthCube stakeholders encourages transparency, consensus, accountability, and inclusiveness. A project Secretariat acts as the coordinating body, carrying out duties for planning, organizing, communicating, and reporting. A broad coalition of stakeholder groups comprises an Assembly (Mainstream Scientists, Cyberinfrastructure Institutions, Information Technology/Computer Sciences, NSF EarthCube Investigators, Science Communities, EarthCube End-User Workshop Organizers, Professional Societies) to serve as a preliminary venue for identifying, evaluating, and testing potential governance models. To offer opportunity for broader end-user input, a crowd-source approach will engage stakeholders not involved otherwise. An Advisory Committee from the Earth, ocean, atmosphere, social, computer and library sciences is guiding the process from a high-level policy point of view. Developmental

  15. Comparison of Perfusion- and Diffusion-weighted Imaging Parameters in Brain Tumor Studies Processed Using Different Software Platforms

    PubMed Central

    Milchenko, Mikhail V.; Rajderkar, Dhanashree; LaMontagne, Pamela; Massoumzadeh, Parinaz; Bogdasarian, Ronald; Schweitzer, Gordon; Benzinger, Tammie; Marcus, Dan; Shimony, Joshua S.; Fouke, Sarah Jost

    2015-01-01

    Rationale and Objectives To compare quantitative imaging parameter measures from diffusion- and perfusion-weighted magnetic resonance imaging (MRI) sequences in subjects with brain tumors that have been processed with different software platforms. Materials and Methods Scans from 20 subjects with primary brain tumors were selected from the Comprehensive Neuro-oncology Data Repository at Washington University School of Medicine (WUSM) and the Swedish Neuroscience Institute. MR images were coregistered, and each subject's data set was processed by three software packages: 1) vendor-specific scanner software, 2) research software developed at WUSM, and 3) a commercially available, Food and Drug Administration–approved, processing platform (Nordic Ice). Regions of interest (ROIs) were chosen within the brain tumor and normal nontumor tissue. The results obtained using these methods were compared. Results For diffusion parameters, including mean diffusivity and fractional anisotropy, concordance was high when comparing different processing methods. For perfusion-imaging parameters, a significant variance in cerebral blood volume, cerebral blood flow, and mean transit time (MTT) values was seen when comparing the same raw data processed using different software platforms. Correlation was better with larger ROIs (radii ≥ 5 mm). The greatest variance was observed in MTT. Conclusions Diffusion parameter values were consistent across different software processing platforms. Perfusion parameter values were more variable and were influenced by the software used. Variation in the MTT was especially large, suggesting that MTT estimation may be unreliable in tumor tissues using current MRI perfusion methods. PMID:25088833

  16. Ten years of software sustainability at the Infrared Processing and Analysis Center.

    PubMed

    Berriman, G Bruce; Good, John; Deelman, Ewa; Alexov, Anastasia

    2011-08-28

    This paper presents a case study of an approach to sustainable software architecture that has been successfully applied over a period of 10 years to astronomy software services at the NASA Infrared Processing and Analysis Center (IPAC), Caltech (http://www.ipac.caltech.edu). The approach was developed in response to the need to build and maintain the NASA Infrared Science Archive (http://irsa.ipac.caltech.edu), NASA's archive node for infrared astronomy datasets. When the archive opened for business in 1999 serving only two datasets, it was understood that the holdings would grow rapidly in size and diversity, and consequently in the number of queries and volume of data download. It was also understood that platforms and browsers would be modernized, that user interfaces would need to be replaced and that new functionality outside of the scope of the original specifications would be needed. The changes in scientific functionality over time are largely driven by the archive user community, whose interests are represented by a formal user panel. The approach has been extended to support four more major astronomy archives, which today host data from more than 40 missions and projects, to support a complete modernization of a powerful and unique legacy astronomy application for co-adding survey data, and to support deployment of Montage, a powerful image mosaic engine for astronomy. The approach involves using a component-based architecture, designed from the outset to support sustainability, extensibility and portability. Although successful, the approach demands careful assessment of new and emerging technologies before adopting them, and attention to a disciplined approach to software engineering and maintenance. The paper concludes with a list of best practices for software sustainability that are based on 10 years of experience at IPAC. PMID:21768146

  17. Real time video processing software for the analysis of endoscopic guided-biopsies

    NASA Astrophysics Data System (ADS)

    Ordoñez, C.; Bouchet, A.; Pastore, J.; Blotta, E.

    2011-12-01

    The seriousness of Barrett's esophagus lies, undoubtedly, in the possibility of malignant transformation. To make an early diagnosis and avoid possible complications, it is absolutely necessary to collect biopsies for histological analysis. This should be done under endoscopic control to avoid mucus areas that may co-exist within the columnar epithelium, which could lead to a false diagnosis. This paper presents real-time video processing software that delineates and enhances areas of interest in order to facilitate the work of the expert.

  18. 2D-CELL: image processing software for extraction and analysis of 2-dimensional cellular structures

    NASA Astrophysics Data System (ADS)

    Righetti, F.; Telley, H.; Leibling, Th. M.; Mocellin, A.

    1992-01-01

    2D-CELL is a software package for processing and analyzing photographic images of cellular structures in a largely interactive way. Starting from a binary digitized image, the programs extract the line network (skeleton) of the structure and determine the graph representation that best models it. Provision is made for manually correcting defects such as incorrect node positions or dangling bonds. Then a suitable algorithm retrieves polygonal contours which define individual cells — local boundary curvatures are neglected for simplicity. Using elementary analytical geometry relations, a range of metric and topological parameters describing the population is then computed, organized into statistical distributions and graphically displayed.
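
    A brief sketch of the same processing chain using scikit-image rather than the original 2D-CELL package: skeletonize a binary image of cell boundaries, label the enclosed regions as individual cells, and compute simple metric parameters for the population. The synthetic grid image and library choice are assumptions for illustration.

        import numpy as np
        from skimage.morphology import skeletonize
        from skimage.measure import label, regionprops

        # Synthetic binary image: True on cell walls, False elsewhere
        walls = np.zeros((200, 200), dtype=bool)
        walls[::40, :] = True
        walls[:, ::40] = True

        skeleton = skeletonize(walls)              # one-pixel-wide line network
        cells = label(~skeleton, connectivity=1)   # enclosed regions = cells

        areas = [r.area for r in regionprops(cells)]
        perimeters = [r.perimeter for r in regionprops(cells)]
        print(f"{cells.max()} cells, mean area {np.mean(areas):.1f} px, "
              f"mean perimeter {np.mean(perimeters):.1f} px")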

  19. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software. PMID:17023408

  20. Developing Engineering and Science Process Skills Using Design Software in an Elementary Education

    NASA Astrophysics Data System (ADS)

    Fusco, Christopher

    This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge-building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform the construction of model bridges, develop science process skills through problem-based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with an increased understanding of and appreciation for scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.

  1. FACET: A simulation software framework for modeling complex societal processes and interactions

    SciTech Connect

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  2. T-REX: software for the processing and analysis of T-RFLP data

    PubMed Central

    Culman, Steven W; Bukowski, Robert; Gauch, Hugh G; Cadillo-Quiroz, Hinsby; Buckley, Daniel H

    2009-01-01

    Background Despite increasing popularity and improvements in terminal restriction fragment length polymorphism (T-RFLP) and other microbial community fingerprinting techniques, there are still numerous obstacles that hamper the analysis of these datasets. Many steps are required to process raw data into a format ready for analysis and interpretation. These steps can be time-intensive, error-prone, and can introduce unwanted variability into the analysis. Accordingly, we developed T-REX, free, online software for the processing and analysis of T-RFLP data. Results Analysis of T-RFLP data generated from a multiple-factorial study was performed with T-REX. With this software, we were able to i) label raw data with attributes related to the experimental design of the samples, ii) determine a baseline threshold for identification of true peaks over noise, iii) align terminal restriction fragments (T-RFs) in all samples (i.e., bin T-RFs), iv) construct a two-way data matrix from labeled data and process the matrix in a variety of ways, v) produce several measures of data matrix complexity, including the distribution of variance between main and interaction effects and sample heterogeneity, and vi) analyze a data matrix with the additive main effects and multiplicative interaction (AMMI) model. Conclusion T-REX provides a free, platform-independent tool to the research community that allows for an integrated, rapid, and more robust analysis of T-RFLP data. PMID:19500385
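
    A simplified sketch, with an assumed threshold rule and bin tolerance, of two of the steps listed above: flagging true peaks over noise with a baseline threshold and aligning (binning) T-RF fragment lengths across samples. It illustrates the idea only and is not the T-REX implementation.

        import numpy as np

        def true_peaks(areas, k=1.0):
            """Keep peaks whose area exceeds k standard deviations of all areas."""
            areas = np.asarray(areas, dtype=float)
            return areas > k * areas.std()

        def bin_trfs(fragment_lengths, tolerance=0.5):
            """Group T-RF lengths (bp) from all samples into bins of +/- tolerance."""
            bins = []
            for length in sorted(fragment_lengths):
                if bins and length - bins[-1][-1] <= tolerance:
                    bins[-1].append(length)
                else:
                    bins.append([length])
            return [float(np.mean(b)) for b in bins]

        print(true_peaks([120, 15, 300, 8, 95]))
        print(bin_trfs([85.2, 85.6, 120.1, 120.3, 120.4, 200.7]))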

  3. Selection of Steady-State Process Simulation Software to Optimize Treatment of Radioactive and Hazardous Waste

    SciTech Connect

    Nichols, T. T.; Barnes, C. M.; Lauerhass, L.; Taylor, D. D.

    2001-06-01

    The process used for selecting a steady-state process simulator under conditions of high uncertainty and limited time is described. Multiple waste forms, treatment ambiguity, and the uniqueness of both the waste chemistries and alternative treatment technologies result in a large set of potential technical requirements that no commercial simulator can totally satisfy. The aim of the selection process was two-fold. First, determine the steady-state simulation software that best, albeit not completely, satisfies the requirements envelope. And second, determine if the best is good enough to justify the cost. Twelve simulators were investigated with varying degrees of scrutiny. The candidate list was narrowed to three final contenders: ASPEN Plus 10.2, PRO/II 5.11, and CHEMCAD 5.1.0. It was concluded from "road tests" that ASPEN Plus appears to satisfy the project's technical requirements the best and is worth acquiring. The final software decisions provide flexibility: they involve annual rather than multi-year licensing, and they include periodic re-assessment.

  4. Image processing software for providing radiometric inputs to land surface climatology models

    NASA Technical Reports Server (NTRS)

    Newcomer, Jeffrey A.; Goetz, Scott J.; Strebel, Donald E.; Hall, Forrest G.

    1989-01-01

    During the First International Land Surface Climatology Project (ISLSCP) Field Experiment (FIFE), 80 gigabytes of image data were generated from a variety of satellite and airborne sensors in a multidisciplinary attempt to study energy and mass exchange between the land surface and the atmosphere. To make these data readily available to researchers with a range of image data handling experience and capabilities, unique image-processing software was designed to perform a variety of nonstandard image-processing manipulations and to derive a set of standard-format image products. The nonconventional features of the software include: (1) adding new layers of geographic coordinates, and solar and viewing conditions to existing data; (2) providing image polygon extraction and calibration of data to at-sensor radiances; and, (3) generating standard-format derived image products that can be easily incorporated into radiometric or climatology models. The derived image products consist of easily handled ASCII descriptor files, byte image data files, and additional per-pixel integer data files (e.g., geographic coordinates, and sun and viewing conditions). Details of the solutions to the image-processing problems, the conventions adopted for handling a variety of satellite and aircraft image data, and the applicability of the output products to quantitative modeling are presented. They should be of general interest to future experiment and data-handling design considerations.
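
    One of the steps mentioned above, calibration of data to at-sensor radiances, reduces in its simplest form to a linear gain/offset conversion of digital numbers, L = gain · DN + offset. A minimal sketch under that assumption (the coefficients are illustrative, not FIFE values):

        import numpy as np

        def dn_to_radiance(dn, gain, offset):
            """Convert raw digital numbers to at-sensor radiance: L = gain*DN + offset."""
            return gain * np.asarray(dn, dtype=float) + offset

        # Hypothetical 8-bit band; gain in W m^-2 sr^-1 um^-1 per DN
        dn_band = np.array([[12, 45], [130, 255]], dtype=np.uint8)
        print(dn_to_radiance(dn_band, gain=0.76, offset=-1.9))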

  5. Virtual and flexible digital signal processing system based on software PnP and component works

    NASA Astrophysics Data System (ADS)

    He, Tao; Wu, Qinghua; Zhong, Fei; Li, Wei

    2005-05-01

    An idea of software PnP (Plug & Play), analogous to hardware PnP, is put forward, and based on this idea a flexible virtual digital signal processing system (FVDSPS) is implemented. FVDSPS is composed of a main control center, many sub-function modules, and other hardware I/O modules. The main control center sends commands to the sub-function modules and manages the running order, parameters, and results of the sub-functions. The software kernel of FVDSPS is the DSP (Digital Signal Processing) module, which communicates with the main control center through defined protocols to accept commands or send requests. Data sharing and exchange between the main control center and the DSP modules are carried out and managed by the file system of the Windows operating system through this communication. FVDSPS is oriented toward objects, engineers, and engineering problems. With FVDSPS, users can freely plug and play modules and quickly reconfigure a signal processing system for a given engineering problem without programming: what you see is what you get. An engineer can thus address the engineering problem directly, paying more attention to the problem itself, which improves the flexibility, reliability, and accuracy of the testing system. Because FVDSPS is built on the TCP/IP protocol, testing engineers and technology experts can be connected freely over the Internet regardless of location, so engineering problems can be resolved quickly and effectively. FVDSPS can be used in many fields such as instrumentation, fault diagnosis, device maintenance, and quality control.

  6. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    NASA Astrophysics Data System (ADS)

    Mohan, C.

    In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.

  7. Three-dimensional rotation electron diffraction: software RED for automated data collection and data processing.

    PubMed

    Wan, Wei; Sun, Junliang; Su, Jie; Hovmöller, Sven; Zou, Xiaodong

    2013-12-01

    Implementation of a computer program package for automated collection and processing of rotation electron diffraction (RED) data is described. The software package contains two computer programs: RED data collection and RED data processing. The RED data collection program controls the transmission electron microscope and the camera. Electron beam tilts at a fine step (0.05-0.20°) are combined with goniometer tilts at a coarse step (2.0-3.0°) around a common tilt axis, which allows a fine relative tilt to be achieved between the electron beam and the crystal in a large tilt range. An electron diffraction (ED) frame is collected at each combination of beam tilt and goniometer tilt. The RED data processing program processes three-dimensional ED data generated by the RED data collection program or by other approaches. It includes shift correction of the ED frames, peak hunting for diffraction spots in individual ED frames and identification of these diffraction spots as reflections in three dimensions. Unit-cell parameters are determined from the positions of reflections in three-dimensional reciprocal space. All reflections are indexed, and finally a list with hkl indices and intensities is output. The data processing program also includes a visualizer to view and analyse three-dimensional reciprocal lattices reconstructed from the ED frames. Details of the implementation are described. Data collection and data processing with the software RED are demonstrated using a calcined zeolite sample, silicalite-1. The structure of the calcined silicalite-1, with 72 unique atoms, could be solved from the RED data by routine direct methods. PMID:24282334
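
    A small sketch of the tilt schedule described above: every coarse goniometer position is combined with a sweep of fine beam tilts so that the relative tilt between beam and crystal advances in fine steps over a large total range. The step sizes follow the ranges quoted in the abstract; the function itself is an illustration, not the RED control code.

        def red_tilt_schedule(gonio_start, gonio_end, gonio_step=3.0, beam_step=0.10):
            """Return (goniometer_tilt, beam_tilt, effective_tilt) for each ED frame."""
            frames = []
            gonio = gonio_start
            while gonio < gonio_end:
                n_beam = int(round(gonio_step / beam_step))
                for i in range(n_beam):
                    beam = i * beam_step
                    frames.append((gonio, beam, gonio + beam))
                gonio += gonio_step
            return frames

        schedule = red_tilt_schedule(-60.0, 60.0)
        print(len(schedule), "frames; first", schedule[0], "last", schedule[-1])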

  8. Architecture and performances of the AGILE Telemetry Preprocessing System (TMPPS)

    NASA Astrophysics Data System (ADS)

    Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Lazzarotto, F.; Di Cocco, G.; Fuschino, F.; Tavani, M.

    2008-07-01

    AGILE is an Italian Space Agency (ASI) satellite dedicated to high-energy astrophysics. It was launched successfully on 23 April 2007, and it has been operated by the AGILE Ground Segment, consisting of the Ground Station located in Malindi (Kenya), the Mission Operations Centre (MOC) at Telespazio in Fucino, and the AGILE Data Centre (ADC) at the ASI Science Data Centre (ASDC) in Frascati, both in Italy. Due to the low equatorial orbit at ~530 km with an inclination angle of ~2.5°, the satellite passes over the Ground Station every ~100'. During the visibility period of ~12', the telemetry (TM) is downlinked through two separate virtual channels, VC0 and VC1. The former is devoted to the real-time TM generated during the pass at an average rate of 50 kbit/s and is directly relayed to the Control Centre. The latter is used to downlink TM data collected in the satellite's on-board mass memory during the non-visibility period; this generates a raw TM file of up to 37 MByte at the Ground Station. Within 20' after the end of the contact, both the real-time and mass-memory TM arrive at ADC through the dedicated VPN ASINet. Here they are automatically detected and ingested by the TMPPS pipeline in less than 5 minutes. The TMPPS archives each TM file and sorts its packets into one stream for each of the different TM layouts. Each stream is processed in parallel in order to unpack the various telemetry fields and archive them into suitable FITS files. Each operation is tracked in a MySQL database which interfaces the TMPPS pipeline to the rest of the scientific pipeline running at ADC. In this paper the architecture and the performance of the TMPPS will be described and discussed.

  9. CAVASS: a computer-assisted visualization and analysis software system - image processing aspects

    NASA Astrophysics Data System (ADS)

    Udupa, Jayaram K.; Grevera, George J.; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Mishra, Shipra; Iwanaga, Tad

    2007-03-01

    The development of the concepts within 3DVIEWNIX and of the software system 3DVIEWNIX itself dates back to the 1970s. Since then, a series of software packages for Computer Assisted Visualization and Analysis (CAVA) of images came out from our group, 3DVIEWNIX released in 1993, being the most recent, and all were distributed with source code. CAVASS, an open source system, is the latest in this series, and represents the next major incarnation of 3DVIEWNIX. It incorporates four groups of operations: IMAGE PROCESSING (including ROI, interpolation, filtering, segmentation, registration, morphological, and algebraic operations), VISUALIZATION (including slice display, reslicing, MIP, surface rendering, and volume rendering), MANIPULATION (for modifying structures and surgery simulation), ANALYSIS (various ways of extracting quantitative information). CAVASS is designed to work on all platforms. Its key features are: (1) most major CAVA operations incorporated; (2) very efficient algorithms and their highly efficient implementations; (3) parallelized algorithms for computationally intensive operations; (4) parallel implementation via distributed computing on a cluster of PCs; (5) interface to other systems such as CAD/CAM software, ITK, and statistical packages; (6) easy to use GUI. In this paper, we focus on the image processing operations and compare the performance of CAVASS with that of ITK. Our conclusions based on assessing performance by utilizing a regular (6 MB), large (241 MB), and a super (873 MB) 3D image data set are as follows: CAVASS is considerably more efficient than ITK, especially in those operations which are computationally intensive. It can handle considerably larger data sets than ITK. It is easy and ready to use in applications since it provides an easy to use GUI. The users can easily build a cluster from ordinary inexpensive PCs and reap the full power of CAVASS inexpensively compared to expensive multiprocessing systems which are less

  10. Near Real Time Review of Instrument Performance using the Airborne Data Processing and Analysis Software Package

    NASA Astrophysics Data System (ADS)

    Delene, D. J.

    2014-12-01

    Research aircraft that conduct atmospheric measurements carry an increasing array of instrumentation. While on-board personnel constantly review instrument parameters and time series plots, the number of items to monitor is overwhelming. Furthermore, directing the aircraft flight takes up much of the flight scientist's time. Typically, a flight engineer is given the responsibility of reviewing the status of on-board instruments. While major issues like not receiving data are quickly identified during a flight, subtle issues like low but believable concentration measurements may go unnoticed. Therefore, it is critical to review data after a flight in near real time. The Airborne Data Processing and Analysis (ADPAA) software package used by the University of North Dakota automates the post-processing of aircraft flight data. Utilizing scripts to process the measurements recorded by data acquisition systems enables the generation of data files within an hour of flight completion. The ADPAA Cplot visualization program enables plots to be generated quickly, allowing timely review of all recorded and processed parameters. Near real time review of aircraft flight data enables instrument problems to be identified, investigated, and fixed before conducting another flight. On one flight, near real time data review resulted in the identification of unusually low measurements of cloud condensation nuclei, and rapid data visualization enabled the timely investigation of the cause. As a result, a leak was found and fixed before the next flight. Hence, with the high cost of aircraft flights, it is critical to find and fix instrument problems in a timely manner. The use of automated processing scripts and quick visualization software enables scientists to review aircraft flight data in near real time and identify potential problems.

  11. Pre-Hardware Optimization of Spacecraft Image Processing Software Algorithms and Hardware Implementation

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Flatley, Thomas P.; Hestnes, Phyllis; Jentoft-Nilsen, Marit; Petrick, David J.; Day, John H. (Technical Monitor)

    2001-01-01

    Spacecraft telemetry rates have steadily increased over the last decade presenting a problem for real-time processing by ground facilities. This paper proposes a solution to a related problem for the Geostationary Operational Environmental Spacecraft (GOES-8) image processing application. Although large super-computer facilities are the obvious heritage solution, they are very costly, making it imperative to seek a feasible alternative engineering solution at a fraction of the cost. The solution is based on a Personal Computer (PC) platform and synergy of optimized software algorithms and re-configurable computing hardware technologies, such as Field Programmable Gate Arrays (FPGA) and Digital Signal Processing (DSP). It has been shown in [1] and [2] that this configuration can provide superior inexpensive performance for a chosen application on the ground station or on-board a spacecraft. However, since this technology is still maturing, intensive pre-hardware steps are necessary to achieve the benefits of hardware implementation. This paper describes these steps for the GOES-8 application, a software project developed using Interactive Data Language (IDL) (Trademark of Research Systems, Inc.) on a Workstation/UNIX platform. The solution involves converting the application to a PC/Windows/RC platform, selected mainly by the availability of low cost, adaptable high-speed RC hardware. In order for the hybrid system to run, the IDL software was modified to account for platform differences. It was interesting to examine the gains and losses in performance on the new platform, as well as unexpected observations before implementing hardware. After substantial pre-hardware optimization steps, the necessity of hardware implementation for bottleneck code in the PC environment became evident and solvable beginning with the methodology described in [1], [2], and implementing a novel methodology for this specific application [6]. The PC-RC interface bandwidth problem for the

  12. GravProcess: An easy-to-use MATLAB software to process campaign gravity data and evaluate the associated uncertainties

    NASA Astrophysics Data System (ADS)

    Cattin, Rodolphe; Mazzotti, Stephane; Baratin, Laura-May

    2015-08-01

    We present GravProcess, a set of MATLAB routines to process gravity data from complex campaign surveys and calculate the associated gravity field. Data reduction, analysis, and representation are done using the MATLAB Graphical User Interface Tool, which can be installed on most systems and platforms. Data processing is divided into several steps: (1) Integration of gravity data, station location, and gravity line connection input files; (2) Gravity data reduction applying solid-Earth tide and instrumental drift corrections and, depending on the required processing level, air pressure and oceanic tidal corrections; (3) Automatic network adjustment and alignment to absolute base stations; (4) Free air and terrain corrections to calculate gravity values and anomalies, and to estimate the associated errors. The final step is dedicated to post-processing and includes graphical representations of data and an output text file, which can be used by Geographic Information System software. An example of this processing chain applied to a recent survey in northern Morocco is given and compared with previous available results.
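
    A condensed sketch of two of the reduction steps listed above: removal of linear instrumental drift inferred from a re-occupied base station, and the free-air correction with the conventional 0.3086 mGal/m gradient. It is an illustration of the arithmetic only, not the GravProcess code, and the sample values are made up.

        import numpy as np

        def drift_correct(readings_mgal, times_hr, base_first, base_last):
            """Remove linear drift inferred from repeated base-station readings."""
            t0, g0 = base_first
            t1, g1 = base_last
            rate = (g1 - g0) / (t1 - t0)                  # mGal per hour
            return np.asarray(readings_mgal) - rate * (np.asarray(times_hr) - t0)

        def free_air_anomaly(g_obs_mgal, g_normal_mgal, elevation_m):
            """Free-air anomaly: g_obs - g_normal + 0.3086 mGal/m * elevation."""
            return g_obs_mgal - g_normal_mgal + 0.3086 * elevation_m

        g = drift_correct([981234.12, 981230.45, 981234.30], [0.0, 2.5, 6.0],
                          base_first=(0.0, 981234.12), base_last=(6.0, 981234.30))
        print(g)
        print(free_air_anomaly(g[1], 981238.0, elevation_m=350.0))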

  13. A Case Study of the Evolving Software Architecture for the FDA Generic Drug Application Process

    PubMed Central

    Canfield, Kip; Ritondo, Michele; Sponaugle, Richard

    1998-01-01

    The primary goal of this project was to develop a software architecture to support the Food and Drug Administration (FDA) generic drug application process by making it more efficient and effective. The secondary goal was to produce a scalable, modular, and flexible architecture that could be generalized to other contexts in interorganizational health care communications. The system described here shows improvements over the old system for the generic drug application process for most of the defined design objectives. The modular, flexible design that produced this new system offers lessons for the general design of distributed health care information systems and points the way to robust application frameworks that will allow practical development and maintenance of a distributed infrastructure. PMID:9760391

  14. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.

  15. Agile Data Curation: A conceptual framework and approach for practitioner data management

    NASA Astrophysics Data System (ADS)

    Young, J. W.; Benedict, K. K.; Lenhardt, W. C.

    2015-12-01

    Data management occurs across a range of science and related activities such as decision support. Exemplars within the science community operate data management systems that are extensively planned before implementation, staffed with robust data management expertise, equipped with appropriate services and technologies, and often highly structured. However, this is not the only approach to data management and almost certainly not the typical experience. The other end of the spectrum is often an ad hoc practitioner team, with changing requirements, limited training in data management, and constrained resources for both equipment and personnel. Much of the existing data management literature serves the exemplar community and ignores the ad hoc practitioners. Somewhere in the middle are examples where data are repurposed for new uses, thereby generating new data management challenges. This submission presents a conceptualization of an Agile Data Curation approach that provides foundational principles for data management efforts operating across the spectrum of data generation and use, from large science systems to efforts with constrained resources, limited expertise, and evolving requirements. The underlying principles of Agile Data Curation are a reapplication of agile software development principles to data management. The historical reality for many data management efforts is operation in a practitioner environment, so Agile Data Curation uses historical and current case studies to validate the foundational principles and, through comparison, to learn lessons for future application. This submission will provide an overview of Agile Data Curation, cover the foundational principles of the approach, and introduce a framework for gathering, classifying, and applying lessons from case studies of practitioner data management.

  16. A generic software-framework for distributed, high-performance processing of multi-view video

    NASA Astrophysics Data System (ADS)

    Farin, Dirk; de With, Peter H. N.

    2007-02-01

    This paper presents a software framework providing a platform for parallel and distributed processing of video data on a cluster of SMP computers. Existing video-processing algorithms can be easily integrated into the framework by considering them as atomic processing tiles (PTs). PTs can be connected to form processing graphs that model the data flow of a specific application. This graph also defines the data dependencies that determine which tasks can be computed in parallel. Scheduling of the tasks in this graph is carried out automatically using a pool-of-tasks scheme. The data format that can be processed by the framework is not restricted to image data, such that also intermediate data, like detected feature points or object positions, can be transferred between PTs. Furthermore, the processing can optionally be carried out efficiently on special-purpose processors with separate memory, since the framework minimizes the transfer of data. Finally, we describe an example application for a multi-camera view-interpolation system that we successfully implemented on the proposed framework.
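
    A compact sketch of the pool-of-tasks scheduling idea described above: processing tiles (PTs) form a directed graph, and a tile is submitted to a thread pool as soon as all of its predecessors have finished. The graph contents and tile names are assumptions; this illustrates the scheduling scheme, not the framework's API.

        from concurrent.futures import ThreadPoolExecutor
        import threading

        # Processing graph: tile -> tiles that consume its output (example only)
        graph = {"capture": ["undistort"], "undistort": ["features", "segment"],
                 "features": ["interpolate"], "segment": ["interpolate"],
                 "interpolate": []}
        pending = {tile: 0 for tile in graph}      # unfinished-predecessor counts
        for successors in graph.values():
            for s in successors:
                pending[s] += 1

        lock, all_done = threading.Lock(), threading.Event()
        remaining = len(graph)
        pool = ThreadPoolExecutor(max_workers=4)

        def run_tile(tile):
            global remaining
            print("running PT:", tile)             # placeholder for real processing
            ready = []
            with lock:
                remaining -= 1
                for s in graph[tile]:
                    pending[s] -= 1
                    if pending[s] == 0:
                        ready.append(s)
                if remaining == 0:
                    all_done.set()
            for s in ready:
                pool.submit(run_tile, s)

        for tile, count in list(pending.items()):
            if count == 0:                          # source tiles have no inputs
                pool.submit(run_tile, tile)
        all_done.wait()
        pool.shutdown()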

  17. Dynamic tumor tracking using the Elekta Agility MLC

    SciTech Connect

    Fast, Martin F. Nill, Simeon Bedford, James L.; Oelfke, Uwe

    2014-11-01

    Purpose: To evaluate the performance of the Elekta Agility multileaf collimator (MLC) for dynamic real-time tumor tracking. Methods: The authors have developed new control software that interfaces with the Agility MLC to dynamically program the movement of individual leaves, the dynamic leaf guides (DLGs), and the Y collimators (“jaws”) based on the actual target trajectory. A motion platform was used to perform dynamic tracking experiments with sinusoidal trajectories. The actual target positions reported by the motion platform at 20, 30, or 40 Hz were used as shift vectors for the MLC in beams-eye-view. The system latency of the MLC (i.e., the average latency comprising target device reporting latencies and MLC adjustment latency) and the geometric tracking accuracy were extracted from a sequence of MV portal images acquired during irradiation for the following treatment scenarios: leaf-only motion, jaw + leaf motion, and DLG + leaf motion. Results: The portal imager measurements indicated a clear dependence of the system latency on the target position reporting frequency. After deducting the effect of the target frequency, the leaf adjustment latency was measured to be 38 ± 3 ms for a maximum target speed v of 13 mm/s. The jaw + leaf adjustment latency was 53 ± 3 ms at a similar speed. The system latency at a target position frequency of 30 Hz was in the range of 56–61 ms for the leaves (v ≤ 31 mm/s), 71–78 ms for the jaw + leaf motion (v ≤ 25 mm/s), and 58–72 ms for the DLG + leaf motion (v ≤ 59 mm/s). The tracking accuracy showed a similar dependency on the target position frequency and the maximum target speed. For the leaves, the root-mean-squared error (RMSE) was 0.6–1.5 mm depending on the maximum target speed. For the jaw + leaf (DLG + leaf) motion, the RMSE was 0.7–1.5 mm (1.9–3.4 mm). Conclusions: The authors have measured the latency and geometric accuracy of the Agility MLC, facilitating its future use for clinical
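
    A short sketch of how the two reported quantities can be pulled out of a tracking log: system latency as the delay that maximizes the cross-correlation between the target trajectory and the measured aperture trajectory, and the RMSE of the residual after shifting by that delay. The 30 Hz sampling and sinusoidal signals are assumptions; this is not the portal-image analysis used in the paper.

        import numpy as np

        def latency_and_rmse(target, measured, dt):
            """Estimate latency (s) by cross-correlation and RMSE after alignment."""
            t = target - target.mean()
            m = measured - measured.mean()
            corr = np.correlate(m, t, mode="full")
            lag = max(int(np.argmax(corr)) - (len(t) - 1), 0)  # samples m lags t
            aligned = m[lag:]
            rmse = np.sqrt(np.mean((aligned - t[:len(aligned)]) ** 2))
            return lag * dt, rmse

        dt = 1.0 / 30.0                                   # 30 Hz position reports
        time = np.arange(0.0, 20.0, dt)
        target = 10.0 * np.sin(2 * np.pi * 0.25 * time)             # mm
        measured = 10.0 * np.sin(2 * np.pi * 0.25 * (time - 0.06))  # 60 ms delay
        print(latency_and_rmse(target, measured, dt))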

  18. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    ERIC Educational Resources Information Center

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  19. Optical flows method for lightweight agile remote sensor design and instrumentation

    NASA Astrophysics Data System (ADS)

    Wang, Chong; Xing, Fei; Wang, Hongjian; You, Zheng

    2013-08-01

    Lightweight agile remote sensors have become some of the most important payloads and are widely used in space reconnaissance and resource surveys. These imaging sensors are designed to obtain imagery of high spatial, temporal, and spectral resolution. Key techniques in their instrumentation include flexible maneuvering, advanced imaging control algorithms, and integrative measuring techniques, which are closely correlated or even act as bottlenecks for one another. Therefore, these mutually restrictive problems must be solved and optimized together. Optical flow is the critical model for representing information transfer as well as radiation energy flow in dynamic imaging. For agile sensors, especially those with a wide field of view, the imaging optical flow may distort and deviate severely during large-angle attitude-maneuvering imaging. These phenomena are mainly attributed to the geometrical characteristics of the three-dimensional Earth surface as well as coupled effects due to the complicated relative motion between the sensor and the scene. Under these circumstances, the velocity field is distributed nonlinearly, and the imagery may be badly smeared or its geometrical structure changed, since image-velocity matching errors cannot be eliminated perfectly. In this paper, a precise imaging optical flow model is established for agile remote sensors, in which the evolution of the optical flow is factorized into two components, due respectively to translational movement and image shape change. Based on this model, agile remote sensor instrumentation is investigated. The main techniques concerning optical flow modeling include integrative design with lightweight star sensors and micro inertial measurement units and the corresponding data fusion, focal plane layout and control assemblies, and post-processing of imagery for agile remote sensors. Some experiments show that the optical analysis method is effective to

  20. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
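
    The combination described can be illustrated with a hypothetical, highly simplified hybrid simulation: task completions are treated as discrete events, while a continuous "system dynamics" state (fatigue-driven productivity) is integrated in small time steps and fed back into task durations and cost. All rates and task sizes below are invented.

```python
# Minimal hybrid-simulation sketch (not the paper's model): a continuous
# state (fatigue, productivity) advanced in small time steps, with a
# discrete event fired when a task's work is done.
dt = 0.25                       # time step [days]
backlog = [20.0, 35.0, 15.0]    # task sizes [person-days of nominal effort]
fatigue, clock, total_cost = 0.0, 0.0, 0.0
cost_per_day = 1000.0

for size in backlog:
    done = 0.0
    while done < size:                                   # discrete event: task completion
        productivity = max(0.2, 1.0 - 0.5 * fatigue)     # continuous feedback on output
        done += productivity * dt                        # work accomplished this step
        fatigue = min(1.0, fatigue + 0.002 * dt)         # fatigue accumulates over time
        total_cost += cost_per_day * dt                  # labour cost accrues continuously
        clock += dt
    print(f"task of {size:>4.1f} pd finished at day {clock:6.1f}")

print(f"total cost: ${total_cost:,.0f}")
```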

  1. Information Models, Data Requirements, and Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron

    2015-04-01

    The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during system development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.

  2. Scaling Agile Infrastructure to People

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.

    2015-12-01

    When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice; the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.

  3. Precise Evaluation of Anthropometric 2D Software Processing of Hand in Comparison with Direct Method.

    PubMed

    Habibi, Ehsanollah; Soury, Shiva; Zadeh, Akbar Hasan

    2013-10-01

    Various studies have been carried out on photo anthropometry, but each had deficiencies that have gradually been resolved over the years. The objective of this paper is to test the efficiency of two-dimensional image processing software for photo anthropometry of the hand. In this applied research, 204 office workers and industrial workers were selected. Their hands were measured with both manual and photo anthropometric methods. In this study, by designing the "Hand Photo Anthropometry Set," we fixed the angle and distance of the camera in all of the photos; thus, some of the common errors in the photo anthropometric method were controlled. The photos were analyzed with Digimizer software, version 4.1.1.0, and a digital caliper (Mitutoyo Corp., Tokyo, Japan) was used for the manual method. A t-test on the data revealed no significant difference between the manual and photo anthropometric results (P > 0.05), and the correlation coefficients for hand dimensions were similar in both methods, ranging from 0.71 to 0.95. The statistical analyses showed that photo anthropometry can replace manual methods. Furthermore, it can be of great help in developing an anthropometric database for work glove manufacturers. Since hand anthropometry is a necessary input for tool design, this survey can be used to determine the percentiles of workers' hands. PMID:24696802

  4. Precise Evaluation of Anthropometric 2D Software Processing of Hand in Comparison with Direct Method

    PubMed Central

    Habibi, Ehsanollah; Soury, Shiva; Zadeh, Akbar Hasan

    2013-01-01

    Various studies have been carried out on photo anthropometry, but each had deficiencies that have gradually been resolved over the years. The objective of this paper is to test the efficiency of two-dimensional image processing software for photo anthropometry of the hand. In this applied research, 204 office workers and industrial workers were selected. Their hands were measured with both manual and photo anthropometric methods. In this study, by designing the “Hand Photo Anthropometry Set,” we fixed the angle and distance of the camera in all of the photos; thus, some of the common errors in the photo anthropometric method were controlled. The photos were analyzed with Digimizer software, version 4.1.1.0, and a digital caliper (Mitutoyo Corp., Tokyo, Japan) was used for the manual method. A t-test on the data revealed no significant difference between the manual and photo anthropometric results (P > 0.05), and the correlation coefficients for hand dimensions were similar in both methods, ranging from 0.71 to 0.95. The statistical analyses showed that photo anthropometry can replace manual methods. Furthermore, it can be of great help in developing an anthropometric database for work glove manufacturers. Since hand anthropometry is a necessary input for tool design, this survey can be used to determine the percentiles of workers’ hands. PMID:24696802

  5. Versatile software for semiautomatic analysis and processing of laser-induced plasma spectra

    NASA Astrophysics Data System (ADS)

    Mateo, M. P.; Nicolás, G.; Piñón, V.; Alvarez, J. C.; Ramil, A.; Yáñez, A.

    2005-08-01

    The present article describes the main characteristics and operations of SALIPS (software for the analysis of laser-induced plasma spectra), a computer program designed for use in spectroscopy. In recent years laser-induced plasma spectroscopy (LIPS) has grown in popularity and different applications have been developed in several fields. However, until now no software has been reported that recognizes the elemental composition of a generic sample from its LIP spectrum; this must be done by hand in a tedious comparison of experimental peaks with emission lines from databases. For this reason, a computer program that includes several tools to provide a semi-automatic identification of the peaks of a LIP spectrum has been developed. The program, written in Microsoft® Visual Basic® code, has a user-friendly graphical interface and is a flexible tool for handling, editing, copying and printing a quick presentation of the data, automatically including the identification results in the graph. SALIPS also provides some physical properties of the elements and includes algorithms for performing the simulation of spectra. The potential of the program is illustrated with some examples.
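
    The core semi-automatic identification step can be illustrated as below (Python rather than Visual Basic, and not SALIPS itself): experimental peak wavelengths are matched against a small reference table of emission lines within a user-set tolerance. The line values are well-known wavelengths, included purely for illustration.

```python
# Minimal sketch of peak identification against an emission-line database.
line_db = {                      # element -> reference emission lines [nm]
    "Fe I": [371.99, 404.58, 438.35],
    "Cu I": [324.75, 327.40, 521.82],
    "Ca II": [393.37, 396.85],
}
experimental_peaks = [324.72, 393.40, 438.30, 500.00]   # detected peaks [nm]
tolerance = 0.1                                         # matching window [nm]

for peak in experimental_peaks:
    matches = [(el, ref) for el, refs in line_db.items()
               for ref in refs if abs(ref - peak) <= tolerance]
    label = ", ".join(f"{el} {ref} nm" for el, ref in matches) or "unidentified"
    print(f"{peak:7.2f} nm -> {label}")
```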

  6. A software toolkit for processing and analyzing spectral and trace gas flux data collected via aircraft

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Garrity, S. R.; Vierling, L. A.; Martins, D. K.; Shepson, P. B.; Stirm, B. H.

    2006-12-01

    In order to spatially extrapolate trace gas flux measurements made at the scale of individual flux towers to broader regions using spectral approaches, it is helpful to establish new methodologies for sampling and processing these data at scales coarser than one flux tower footprint. To this end, we mounted a dual-channel hyperspectral spectroradiometer capable of collecting spectra at ~3Hz to an experimental twin-engine Beechcraft Duchess instrumented to also measure eddy covariance fluxes of CO2. Experimental flights were conducted over a northern hardwood, deciduous forest between 21 July and 24 July 2006. To analyze these data in ecologically meaningful ways, it was necessary to first develop a software toolkit capable of marrying the spectral and flux data in appropriate spatial and spectral contexts. The toolkit is capable of merging the spectral and flux data streams with the GPS/Inertial Navigation System of the aircraft such that data can be interactively selected according to its timestamp or geographic location and queried to output a variety of preset and/or user defined spectral indices for comparison to collocated flux data. In addition, the toolkit enables the user to interactively plot the spectral target locations on any georectified image to facilitate comparisons among land cover type, topography, surface spectral characteristics, and CO2 fluxes. In this paper, we highlight the capabilities of the software toolkit as well as provide examples of ways in which it can be used to explore correlation among spectral and flux data collected via aircraft.
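
    A minimal sketch of the "marrying" step, assuming hypothetical column names and using pandas rather than the toolkit itself: the ~3 Hz spectral records are aligned to the nearest flux/navigation record by timestamp, and a spectral index is computed for comparison with the collocated CO2 flux.

```python
import pandas as pd

# Align spectral and flux data streams by timestamp, then compute NDVI.
spectra = pd.DataFrame({
    "time": pd.to_datetime(["2006-07-21 10:00:00.0", "2006-07-21 10:00:00.3",
                            "2006-07-21 10:00:00.6"]),
    "nir": [0.42, 0.44, 0.41], "red": [0.06, 0.05, 0.07],
})
flux = pd.DataFrame({
    "time": pd.to_datetime(["2006-07-21 10:00:00.2", "2006-07-21 10:00:00.5"]),
    "co2_flux": [-12.3, -11.8], "lat": [45.601, 45.602], "lon": [-84.72, -84.71],
})

merged = pd.merge_asof(spectra.sort_values("time"), flux.sort_values("time"),
                       on="time", direction="nearest",
                       tolerance=pd.Timedelta("200ms"))
merged["ndvi"] = (merged["nir"] - merged["red"]) / (merged["nir"] + merged["red"])
print(merged[["time", "ndvi", "co2_flux", "lat", "lon"]])
```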

  7. A computer program for processing impedance cardiographic data: Improving accuracy through user-interactive software

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Naifeh, Karen; Thrasher, Chet

    1988-01-01

    This report contains the source code and documentation for a computer program used to process impedance cardiography data. The cardiodynamic measures derived from impedance cardiography are ventricular stroke volume, cardiac output, cardiac index and Heather index. The program digitizes data collected from the Minnesota Impedance Cardiograph, Electrocardiography (ECG), and respiratory cycles and then stores these data on hard disk. It computes the cardiodynamic functions using interactive graphics and stores the means and standard deviations of each 15-sec data epoch on floppy disk. This software was designed on a Digital PRO380 microcomputer and used version 2.0 of P/OS, with (minimally) a 4-channel 16-bit analog/digital (A/D) converter. Applications software is written in FORTRAN 77, and uses Digital's Pro-Tool Kit Real Time Interface Library, CORE Graphic Library, and laboratory routines. Source code can be readily modified to accommodate alternative detection, A/D conversion and interactive graphics. The object code utilizing overlays and multitasking has a maximum of 50 Kbytes.
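
    As a rough sketch of the kind of cardiodynamic quantities involved (the actual program is FORTRAN 77 on a PRO380), the Python fragment below computes stroke volume with a Kubicek-style impedance formula and derives cardiac output, cardiac index, and a Heather index; the formula and all parameter values are assumptions for illustration only.

```python
# Illustrative cardiodynamic calculations per data epoch (assumed values).
rho = 150.0        # blood resistivity [ohm*cm]
L = 30.0           # distance between voltage electrodes [cm]
Z0 = 25.0          # mean thoracic impedance [ohm]
dzdt_max = 1.8     # peak of dZ/dt [ohm/s]
lvet = 0.30        # left-ventricular ejection time [s]
hr = 70.0          # heart rate [beats/min]
bsa = 1.9          # body surface area [m^2]
q_to_dzdt = 0.12   # interval from ECG Q wave to dZ/dt peak [s]

stroke_volume = rho * (L / Z0) ** 2 * lvet * dzdt_max   # [mL], Kubicek-style
cardiac_output = stroke_volume * hr / 1000.0            # [L/min]
cardiac_index = cardiac_output / bsa                    # [L/min/m^2]
heather_index = dzdt_max / q_to_dzdt                    # [ohm/s^2]

print(f"SV {stroke_volume:.0f} mL, CO {cardiac_output:.1f} L/min, "
      f"CI {cardiac_index:.2f} L/min/m^2, Heather {heather_index:.1f} ohm/s^2")
```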

  8. First GRB detections with the AGILE Minicalorimeter

    SciTech Connect

    Marisaldi, M.; Labanti, C.; Fuschino, F.; Bulgarelli, A.; Gianotti, F.; Trifoglio, M.; Galli, M.; Tavani, M.; Argan, A.

    2008-05-22

    The Minicalorimeter (MCAL) onboard the AGILE satellite is a 1400 cm² scintillation detector sensitive in the energy range 0.3-200 MeV. MCAL works both as a slave of the AGILE Silicon Tracker and as an autonomous detector for transient events (BURST mode). A dedicated onboard Burst Search logic scans BURST mode data in search of count rate increases. Peculiar characteristics of the detector are the high energy spectral coverage and a timing resolution of about 2 microseconds. Even if a trigger is not issued, BURST mode data are used to build a broad band energy spectrum (scientific ratemeters) organized in 11 bands for each of the two MCAL detection planes, with a time resolution of 1 second. After the first engineering commissioning phase, following the AGILE launch on 23rd April 2007, between 22nd June and 5th November 2007 eighteen GRBs were detected offline in the scientific ratemeters data, with a detection rate of about one per week. In this paper the capabilities of the detector will be described and an overview of the first detected GRBs will be given.
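
    The Burst Search idea, a count-rate increase test against a running background, can be sketched as follows (illustrative thresholds and simulated 1 s ratemeter data, not the actual flight logic).

```python
import numpy as np

# Toy rate-increase trigger: compare each 1 s bin against a running
# background estimate and flag statistically significant excesses.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=50, size=600).astype(float)   # simulated 1 s ratemeter bins
counts[300:305] += 400                                 # injected GRB-like burst

background_window, trigger_sigma = 60, 5.0
for i in range(background_window, len(counts)):
    bkg = counts[i - background_window:i].mean()
    excess = counts[i] - bkg
    if excess > trigger_sigma * np.sqrt(bkg):          # Poisson significance test
        print(f"trigger at t = {i} s: {counts[i]:.0f} counts vs background {bkg:.1f}")
        break
```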

  9. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using less resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independent of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also allowed for the effective management of the informational elements at the common, discipline, and project level. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  10. Using Modern Methodologies with Maintenance Software

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed prior to performing any other tasks and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there are three roles: a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology. MPS has many software applications in maintenance, team members who are working on disparate applications, many users, and is interruptible based on mission needs, issues and requirements. In order to use scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.

  11. SAMPLE (Sandia agile MEMS prototyping, layout tools, and education)

    NASA Astrophysics Data System (ADS)

    Davies, Brady R.; Craig Barron, Carole; Sniegowski, Jeffry J.; Rodgers, M. Steven

    1997-09-01

    The SAMPLE (Sandia agile MEMS prototyping, layout tools, and education) service makes Sandia's state-of-the-art surface micromachining fabrication process, known as SUMMiT, available to U.S. industry for the first time. The service provides a short course and customized computer-aided design (CAD) tools to assist customers in designing micromachine prototypes to be fabricated in SUMMiT. Frequent small-scale manufacturing runs then provide SAMPLE designers with hundreds of sophisticated MEMS (microelectromechanical systems) chips. SUMMiT (Sandia ultra-planar, multi-level MEMS technology) offers unique surface-micromachining capabilities, including four levels of polycrystalline silicon (including the ground layer), flanged hubs, substrate contacts, one-micron design rules, and chemical-mechanical polishing (CMP) planarization. This paper describes the SUMMiT process, design tools, and other information relevant to the SAMPLE service and SUMMiT process.

  12. SAMPLE (Sandia Agile MEMS Prototyping, Layout tools, and Education)

    SciTech Connect

    Davies, B.R.; Barron, C.C.; Sniegowski, J.J.; Rodgers, M.S.

    1997-08-01

    The SAMPLE (Sandia Agile MEMS Prototyping, Layout tools, and Education) service makes Sandia's state-of-the-art surface-micromachining fabrication process, known as SUMMiT, available to US industry for the first time. The service provides a short course and customized computer-aided design (CAD) tools to assist customers in designing micromachine prototypes to be fabricated in SUMMiT. Frequent small-scale manufacturing runs then provide SAMPLE designers with hundreds of sophisticated MEMS (MicroElectroMechanical Systems) chips. SUMMiT (Sandia Ultra-planar, Multi-level MEMS Technology) offers unique surface-micromachining capabilities, including four levels of polycrystalline silicon (including the ground layer), flanged hubs, substrate contacts, one-micron design rules, and chemical-mechanical polishing (CMP) planarization. This paper describes the SUMMiT process, design tools, and other information relevant to the SAMPLE service and SUMMiT process.

  13. Implementation of a modular software system for multiphysical processes in porous media

    NASA Astrophysics Data System (ADS)

    Naumov, Dmitri; Watanabe, Norihiro; Bilke, Lars; Fischer, Thomas; Lehmann, Christoph; Rink, Karsten; Walther, Marc; Wang, Wenqing; Kolditz, Olaf

    2016-04-01

    Subsurface georeservoirs are a candidate technology for large scale energy storage required as part of the transition to renewable energy sources. The increased use of the subsurface results in competing interests and possible impacts on protected entities. To optimize and plan the use of the subsurface in large scale scenario analyses, powerful numerical frameworks are required that aid process understanding and can capture the coupled thermal (T), hydraulic (H), mechanical (M), and chemical (C) processes with high computational efficiency. Because of the multitude of different couplings between the basic T, H, M, and C processes and the necessity to implement new numerical schemes, the development focus has moved to the software's modularity. The decreased coupling between the components results in two major advantages: easier addition of specialized processes and improvement of the code's testability and therefore its quality. The idea of modularization is implemented on several levels, in addition to library based separation of the previous code version, by using generalized algorithms available in the Standard Template Library and the Boost library, relying on efficient implementations of linear algebra solvers, using concepts when designing new types, and localization of frequently accessed data structures. This procedure shows certain benefits for a flexible high-performance framework applied to the analysis of multipurpose georeservoirs.

  14. Using Material Processing Simulation Software To Predict A Part ``In Use'' Properties

    NASA Astrophysics Data System (ADS)

    Ducloux, R.; Lasne, P.; Wey, E.

    2004-06-01

    Today, material processing simulation software is commonly used in the metal and polymer transformation industry for forging, casting and injection mold filling. In addition, classical FEM packages are also used to compute the behaviour of the final formed part under different loading conditions. Until now, there were very few bridges between these two types of computations, even though it is common knowledge that the stress analysis of a mechanical part in use could be more precisely computed using as input for the material characteristics the results derived from the forming process. In this paper, we give some examples where the results of the material process simulation are used to define more precisely the part "in use" properties. These examples cover hot and cold forming of metals, glass forming and quenching with FORGE2® and FORGE3®, casting with THERCAST® and polymer injection mold filling with REM3D®, studying the effects of the real shape deviation, of the residual stresses and damage and of the metallurgy resulting from the forming process.

  15. Autonomous, agile micro-satellites and supporting technologies

    SciTech Connect

    Breitfeller, E; Dittman, M D; Gaughan, R J; Jones, M S; Kordas, J F; Ledebuhr, A G; Ng, L C; Whitehead, J C; Wilson, B

    1999-07-19

    This paper updates the on-going effort at Lawrence Livermore National Laboratory to develop autonomous, agile micro-satellites (MicroSats). The objective of this development effort is to develop MicroSats weighing only a few tens of kilograms, that are able to autonomously perform precision maneuvers and can be used telerobotically in a variety of mission modes. The required capabilities include satellite rendezvous, inspection, proximity-operations, docking, and servicing. The MicroSat carries an integrated proximity-operations sensor-suite incorporating advanced avionics. A new self-pressurizing propulsion system utilizing a miniaturized pump and non-toxic mono-propellant hydrogen peroxide was successfully tested. This system can provide a nominal 25 kg MicroSat with 200-300 m/s delta-v including a warm-gas attitude control system. The avionics is based on the latest PowerPC processor using a CompactPCI bus architecture, which is modular, high-performance and processor-independent. This leverages commercial-off-the-shelf (COTS) technologies and minimizes the effects of future changes in processors. The MicroSat software development environment uses the Vx-Works real-time operating system (RTOS) that provides a rapid development environment for integration of new software modules, allowing early integration and test. We will summarize results of recent integrated ground flight testing of our latest non-toxic pumped propulsion MicroSat testbed vehicle operated on our unique dynamic air-rail.

  16. Airborne Doppler Wind Lidar Post Data Processing Software DAPS-LV

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y. (Inventor); Koch, Grady J. (Inventor); Kavaya, Michael J. (Inventor)

    2015-01-01

    Systems, methods, and devices of the present invention enable post processing of airborne Doppler wind LIDAR data. In an embodiment, airborne Doppler wind LIDAR data software written in LabVIEW may be provided and may run two versions of different airborne wind profiling algorithms. A first algorithm may be the Airborne Wind Profiling Algorithm for Doppler Wind LIDAR ("APOLO") using airborne wind LIDAR data from two orthogonal directions to estimate wind parameters, and a second algorithm may be a five direction based method using pseudo inverse functions to estimate wind parameters. The various embodiments may enable wind profiles to be compared using different algorithms, may enable wind profile data for long haul color displays to be generated, may display long haul color displays, and/or may enable archiving of data at user-selectable altitudes over a long observation period for data distribution and population.
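
    A minimal sketch of the pseudo-inverse idea behind the five-direction method (not DAPS-LV itself, which is written in LabVIEW): the wind vector is recovered by least squares from line-of-sight Doppler velocities along known beam directions. The beam geometry and values are illustrative.

```python
import numpy as np

# Recover the wind vector (u, v, w) from line-of-sight velocities.
az = np.deg2rad([0, 72, 144, 216, 288])      # beam azimuths
el = np.deg2rad([-30] * 5)                   # beam elevations (looking down)

# Each row maps (u, v, w) to the velocity projection along one beam direction.
H = np.column_stack([np.sin(az) * np.cos(el),
                     np.cos(az) * np.cos(el),
                     np.sin(el)])

true_wind = np.array([8.0, -3.0, 0.2])       # simulated wind [m/s]
v_los = H @ true_wind + np.random.default_rng(1).normal(0, 0.1, size=5)

wind_est = np.linalg.pinv(H) @ v_los         # least-squares wind estimate
print("estimated (u, v, w):", np.round(wind_est, 2))
```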

  17. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy we must naturally develop next generation capabilities for the unprecedented data volume and velocity that will arrive due to the veracity of these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text and metadata and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA); as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition we are funded currently by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.

  18. A software to digital image processing to be used in the voxel phantom development.

    PubMed

    Vieira, J W; Lima, F R A

    2009-01-01

    Anthropomorphic models used in computational dosimetry, also called phantoms, are based on digital images recorded from scanning of real people by Computed Tomography (CT) or Magnetic Resonance Imaging (MRI). Voxel phantom construction requires computational processing for transformations of image formats, compaction of two-dimensional (2-D) images into three-dimensional (3-D) matrices, image sampling and quantization, image enhancement, restoration and segmentation, among others. Researchers in computational dosimetry will hardly find all these abilities available in a single piece of software, and this difficulty almost always slows the pace of their research or forces the use, sometimes inadequate, of alternative tools. The need to integrate the several tasks mentioned above to obtain an image that can be used in an exposure computational model motivated the development of the Digital Image Processing (DIP) software, mainly to solve particular problems in dissertations and theses developed by members of the Grupo de Pesquisa em Dosimetria Numérica (GDN/CNPq). Because of this particular objective, the software uses the Portuguese language in its implementation and interfaces. This paper presents the second version of the DIP, whose main changes are a more formal organization of menus and menu items, and a menu for digital image segmentation. Currently, the DIP contains the menus Fundamentos, Visualizações, Domínio Espacial, Domínio de Frequências, Segmentações and Estudos. Each menu contains items and sub-items with functionalities that usually request an image as input and produce an image or an attribute as output. The DIP reads, edits, and writes binary files containing the 3-D matrix corresponding to a stack of axial images from a given geometry, which can be a human body or another volume of interest. It can also read any type of computational image and make conversions. When the task involves only an output image

  19. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. PMID:23955865

  20. Short Serious Games Creation under the Paradigm of Software Process and Competencies as Software Requirements. Case Study: Elementary Math Competencies

    ERIC Educational Resources Information Center

    Barajas-Saavedra, Arturo; Álvarez-Rodriguez, Francisco J.; Mendoza-González, Ricardo; Oviedo-De-Luna, Ana C.

    2015-01-01

    Development of digital resources is difficult due to their particular complexity relying on pedagogical aspects. Another aspect is the lack of well-defined development processes, experiences documented, and standard methodologies to guide and organize game development. Added to this, there is no documented technique to ensure correct…

  1. A flexible software architecture for scalable real-time image and video processing applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications that addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, is used to route messages from publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
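
    The messaging layer's topic-based publish/subscribe pattern can be sketched in a few lines (a generic illustration, not the paper's implementation): publishers post to topics and subscribers register callbacks only for the topics they care about.

```python
from collections import defaultdict
from typing import Any, Callable

class MessageBus:
    """Bare-bones topic-based publish/subscribe bus."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, message: Any) -> None:
        for callback in self._subscribers[topic]:   # only matching topics receive it
            callback(message)

bus = MessageBus()
bus.subscribe("frames/raw", lambda m: print("processing module got", m))
bus.subscribe("frames/processed", lambda m: print("visualization module got", m))
bus.publish("frames/raw", {"frame_id": 1, "data": "..."})
```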

  2. MOVIE: a hardware building block for software-only real-time video processing

    NASA Astrophysics Data System (ADS)

    Barzic, Ronan; Bouville, Christian; Charot, Francois; Le Fol, Gwendal; Lemonnier, Pascal; Wagner, Charles

    1996-03-01

    The goal of the MOVIE VLSI chip is to facilitate the development of software-only solutions for real time video processing applications. This chip can be seen as a building block for SIMD arrays of processing elements and its architecture has been designed so as to facilitate high level language programming. The basic architecture building block associates a sub-array of computational processors with a I/O processor. A module can be seen as a small linear, systolic-like array of processing elements, connected at each end to the I/O processor. The module can communicate with its two nearest neighbors via two communication ports. The chip architecture also includes three 16-bit video ports. One important aspect in the programming environment is the C-stolic programming language. C-stolic is a C-like language augmented with parallel constructs which allow to differentiate between the array controller variables (scalar variables) and the local variables in the array structure (systolic variables). A statement operating on systolic variables implies a simultaneous execution on all the cells of the structure. Implementation examples of MOVIE-based architectures dealing with video compression algorithms are given.

  3. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    NASA Astrophysics Data System (ADS)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
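
    The "mix and match process models" idea can be illustrated with a toy zero-dimensional energy balance (plain Python; this is not the CLIMLAB API): two process functions, absorbed shortwave and linearized outgoing longwave, are composed and stepped forward in time.

```python
# Toy process-coupled climate model: each process returns a flux [W/m^2],
# and a single temperature state is integrated with explicit Euler steps.
heat_capacity = 4.0e8        # [J/m^2/K], roughly a 100 m ocean mixed layer
T = 288.0                    # initial temperature [K]
dt = 86400.0                 # one-day time step [s]

def absorbed_shortwave(T):
    return 341.3 * (1 - 0.3)             # fixed planetary albedo of 0.3

def outgoing_longwave(T):
    return 210.0 + 2.0 * (T - 288.0)     # linearized OLR around 288 K

for day in range(365):
    net_flux = absorbed_shortwave(T) - outgoing_longwave(T)
    T += net_flux * dt / heat_capacity   # explicit Euler step

print(f"temperature after one year: {T:.2f} K")
```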

  4. Wideband Agile Digital Microwave Radiometer

    NASA Technical Reports Server (NTRS)

    Gaier, Todd C.; Brown, Shannon T.; Ruf, Christopher; Gross, Steven

    2012-01-01

    The objectives of this work were to take the initial steps needed to develop a field programmable gate array (FPGA)- based wideband digital radiometer backend (>500 MHz bandwidth) that will enable passive microwave observations with minimal performance degradation in a radiofrequency-interference (RFI)-rich environment. As manmade RF emissions increase over time and fill more of the microwave spectrum, microwave radiometer science applications will be increasingly impacted in a negative way, and the current generation of spaceborne microwave radiometers that use broadband analog back ends will become severely compromised or unusable over an increasing fraction of time on orbit. There is a need to develop a digital radiometer back end that, for each observation period, uses digital signal processing (DSP) algorithms to identify the maximum amount of RFI-free spectrum across the radiometer band to preserve bandwidth to minimize radiometer noise (which is inversely related to the bandwidth). Ultimately, the objective is to incorporate all processing necessary in the back end to take contaminated input spectra and produce a single output value free of manmade signals to minimize data rates for spaceborne radiometer missions. But, to meet these objectives, several intermediate processing algorithms had to be developed, and their performance characterized relative to typical brightness temperature accuracy requirements for current and future microwave radiometer missions, including those for measuring salinity, soil moisture, and snow pack.
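
    One simple way to realize the "keep only the RFI-free spectrum" idea is sketched below (an illustration, not the flight firmware): sub-channels whose power deviates strongly from a robust band median are flagged as likely RFI, and only the remaining clean bandwidth is averaged.

```python
import numpy as np

# Flag narrowband RFI with a robust (median/MAD) threshold, then average
# the surviving channels to form the radiometric estimate.
rng = np.random.default_rng(2)
n_channels = 512
spectrum = rng.normal(loc=1.0, scale=0.02, size=n_channels)   # natural emission
spectrum[[40, 41, 300]] += np.array([5.0, 3.0, 8.0])          # injected narrowband RFI

median = np.median(spectrum)
mad = np.median(np.abs(spectrum - median))                    # robust scale estimate
clean = np.abs(spectrum - median) < 5 * 1.4826 * mad          # RFI mask (~5 sigma)

antenna_estimate = spectrum[clean].mean()
print(f"kept {clean.sum()}/{n_channels} channels, "
      f"band-averaged estimate = {antenna_estimate:.4f} (arb. units)")
```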

  5. A software pipeline for processing and identification of fungal ITS sequences

    PubMed Central

    Nilsson, R Henrik; Bok, Gunilla; Ryberg, Martin; Kristiansson, Erik; Hallenberg, Nils

    2009-01-01

    Background Fungi from environmental samples are typically identified to species level through DNA sequencing of the nuclear ribosomal internal transcribed spacer (ITS) region for use in BLAST-based similarity searches in the International Nucleotide Sequence Databases. These searches are time-consuming and regularly require a significant amount of manual intervention and complementary analyses. We here present software – in the form of an identification pipeline for large sets of fungal ITS sequences – developed to automate the BLAST process and several additional analysis steps. The performance of the pipeline was evaluated on a dataset of 350 ITS sequences from fungi growing as epiphytes on building material. Results The pipeline was written in Perl and uses a local installation of NCBI-BLAST for the similarity searches of the query sequences. The variable subregion ITS2 of the ITS region is extracted from the sequences and used for additional searches of higher sensitivity. Multiple alignments of each query sequence and its closest matches are computed, and query sequences sharing at least 50% of their best matches are clustered to facilitate the evaluation of hypothetically conspecific groups. The pipeline proved to speed up the processing, as well as enhance the resolution, of the evaluation dataset considerably, and the fungi were found to belong chiefly to the Ascomycota, with Penicillium and Aspergillus as the two most common genera. The ITS2 was found to indicate a different taxonomic affiliation than did the complete ITS region for 10% of the query sequences, though this figure is likely to vary with the taxonomic scope of the query sequences. Conclusion The present software readily assigns large sets of fungal query sequences to their respective best matches in the international sequence databases and places them in a larger biological context. The output is highly structured to be easy to process, although it still needs to be inspected and possibly
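
    The clustering step, grouping query sequences that share at least 50% of their best matches, can be sketched as follows (illustrative accession IDs, and Python rather than the pipeline's Perl).

```python
# Group queries whose best-hit sets overlap by at least 50% into
# hypothetically conspecific clusters.
best_hits = {
    "query1": {"AY123", "AY456", "AY789", "AF111"},
    "query2": {"AY123", "AY456", "AY789", "AF222"},
    "query3": {"DQ900", "DQ901", "DQ902", "DQ903"},
}

def shared_fraction(a: set, b: set) -> float:
    return len(a & b) / min(len(a), len(b))

clusters: list[list[str]] = []
for query, hits in best_hits.items():
    for cluster in clusters:
        if any(shared_fraction(hits, best_hits[other]) >= 0.5 for other in cluster):
            cluster.append(query)
            break
    else:
        clusters.append([query])

print(clusters)   # [['query1', 'query2'], ['query3']]
```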

  6. Application of FE software Elmer to the modeling of crustal-scale processes

    NASA Astrophysics Data System (ADS)

    Maierová, Petra; Guy, Alexandra; Lexa, Ondrej; Cadek, Ondrej

    2010-05-01

    We extended Elmer (the open source finite element software for multiphysical problems, http://www.csc.fi/english/pages/elmer) by user-written procedures for the two-dimensional modeling of crustal-scale processes. The standard version of Elmer is an appropriate tool for modeling of thermomechanical convection with non-linear viscous rheology. In geophysics, it might be suitable for some type of mantle convection modeling. Unlike the mantle, the crust is very heterogeneous. It consists of materials with distinct rheological properties that are subject to highly varied conditions: low pressure and temperature near the surface of the Earth and relatively high pressure and temperature at a depth of several tens of kilometers. Moreover, the deformation in the upper crust is mostly brittle and the strain is concentrated into narrow shear zones and thrusts. In order to simulate the brittle behavior of the crust, we implemented pressure-dependent visco-plastic rheology. The material heterogeneity and chemical convection is implemented in terms of active markers. Another special feature of the crust, the moving free surface, is already included in Elmer by means of a moving computational grid. Erosion can easily be added in this scheme. We tested the properties of our formulation of plastic flow on several numerical experiments simulating the deformation of material under compressional and extensional stresses. In the first step, we examined angles of shear zones that form in a plastically deforming material for different material parameters and grid resolutions. A more complex setting of "sandbox-type" experiments containing heterogeneous material, strain-softening and boundary friction was considered as a next testing case. To illustrate the abilities of the extended Elmer software in crustal deformation studies, we present two models of geological processes: diapirism of the lower crust and a channel flow forced by indentation. Both these processes are assumed to take

  7. Development of 2D SIP Data Processing Software for a Metallic Mineral Deposit Exploration

    NASA Astrophysics Data System (ADS)

    PARK, M.; Kim, K. S.; Seo, H. K.; Son, J.; Park, S.; Kim, C.; Kim, J. H.

    2015-12-01

    In this study, we developed commercial two-dimensional SIP (Spectral Induced Polarization) data processing software so that end users can work with measured SIP data comfortably. In order to assess the applicability of the developed technique, a two-dimensional SIP survey was carried out in the area of a hydrothermal mineral deposit in Haenam, South Korea. We also acquired time-domain IP data along the same profile in order to verify the accuracy of the SIP data, and compared both data sets after data processing and analysis were completed. Separate transmitter and receiver lines were used to obtain more accurate data, and porous-pot electrodes were used to remove the polarization effect of the receiver electrodes. The results of the two survey methods showed that the resistivity images were nearly the same, but the chargeability and phase images were slightly different. Based on previous experience with SIP surveys at a nearby test site, the phase anomaly was expected to be closely related to the mineralized zone in this survey as well. The test survey site was a small hill, and on its top a silicified alteration zone was identified, which appeared as a high-resistivity anomaly on the resistivity image. Below this high-resistivity anomaly, we identified a phase anomaly showing a consistent trend that originated from a deep anomaly directly under the hill and continued from south to north and from deep to shallow. This trend of the phase anomaly was not clearly identified on the inverted chargeability images for the averaged chargeability of the time-domain IP data. However, when we used a new inversion algorithm that uses all the chargeability data of 20 time windows simultaneously, we obtained similar inverted results for the middle-time IP data. Through the test surveys of SIP and IP, we found that the S/N ratio of the SIP measurements was superior to that of the IP measurements because SIP measurements are made while the transmitter is on, whereas IP measurements are not. And if we use the newly developed IP inversion

  8. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model

    PubMed Central

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies’ business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and “what-if” scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking of BPSS based on their technical characteristics by employing DEX and qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS with respect to currently available results. PMID:26871694

  9. A Software Product Line Process to Develop Agents for the IoT.

    PubMed

    Ayala, Inmaculada; Amor, Mercedes; Fuentes, Lidia; Troya, José M

    2015-01-01

    One of the most important challenges of this decade is the Internet of Things (IoT), which aims to enable things to be connected anytime, anyplace, with anything and anyone, ideally using any path/network and any service. IoT systems are usually composed of heterogeneous and interconnected lightweight devices that support applications that are subject to change in their external environment and in the functioning of these devices. The management of the variability of these changes, autonomously, is a challenge in the development of these systems. Agents are a good option for developing self-managed IoT systems due to their distributed nature, context-awareness and self-adaptation. Our goal is to enhance the development of IoT applications using agents and software product lines (SPL). Specifically, we propose to use Self-StarMAS (MAS, multi-agent system) agents and to define an SPL process using the Common Variability Language. In this contribution, we propose an SPL process for Self-StarMAS, paying particular attention to agents embedded in sensor motes. PMID:26140350

  10. A Software Product Line Process to Develop Agents for the IoT

    PubMed Central

    Ayala, Inmaculada; Amor, Mercedes; Fuentes, Lidia; Troya, José M.

    2015-01-01

    One of the most important challenges of this decade is the Internet of Things (IoT), which aims to enable things to be connected anytime, anyplace, with anything and anyone, ideally using any path/network and any service. IoT systems are usually composed of heterogeneous and interconnected lightweight devices that support applications that are subject to change in their external environment and in the functioning of these devices. The management of the variability of these changes, autonomously, is a challenge in the development of these systems. Agents are a good option for developing self-managed IoT systems due to their distributed nature, context-awareness and self-adaptation. Our goal is to enhance the development of IoT applications using agents and software product lines (SPL). Specifically, we propose to use Self-StarMAS (MAS, multi-agent system) agents and to define an SPL process using the Common Variability Language. In this contribution, we propose an SPL process for Self-StarMAS, paying particular attention to agents embedded in sensor motes. PMID:26140350

  11. Xmipp 3.0: an improved software suite for image processing in electron microscopy.

    PubMed

    de la Rosa-Trevín, J M; Otón, J; Marabini, R; Zaldívar, A; Vargas, J; Carazo, J M; Sorzano, C O S

    2013-11-01

    Xmipp is a specialized software package for image processing in electron microscopy, mainly focused on 3D reconstruction of macromolecules through single-particle analysis. In this article we present Xmipp 3.0, a major release which introduces several improvements and new developments over the previous version. A central improvement is the concept of a project that stores the entire processing workflow from data import to final results. It is now possible to monitor, reproduce and restart all computing tasks as well as graphically explore the complete set of interrelated tasks associated with a given project. Other graphical tools have also been improved such as data visualization, particle picking and parameter "wizards" that allow the visual selection of some key parameters. Many standard image formats are transparently supported for input/output from all programs. Additionally, results have been standardized, facilitating the interoperation between different Xmipp programs. Finally, as a result of a large code refactoring, the underlying C++ libraries are better suited for future developments and all code has been optimized. Xmipp is an open-source package that is freely available for download from: http://xmipp.cnb.csic.es. PMID:24075951

  12. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria that include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking of BPSS based on their technical characteristics by employing DEX and qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS with respect to currently available results. PMID:26871694

  13. A Cooperative Application to Improve the Educational Software Design Using Re-usable Processes

    NASA Astrophysics Data System (ADS)

    Garcia, I.; Pacheco, C.; Garcia, W.

    In the last few years, Educational Software has developed enormously, but a large part of this has been badly organized and poorly documented. Recent advances in software technology can promote cooperative learning, a teaching strategy in which small teams, each composed of students of different levels of ability, use different learning activities to improve their understanding of a subject. How can we design Educational Software if we never learnt how to do it? This paper describes how the Technological University of the Mixtec Region is using a cooperative application to improve the quality of education offered to its students in Educational Software design.

  14. Multithreaded real-time 3D image processing software architecture and implementation

    NASA Astrophysics Data System (ADS)

    Ramachandra, Vikas; Atanassov, Kalin; Aleksic, Milivoje; Goma, Sergio R.

    2011-03-01

    Recently, 3D displays and videos have generated a lot of interest in the consumer electronics industry. To make 3D capture and playback popular and practical, a user-friendly playback interface is desirable. Towards this end, we built a real time software 3D video player. The 3D video player displays user captured 3D videos, provides for various 3D specific image processing functions and ensures a pleasant viewing experience. Moreover, the player enables user interactivity by providing digital zoom and pan functionalities. This real time 3D player was implemented on the GPU using CUDA and OpenGL. The player provides user interactive 3D video playback. Stereo images are first read by the player from a fast drive and rectified. Further processing of the images determines the optimal convergence point in the 3D scene to reduce eye strain. The rationale for this convergence point selection takes into account scene depth and display geometry. The first step in this processing chain is identifying keypoints by detecting vertical edges within the left image. Regions surrounding reliable keypoints are then located on the right image through the use of block matching. The difference in the positions between the corresponding regions in the left and right images is then used to calculate disparity. The extrema of the disparity histogram give the scene disparity range. The left and right images are shifted based upon the calculated range, in order to place the desired region of the 3D scene at convergence. All the above computations are performed on one CPU thread which calls CUDA functions. Image upsampling and shifting is performed in response to user zoom and pan. The player also consists of a CPU display thread, which uses OpenGL rendering (quad buffers). This also gathers user input for digital zoom and pan and sends them to the processing thread.
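
    The convergence-point logic can be condensed into the following CPU/NumPy sketch (the player itself runs on the GPU with CUDA; the keypoint selection and all sizes here are simplified assumptions): disparities are found by block matching, the histogram extrema give the scene range, and the images are shifted accordingly.

```python
import numpy as np

def block_match(left, right, keypoints, patch=8, search=32):
    """Per-keypoint disparity by sum-of-absolute-differences block matching."""
    disparities = []
    for (y, x) in keypoints:
        ref = left[y:y + patch, x:x + patch]
        best, best_cost = 0, np.inf
        for d in range(search):                       # search along the same row
            if x - d < 0:
                break
            cand = right[y:y + patch, x - d:x - d + patch]
            cost = np.abs(ref - cand).sum()
            if cost < best_cost:
                best, best_cost = d, cost
        disparities.append(best)
    return np.array(disparities)

rng = np.random.default_rng(3)
left = rng.random((120, 160))
right = np.roll(left, -7, axis=1)                     # synthetic 7-pixel disparity
keypoints = [(20, 60), (50, 90), (80, 120)]

disp = block_match(left, right, keypoints)
d_min, d_max = disp.min(), disp.max()                 # disparity-histogram extrema
shift = int(round((d_min + d_max) / 2))               # place mid-range at convergence
print(f"disparity range {d_min}-{d_max}, shifting images by {shift} px")
```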

  15. An Effective On-line Polymer Characterization Technique by Using SALS Image Processing Software and Wavelet Analysis

    PubMed Central

    Xian, Guang-ming; Qu, Jin-ping; Zeng, Bi-qing

    2008-01-01

    This paper describes an effective on-line polymer characterization technique using small-angle light-scattering (SALS) image processing software and wavelet analysis. The phenomenon of small-angle light scattering has been applied to give information about the morphology of transparent structures. Real-time visualization of the various scattered-light images and light-intensity matrices is performed by the optical image real-time processing software for SALS. The software can measure the signal intensity of light-scattering images, draw frequency-intensity and amplitude-intensity curves to indicate the variation of the scattered-light intensity under different processing conditions, and estimate the parameters. The current study uses a one-dimensional wavelet to remove noise from the original SALS signal and estimate the trend of the maximum-intensity area of the scattered light. Thus, the system enables successful qualitative analysis of the structural information of transparent films. PMID:19229343
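
    The one-dimensional wavelet denoising step might look like the following sketch (using the PyWavelets package, with assumed wavelet and threshold choices, not the authors' code): the scattered-light intensity signal is decomposed, the detail coefficients are soft-thresholded, and the signal is reconstructed.

```python
import numpy as np
import pywt

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 1024)
clean = np.exp(-((t - 0.5) ** 2) / 0.02)              # idealized intensity profile
signal = clean + 0.05 * rng.normal(size=t.size)       # noisy SALS-like signal

coeffs = pywt.wavedec(signal, "db4", level=5)         # multilevel 1-D DWT
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest level
threshold = sigma * np.sqrt(2 * np.log(signal.size))  # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                        for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")

print(f"noise std before: {np.std(signal - clean):.3f}, "
      f"after: {np.std(denoised - clean):.3f}")
```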

  16. Software Development in the Water Sciences: a view from the divide (Invited)

    NASA Astrophysics Data System (ADS)

    Miles, B.; Band, L. E.

    2013-12-01

    While statistical methods are an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research, scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) that while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.

  17. Supply chain network design problem for a new market opportunity in an agile manufacturing system

    NASA Astrophysics Data System (ADS)

    Babazadeh, Reza; Razmi, Jafar; Ghodsi, Reza

    2012-08-01

    The characteristics of today's competitive environment, such as the speed with which products are designed, manufactured, and distributed, and the need for higher responsiveness and lower operational cost, are forcing companies to search for innovative ways to do business. The concept of agile manufacturing has been proposed in response to these challenges. This paper addresses strategic- and tactical-level decisions in agile supply chain network design. An efficient mixed-integer linear programming model is developed that is able to consider the key characteristics of an agile supply chain, such as direct shipments, outsourcing, different transportation modes, discounts, alliances (process and information integration) between opened facilities, and maximum customer waiting time for deliveries. In addition, in the proposed model the capacities of facilities are determined as decision variables, whereas they are often assumed to be fixed. Computational results illustrate that the proposed model can be applied as a powerful tool in agile supply chain network design as well as in the integration of strategic decisions with tactical decisions.
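
    To make the shape of such a mixed-integer linear program concrete, the toy sketch below models only two of the features listed above (facility opening and outsourcing) with PuLP. The sets, costs, and capacities are invented for illustration; this is not the paper's formulation.

      import pulp

      # Toy agile supply-chain design: choose which plants to open, how to ship to
      # customers, and how much demand to outsource at a premium.
      plants, customers = ["p1", "p2"], ["c1", "c2", "c3"]
      open_cost = {"p1": 500, "p2": 400}
      cap = {"p1": 120, "p2": 80}
      demand = {"c1": 60, "c2": 50, "c3": 40}
      ship_cost = {("p1", "c1"): 4, ("p1", "c2"): 6, ("p1", "c3"): 9,
                   ("p2", "c1"): 7, ("p2", "c2"): 3, ("p2", "c3"): 4}
      outsource_cost = 15  # per unit bought from an external supplier

      prob = pulp.LpProblem("agile_scnd", pulp.LpMinimize)
      y = pulp.LpVariable.dicts("open", plants, cat="Binary")
      x = pulp.LpVariable.dicts("ship", ship_cost.keys(), lowBound=0)
      o = pulp.LpVariable.dicts("outsource", customers, lowBound=0)

      # Objective: fixed opening costs + shipping costs + outsourcing costs.
      prob += (pulp.lpSum(open_cost[p] * y[p] for p in plants)
               + pulp.lpSum(ship_cost[k] * x[k] for k in ship_cost)
               + pulp.lpSum(outsource_cost * o[c] for c in customers))
      for c in customers:                               # meet every customer demand
          prob += pulp.lpSum(x[(p, c)] for p in plants) + o[c] == demand[c]
      for p in plants:                                  # respect capacity of opened plants
          prob += pulp.lpSum(x[(p, c)] for c in customers) <= cap[p] * y[p]

      prob.solve(pulp.PULP_CBC_CMD(msg=False))
      print({p: y[p].value() for p in plants})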

  18. Biomarker Discovery Using New Metabolomics Software for Automated Processing of High Resolution LC-MS Data

    PubMed Central

    Hnatyshyn, S.; Reily, M.; Shipkova, P.; McClure, T.; Sanders, M.; Peake, D.

    2011-01-01

    Robust biomarkers of target engagement and efficacy are required at different stages of drug discovery. Liquid chromatography coupled to high resolution mass spectrometry provides the sensitivity, accuracy, and wide dynamic range required for identification of endogenous metabolites in biological matrices. LC-MS is a widely used tool for biomarker identification and validation. Typical high resolution LC-MS profiles from biological samples may contain more than a million mass spectral peaks corresponding to several thousand endogenous metabolites. Reduction of the total number of peaks, component identification, and statistical comparison across sample groups remain a difficult and time-consuming challenge. Blood samples from four groups of rats (male vs. female, fully satiated vs. food deprived) were analyzed using high resolution accurate mass (HRAM) LC-MS. All samples were separated using a 15 minute reversed-phase C18 LC gradient and analyzed in both positive and negative ion modes. Data were acquired at 15K resolution and 5 ppm mass measurement accuracy. The entire data set was analyzed using software developed in collaboration between Bristol-Myers Squibb and Thermo Fisher Scientific to determine the metabolic effects of food deprivation on rats. Metabolomic LC-MS data files are extraordinarily complex, and appropriate reduction of the number of spectral peaks via identification of related peaks and background removal is essential. A single component such as hippuric acid generates more than 20 related peaks, including isotopic clusters, adducts, and dimers. Plasma and urine may contain 500-1500 unique quantifiable metabolites. Noise filtering approaches including blank subtraction were used to reduce the number of irrelevant peaks. By grouping related signals such as isotopic peaks and alkali adducts, data processing was greatly simplified, reducing the total number of components by 10-fold. The software processes 48 samples in under 60 minutes. Principle
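
    The peak-reduction steps mentioned above (grouping related signals and blank subtraction) can be sketched in a few lines. The snippet below is a deliberately simplified illustration, not the Bristol-Myers Squibb/Thermo Fisher software: it groups only +1 carbon-13 isotopologues at the same retention time and applies a simple fold-over-blank filter; tolerances and function names are assumptions.

      import numpy as np

      C13_DELTA = 1.00336   # mass difference between 13C and 12C isotopologues (u)

      def group_related_peaks(mzs, rt, rt_tol=0.05, ppm=5.0):
          """Assign peaks that look like isotopologues of the same component to one group.

          mzs, rt: 1D NumPy arrays of peak m/z values and retention times (minutes).
          Adducts, dimers and charge states handled by real metabolomics software
          are omitted here.
          """
          order = np.argsort(mzs)
          mzs, rt = mzs[order], rt[order]
          group_id = -np.ones(len(mzs), dtype=int)
          next_id = 0
          for i in range(len(mzs)):
              if group_id[i] < 0:
                  group_id[i] = next_id
                  next_id += 1
              # Look for the +1 isotopologue within the ppm and RT tolerances.
              target = mzs[i] + C13_DELTA
              tol = target * ppm * 1e-6
              cand = np.where((np.abs(mzs - target) < tol) &
                              (np.abs(rt - rt[i]) < rt_tol))[0]
              for j in cand:
                  group_id[j] = group_id[i]
          return group_id

      def blank_subtract(sample_intensity, blank_intensity, fold=3.0):
          """Keep only peaks whose intensity exceeds `fold` times the blank signal."""
          return sample_intensity > fold * blank_intensity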

  19. Full Text Information Retrieval Software for Use in Government Administration (A Selection Procedure and Process).

    ERIC Educational Resources Information Center

    Silbergeld, Israel; Reginiano-Peterson, Naomi

    1987-01-01

    A project involving selection procedures for choosing full-text information retrieval software for government administration-oriented applications and users was undertaken by an Israeli government-owned computer company. A case study and comparison of six software systems were done and a list of 27 selection criteria compiled. (Author/EM)

  20. NeuronMetrics: Software for Semi-Automated Processing of Cultured-Neuron Images

    PubMed Central

    Narro, Martha L.; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L.

    2007-01-01

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics™ for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch-number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of ~60 2D images is 1.0–2.5 hours, from a folder of images to a table of numeric data. NeuronMetrics’ output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery. PMID:17270152
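
    The skeletonize-and-measure idea at the core of the description above can be illustrated with scikit-image. This is a hypothetical sketch only: NeuronMetrics itself is implemented as ImageJ modules and combines two skeletonization techniques plus gap spanning and face-based branch counting, none of which are reproduced here.

      import numpy as np
      from skimage import filters, morphology

      def neurite_length_estimate(image, pixel_size_um=1.0):
          """Rough total-neurite-length estimate from a 2D fluorescence image."""
          # Threshold the neuron against background and remove small speckles.
          mask = image > filters.threshold_otsu(image)
          mask = morphology.remove_small_objects(mask, min_size=64)
          # One-pixel-wide skeleton; its pixel count approximates total neurite length.
          skeleton = morphology.skeletonize(mask)
          return skeleton.sum() * pixel_size_um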

  1. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do NOT want software to do along with what you want it to do, and assuming things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  2. An Adaptive, Agile, Reconfigurable Photonic System for Handling Analog Signals

    NASA Astrophysics Data System (ADS)

    Middleton, C.; DeSalvo, R.; Escalera, N.

    2014-09-01

    Photonic techniques can be applied to microwave and millimeter wave transmission and signal processing challenges, including signal transport, distribution, filtering, and up- and down-conversion. We present measured performance results for a wideband photonic-assisted frequency converter with 4 GHz instantaneous bandwidth and full spectral coverage up to 45 GHz. The photonic-assisted converter is applicable for both ground and space applications. We show the system performance in a ground station application, in which high frequency analog signals were transported over a moderate distance and down-converted directly into a digitizing receiver. We also describe our progress in the packaging and space qualification of the photonic system, and discuss the next steps toward higher TRL. The photonic system provides an adaptive, agile, reconfigurable backbone for handling analog signals, with performance superior to existing microwave systems.

  3. Agile Bodies: A New Imperative in Neoliberal Governance

    ERIC Educational Resources Information Center

    Gillies, Donald

    2011-01-01

    Modern business discourse suggests that a key bulwark against market fluctuation and the threat of failure is for organizations to become "agile", a more dynamic and proactive position than that previously afforded by mere "flexibility". The same idea is also directed at the personal level, it being argued that the "agile" individual is better…

  4. Integrated product definition representation for agile numerical control applications

    SciTech Connect

    Simons, W.R. Jr.; Brooks, S.L.; Kirk, W.J. III; Brown, C.W.

    1994-11-01

    Realization of agile manufacturing capabilities for a virtual enterprise requires the integration of technology, management, and work force into a coordinated, interdependent system. This paper is focused on technology enabling tools for agile manufacturing within a virtual enterprise specifically relating to Numerical Control (N/C) manufacturing activities and product definition requirements for these activities.

  5. Agile manufacturing in Intelligence, Surveillance and Reconnaissance (ISR)

    NASA Astrophysics Data System (ADS)

    DiPadua, Mark; Dalton, George

    2016-05-01

    The objective of the Agile Manufacturing for Intelligence, Surveillance, and Reconnaissance (AMISR) effort is to research, develop, design and build a prototype multi-intelligence (multi-INT), reconfigurable pod demonstrating benefits of agile manufacturing and a modular open systems approach (MOSA) to make podded intelligence, surveillance, and reconnaissance (ISR) capability more affordable and operationally flexible.

  6. X-36 Tailless Fighter Agility Research Aircraft arrival at Dryden

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA and McDonnell Douglas Corporation (MDC) personnel remove protective covers from the newly arrived NASA/McDonnell Douglas Corporation X-36 Tailless Fighter Agility Research Aircraft. It arrived at NASA Dryden Flight Research Center, Edwards, California, on July 2, 1996. The NASA/Boeing X-36 Tailless Fighter Agility Research Aircraft program successfully demonstrated the tailless fighter design using advanced technologies to improve the maneuverability and survivability of possible future fighter aircraft. The program met or exceeded all project goals. For 31 flights during 1997 at the Dryden Flight Research Center, Edwards, California, the project team examined the aircraft's agility at low speed / high angles of attack and at high speed / low angles of attack. The aircraft's speed envelope reached up to 206 knots (234 mph). This aircraft was very stable and maneuverable. It handled very well. The X-36 vehicle was designed to fly without the traditional tail surfaces common on most aircraft. Instead, a canard forward of the wing was used as well as split ailerons and an advanced thrust-vectoring nozzle for directional control. The X-36 was unstable in both pitch and yaw axes, so an advanced, single-channel digital fly-by-wire control system (developed with some commercially available components) was put in place to stabilize the aircraft. Using a video camera mounted in the nose of the aircraft and an onboard microphone, the X-36 was remotely controlled by a pilot in a ground station virtual cockpit. A standard fighter-type head-up display (HUD) and a moving-map representation of the vehicle's position within the range in which it flew provided excellent situational awareness for the pilot. This pilot-in-the-loop approach eliminated the need for expensive and complex autonomous flight control systems and the risks associated with their inability to deal with unknown or unforeseen phenomena in flight. Fully fueled the X-36 prototype weighed approximately 1

  7. A software architecture for multi-cellular system simulations on graphics processing units.

    PubMed

    Jeannin-Girardon, Anne; Ballet, Pascal; Rodin, Vincent

    2013-09-01

    The first aim of simulation in a virtual environment is to help biologists gain a better understanding of the simulated system. The cost of such simulation is significantly reduced compared to that of in vivo experimentation. However, the inherent complexity of biological systems makes them hard to simulate on non-parallel architectures: models might be made of sub-models and take several scales into account, and the number of simulated entities may be quite large. Today, graphics cards are used for general-purpose computing, which has been made easier thanks to frameworks like CUDA and OpenCL. Parallelization of models may, however, not be easy: parallel programming skills are often required, and several hardware architectures may be used to execute models. In this paper, we present the software architecture we built in order to implement various models able to simulate multi-cellular systems. This architecture is modular and implements data structures adapted to graphics processing unit architectures. It allows efficient simulation of biological mechanisms. PMID:23900760
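
    One data-structure principle behind GPU-adapted architectures like the one described above is the structure-of-arrays layout, where each cell attribute is stored contiguously so that neighbouring threads touch neighbouring memory. The sketch below illustrates only that layout, in NumPy and on the CPU; the paper's framework uses CUDA/OpenCL kernels, and the attributes and update rule here are invented for illustration.

      import numpy as np

      # Structure-of-arrays layout: one contiguous array per cell attribute.
      n_cells = 100_000
      pos = np.random.rand(n_cells, 2).astype(np.float32)    # cell positions
      radius = np.full(n_cells, 5e-3, dtype=np.float32)      # cell radii
      age = np.zeros(n_cells, dtype=np.float32)               # cell ages

      def step(pos, radius, age, dt=0.1):
          """One data-parallel step applied uniformly to every cell."""
          age += dt
          # Simple random motility; a real model would add adhesion, division, etc.
          pos += dt * 1e-3 * np.random.randn(*pos.shape).astype(np.float32)
          return pos, radius, age

      pos, radius, age = step(pos, radius, age)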

  8. The Pan-STARRS Data Processing and Science Analysis Software Systems

    SciTech Connect

    Heasley, J. N.

    2008-12-05

    The Panoramic Survey Telescope and Rapid Response System (Pan-STARRS) will use gigapixel CCD cameras on multiaperture telescopes to survey the sky in the visible and infrared bands. A single telescope system (PS1) has been deployed on Maui, and a four-telescope system (PS4) will be sited on Mauna Kea on the Big Island of Hawaii. These systems will survey the sky repeatedly and will generate petabytes of image data and catalogs of billions of stars and galaxies. Each set of images will be combined to create a very sensitive multicolor image of the sky, and differences between images will provide for a massive database of 'time domain astronomy' including the study of moving objects and transient or variable objects. All data from PS1 will be put into the public domain following its 3.5 year survey. The project faces formidable challenges in processing the image data in near real time and making the catalog data accessible via relational databases. In this talk, I describe the software systems developed by the Pan-STARRS project and how these core systems will be augmented by an assortment of science 'servers' being developed by astronomers in the PS1 Science Consortium.

  9. The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1992-01-01

    The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.
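
    One of the quantitative procedures mentioned above, spectral unmixing, can be sketched as a per-pixel least-squares problem. The snippet below is a generic NumPy illustration (unconstrained linear mixing, with non-negativity and sum-to-one constraints omitted); SIPS itself is written in IDL and its unmixing code is not reproduced here.

      import numpy as np

      def unmix(pixel_spectrum, endmembers):
          """Estimate endmember abundances for one pixel by least squares.

          pixel_spectrum: (bands,) measured reflectance.
          endmembers:     (bands, n_endmembers) library spectra.
          """
          abundances, *_ = np.linalg.lstsq(endmembers, pixel_spectrum, rcond=None)
          residual = pixel_spectrum - endmembers @ abundances
          return abundances, np.linalg.norm(residual)

      # Example with three synthetic endmembers over 50 spectral bands.
      bands = np.linspace(0.4, 2.5, 50)
      E = np.stack([np.exp(-(bands - c) ** 2 / 0.1) for c in (0.8, 1.6, 2.2)], axis=1)
      pixel = E @ np.array([0.5, 0.3, 0.2]) + 0.01 * np.random.randn(50)
      est, err = unmix(pixel, E)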

  10. Knowledge work productivity effect on quality of knowledge work in software development process in SME

    NASA Astrophysics Data System (ADS)

    Yusoff, Mohd Zairol; Mahmuddin, Massudi; Ahmad, Mazida

    2016-08-01

    Knowledge and skill are necessary to develop the capability of knowledge workers. However, there is very little understanding of what the necessary knowledge work (KW) is and how it influences the quality of knowledge work, or knowledge work productivity (KWP), in the software development process, including in small and medium-sized enterprises (SMEs). SMEs constitute a major part of the economy and have been relatively unsuccessful in developing KWP. Accordingly, this paper seeks to explore the dimensions of KWP that affect the quality of KW in the SME environment. First, based on an analysis of the existing literature, the key characteristics of KW productivity are defined. Second, a conceptual model is proposed that explores the dimensions of KWP and its quality. This study analyses data collected from 150 respondents (based on [1]) involved in SMEs in Malaysia and validates the model using structural equation modeling (SEM). The results provide an analysis of the effect of KWP on the quality of KW and business success, and have significant relevance for both research and practice in the SME

  11. Overview of software options for processing, analysis and interpretation of mass spectrometric proteomic data.

    PubMed

    Haga, Steve W; Wu, Hui-Fen

    2014-10-01

    Recently, interest in proteomics has increased greatly, and proteomic methods have been widely applied to many problems in cell biology. If the 1990s are considered the decade of genomics, then the following years of the new century can be claimed as the decade of proteomics. The rapid evolution of proteomics has continued through these years, with a series of innovations in separation techniques and in the core technologies of two-dimensional gel electrophoresis and MS. Both technologies are fueled by automation and high-throughput computation for profiling proteins from biological systems. As Patterson once noted, 'data analysis is the Achilles heel of proteomics and our ability to generate data now outstrips our ability to analyze it'. The development of automatic, high-throughput technologies for rapid protein identification is essential for large-scale proteome projects, and automatic protein identification and characterization is essential for high-throughput proteomics. This review provides a snapshot of the tools and applications that are available for mass spectrometric high-throughput biocomputation. The review starts with a brief introduction to proteomics and MS. Computational tools that can be employed at various stages of analysis are presented, including those for data processing, identification, quantification, and the understanding of the biological functions of individual proteins and their dynamic interactions. The challenges of computational software development and its future trends in MS-based proteomics are also discussed. PMID:25303385

  12. New image processing software for analyzing object size-frequency distributions, geometry, orientation, and spatial distribution

    NASA Astrophysics Data System (ADS)

    Beggan, Ciarán; Hamilton, Christopher W.

    2010-04-01

    Geological Image Analysis Software (GIAS) combines basic tools for calculating object area, abundance, radius, perimeter, eccentricity, orientation, and centroid location, with the first automated method for characterizing the spatial distribution of objects using sample-size-dependent nearest neighbor (NN) statistics. The NN analyses include tests for (1) Poisson, (2) Normalized Poisson, (3) Scavenged k=1, and (4) Scavenged k=2 NN distributions. GIAS is implemented in MATLAB with a Graphical User Interface (GUI) that is available as pre-parsed pseudocode for use with MATLAB, or as a stand-alone application that runs on Windows and Unix systems. GIAS can process raster data (e.g., satellite imagery, photomicrographs, etc.) and tables of object coordinates to characterize the size, geometry, orientation, and spatial organization of a wide range of geological features. This information expedites quantitative measurements of 2D object properties, provides criteria for validating the use of stereology to transform 2D object sections into 3D models, and establishes a standardized NN methodology that can be used to compare the results of different geospatial studies and identify objects using non-morphological parameters.
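
    The idea behind nearest-neighbor statistics can be shown with a short SciPy sketch: compare the mean observed NN distance against the expectation for a Poisson (completely random) process of the same density. This is a generic Clark-Evans-style illustration, not the specific Poisson, normalized-Poisson, or scavenged-k tests implemented in GIAS.

      import numpy as np
      from scipy.spatial import cKDTree

      def nn_statistics(xy, area):
          """Mean nearest-neighbour distance and its ratio R to the Poisson expectation.

          xy:   (n, 2) object centroid coordinates.
          area: area of the study region (same length units, squared).
          R < 1 suggests clustering; R > 1 suggests dispersion.
          """
          tree = cKDTree(xy)
          # k=2 because each point's nearest neighbour at k=1 is itself.
          dists, _ = tree.query(xy, k=2)
          mean_nn = dists[:, 1].mean()
          density = len(xy) / area
          expected = 0.5 / np.sqrt(density)      # expectation under complete spatial randomness
          return mean_nn, mean_nn / expected

      # Example: 200 random points in a unit square should give R close to 1.
      pts = np.random.rand(200, 2)
      print(nn_statistics(pts, area=1.0))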

  13. Software-Based Real-Time Acquisition and Processing of PET Detector Raw Data.

    PubMed

    Goldschmidt, Benjamin; Schug, David; Lerche, Christoph W; Salomon, André; Gebhardt, Pierre; Weissler, Bjoern; Wehner, Jakob; Dueppenbecker, Peter M; Kiessling, Fabian; Schulz, Volkmar

    2016-02-01

    In modern positron emission tomography (PET) readout architectures, the position and energy estimation of scintillation events (singles) and the detection of coincident events (coincidences) are typically carried out on highly integrated, programmable printed circuit boards. The implementation of advanced singles and coincidence processing (SCP) algorithms for these architectures is often limited by the strict constraints of hardware-based data processing. In this paper, we present a software-based data acquisition and processing architecture (DAPA) that offers a high degree of flexibility for advanced SCP algorithms through relaxed real-time constraints and an easily extendible data processing framework. The DAPA is designed to acquire detector raw data from independent (but synchronized) detector modules and process the data for singles and coincidences in real-time using a center-of-gravity (COG)-based, a least-squares (LS)-based, or a maximum-likelihood (ML)-based crystal position and energy estimation approach (CPEEA). To test the DAPA, we adapted it to a preclinical PET detector that outputs detector raw data from 60 independent digital silicon photomultiplier (dSiPM)-based detector stacks and evaluated it with a [(18)F]-fluorodeoxyglucose-filled hot-rod phantom. The DAPA is highly reliable with less than 0.1% of all detector raw data lost or corrupted. For high validation thresholds (37.1 ± 12.8 photons per pixel) of the dSiPM detector tiles, the DAPA is real time capable up to 55 MBq for the COG-based CPEEA, up to 31 MBq for the LS-based CPEEA, and up to 28 MBq for the ML-based CPEEA. Compared to the COG-based CPEEA, the rods in the image reconstruction of the hot-rod phantom are only slightly better separable and less blurred for the LS- and ML-based CPEEA. While the coincidence time resolution (∼ 500 ps) and energy resolution (∼12.3%) are comparable for all three CPEEA, the system sensitivity is up to 2.5 × higher for the LS- and ML-based CPEEA
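
    The simplest of the three position-and-energy estimators mentioned above, the center-of-gravity (COG) approach, amounts to an intensity-weighted average over the detector pixels. The sketch below is a generic NumPy illustration of that idea; the DAPA's actual COG, LS, and ML estimators and its detector geometry are not reproduced, and the coordinate arrays are assumptions.

      import numpy as np

      def cog_position_energy(pixel_counts, pixel_x, pixel_y):
          """Centre-of-gravity position and energy estimate for one single.

          pixel_counts: (n_pixels,) photon counts from one detector tile.
          pixel_x, pixel_y: (n_pixels,) pixel centre coordinates in mm.
          """
          energy = pixel_counts.sum()              # total collected photons ~ deposited energy
          x = (pixel_counts * pixel_x).sum() / energy
          y = (pixel_counts * pixel_y).sum() / energy
          return x, y, energy

      # Example: a 4x4 tile with the light distribution centred near (1.5 mm, 0.5 mm).
      xs, ys = np.meshgrid(np.arange(4.0), np.arange(4.0))
      counts = np.exp(-((xs - 1.5) ** 2 + (ys - 0.5) ** 2))
      print(cog_position_energy(counts.ravel(), xs.ravel(), ys.ravel()))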

  14. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  15. Exploration of the OBIA methods available in SPRING noncommercial software to UAV data processing

    NASA Astrophysics Data System (ADS)

    Teodoro, A. C.; Araújo, R.

    2014-10-01

    The increasing development of Unmanned Aerial Vehicle (UAV) platforms and associated sensing technologies offers a broad range of solutions for different applications related to the acquisition of information about objects or phenomena at the Earth's surface. The huge amount of data provided by UAVs represents a new challenge for the development of image processing techniques. Object-based image analysis (OBIA) is highly suitable for very high resolution imagery, where pixel-based classification is less successful due to the high spatial variability within objects of interest. An OBIA approach using the SPRING® non-commercial software was implemented in this work. The UAV system used was a Swinglet from Sensefly. The ortho-mosaic, with a 0.04 m pixel size, acquired on 20 January 2012 over an approximately 500×400 m area of the Coimbra (Portugal) region, was processed using the original 41 images. Different combinations of the "similarity" and "area" parameters were computed in the segmentation stage (region-based). First, a supervised classification was employed, considering 7 classes based on the Corine land cover nomenclature. Several parameter combinations yielded a Kappa > 0.9 and an overall accuracy > 90%; however, several objects were not classified. An unsupervised classification was then performed and 27 classes were defined. Afterwards, a new supervised classification was performed considering 22 of the 27 identified classes, with an overall accuracy of 82.58% and a Kappa of 0.817. We conclude that the algorithms employed in this work are not the most suitable for this kind of spatial resolution. The use of data mining algorithms could improve the results.
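
    The two accuracy figures reported above (overall accuracy and Cohen's Kappa) are computed from the classification confusion matrix. The sketch below shows the standard definitions with a made-up 3-class example; it is not code from SPRING and the example numbers are invented.

      import numpy as np

      def accuracy_and_kappa(confusion):
          """Overall accuracy and Cohen's kappa from a confusion matrix.

          confusion[i, j] = number of validation samples of true class i
          assigned to class j.
          """
          n = confusion.sum()
          observed = np.trace(confusion) / n                                # overall accuracy
          expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / n ** 2
          kappa = (observed - expected) / (1.0 - expected)
          return observed, kappa

      # Example with a small, invented 3-class confusion matrix.
      cm = np.array([[50, 2, 3],
                     [4, 45, 1],
                     [2, 3, 40]])
      print(accuracy_and_kappa(cm))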

  16. Analysis of VLF signals associated to AGILE Terrestrial Gamma-ray Flashes detected over Central America

    NASA Astrophysics Data System (ADS)

    Marisaldi, Martino; Lyu, Fanchao; Cummer, Steven; Ursi, Alessandro

    2016-04-01

    Analysis of radio signals detected on the ground and associated with Terrestrial Gamma-ray Flashes (TGFs) has proven to be a successful tool for extracting information on the TGF itself and the possible associated lightning process. Triangulation of Very Low Frequency (VLF) signals by means of the Time Of Arrival technique provides the TGF location with an accuracy of a few km. The AGILE satellite routinely observes TGFs in a narrow band across the Equator, limited by the small satellite orbital inclination (2.5°). Until recently, however, it was not possible to provide firm associations between AGILE TGFs and radio signals, because of two main limiting factors. First, dead-time effects led to a bias towards long-duration events in the AGILE TGF sample, which are less likely to be associated with strong radio pulses. In addition, most VLF detection networks are less sensitive along the equatorial region. Since the end of March 2015, a major change in the AGILE MiniCalorimeter instrument configuration has resulted in a tenfold increase in the TGF detection rate and in the detection of events as short as 20 microseconds. 14% of the events in the new sample were simultaneous (within 200 microseconds) with sferics detected by the World Wide Lightning Location Network (WWLLN); therefore, a source localisation is available for these events. We present here the first analysis of VLF waveforms associated with AGILE TGFs observed above Central America, detected by magnetic field sensors deployed in Puerto Rico. Among the seven TGFs with a WWLLN location at a distance of less than 10000 km from the sensors, four have detectable signals. These events are the closest to the sensors, at distances of less than 7500 km. We present the properties of these TGFs and the characteristics of the associated radio waveforms.
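
    The Time Of Arrival triangulation mentioned above boils down to finding the source position and emission time that best explain the arrival times at several stations. The snippet below is a toy planar least-squares sketch of that idea only: real networks such as WWLLN work on the spherical Earth with realistic VLF propagation speeds, and the station layout and numbers here are invented.

      import numpy as np
      from scipy.optimize import least_squares

      C = 299792.458  # assumed propagation speed in km/s (free-space light speed)

      def locate_sferic(station_xy_km, arrival_times_s):
          """Planar time-of-arrival source location from three or more stations."""
          def residuals(params):
              x, y, t0 = params
              d = np.hypot(station_xy_km[:, 0] - x, station_xy_km[:, 1] - y)
              return (t0 + d / C) - arrival_times_s

          guess = [*station_xy_km.mean(axis=0), arrival_times_s.min()]
          sol = least_squares(residuals, guess)
          return sol.x  # (x_km, y_km, t0_s)

      # Synthetic check: source at (200, -150) km, emitted at t0 = 0.
      stations = np.array([[0, 0], [800, 0], [0, 800], [600, 700]], float)
      times = np.hypot(stations[:, 0] - 200, stations[:, 1] + 150) / C
      print(locate_sferic(stations, times))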

  17. Prima Platform: A Scheme for Managing Equipment-Dependent Onboard Functions and Impacts on the Avionics Software Production Process

    NASA Astrophysics Data System (ADS)

    Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario

    2010-08-01

    The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) spacecraft PRIMA Platform (Piattaforma Riconfigurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) that provide the on-board control functions. During recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions for promoting abstraction and obtaining a more efficient and safe ASW production, with positive implications also for the software validation activities. This paper is dedicated to the characterisation activity performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it changes after the introduction of the new design features.

  18. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also an evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix computations: QR/QL decomposition, Cholesky solution of linear systems, computation of eigenvalues/eigenvectors, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. Useful links: http
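
    The PSD estimate that STK displays can be sketched generically with SciPy's Welch method. This is not STK code (STK is written in C/C++); the sampling rate, segment length, and synthetic signal below are assumptions chosen purely for illustration.

      import numpy as np
      from scipy.signal import welch

      fs = 100.0                                   # sampling frequency, Hz
      t = np.arange(0, 600, 1.0 / fs)              # ten minutes of synthetic signal
      signal = np.sin(2 * np.pi * 0.2 * t) + 0.5 * np.random.randn(t.size)

      # Welch power spectral density estimate, then a log-scale view as in STK plots.
      freqs, psd = welch(signal, fs=fs, nperseg=4096)
      psd_db = 10 * np.log10(psd)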

  19. Extensible Markup Language: How Might It Alter the Software Documentation Process and the Role of the Technical Communicator?

    ERIC Educational Resources Information Center

    Battalio, John T.

    2002-01-01

    Describes the influence that Extensible Markup Language (XML) will have on the software documentation process and subsequently on the curricula of advanced undergraduate and master's programs in technical communication. Recommends how curricula of advanced undergraduate and master's programs in technical communication ought to change in order to…

  20. What's New in Software? Computers and the Writing Process: Strategies That Work.

    ERIC Educational Resources Information Center

    Ellsworth, Nancy J.

    1990-01-01

    The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)