Sample records for build successful software

  1. Software Build and Delivery Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robey, Robert W.

    2016-07-10

    This presentation deals with the hierarchy of software build and delivery systems. One of the goals is to maximize the success rate of new users and developers when first trying your software. First impressions are important. Early successes are important. This also reduces critical documentation costs. This is a presentation focused on computer science and goes into detail about code documentation.

  2. Building Successful GitHub Communities

    NASA Astrophysics Data System (ADS)

    Smith, A.

    2014-12-01

    Building successful online communities is hard, whether it's in open source software or web-based citizen science. In this presentation I'll share some lessons learned and outline some techniques employed by successful open source projects.

  3. LHCb Build and Deployment Infrastructure for run 2

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Couturier, B.

    2015-12-01

    After the successful run 1 of the LHC, the LHCb Core Software team took advantage of the long shutdown to consolidate and improve its build and deployment infrastructure. Several of the related projects have already been presented, such as the Jenkins-based build system and the LHCb performance and regression testing infrastructure. Some components are completely new, like the Software Configuration Database (using the graph database Neo4j) and the new package installation based on RPM packages. Furthermore, all these parts are integrated to allow easier and quicker releases of the LHCb software stack, thereby reducing the risk of operational errors. Integration and regression tests are also now easier to implement, making it possible to further improve the software checks.
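
    The abstract mentions a Software Configuration Database held in the Neo4j graph database. As a hedged illustration only, a dependency query against such a database might look like the sketch below, using the official neo4j Python driver; the node labels, relationship name, and project/version values are invented for the sketch and are not the actual LHCb schema.

      # Minimal sketch of querying a software configuration database in Neo4j.
      # The Project label and DEPENDS_ON relationship are hypothetical stand-ins.
      from neo4j import GraphDatabase

      driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

      def transitive_dependencies(project, version):
          """Return every package a given project version depends on, directly or not."""
          query = (
              "MATCH (p:Project {name: $name, version: $version})"
              "-[:DEPENDS_ON*]->(d:Project) "
              "RETURN DISTINCT d.name AS name, d.version AS version"
          )
          with driver.session() as session:
              result = session.run(query, name=project, version=version)
              return [(rec["name"], rec["version"]) for rec in result]

      print(transitive_dependencies("Gaudi", "v27r0"))  # hypothetical project/version
      driver.close()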

  4. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  5. Unidata: 30 Years of FOSS for the Geosciences

    NASA Astrophysics Data System (ADS)

    Davis, E.; Ramamurthy, M. K.; Young, J. W.; Fisher, W. I.; Rew, R. K.

    2015-12-01

    Unidata's core mission is to serve academic research and education communities by facilitating access to and use of real-time weather data. To this end, Unidata develops, distributes, and supports several Free and Open Source Software (FOSS) packages. These packages are largely focused on data management, access, analysis and visualization. This presentation will discuss the lessons Unidata has gathered over thirty years of FOSS development, support, and community building. These lessons include what it takes to be a successful FOSS organization, how to adapt to changing "best practices" and the emergence of new FOSS tools and services, and techniques for dealing with software end-of-life. We will also discuss our approach when supporting a varied user community spanning end users and software developers. Strong user support has been an important key to Unidata's successful community building.

  6. Critical Factors Analysis for Offshore Software Development Success by Structural Equation Modeling

    NASA Astrophysics Data System (ADS)

    Wada, Yoshihisa; Tsuji, Hiroshi

    To analyze the success/failure factors in offshore software development services by structural equation modeling, this paper proposes combining two approaches: domain-knowledge-based heuristic analysis and factor-analysis-based rational analysis. The former serves to generate and verify hypotheses that identify factors and causalities; the latter serves to verify factors introduced by theory, building the model without heuristics. Applying the combined approaches to questionnaire responses from skilled project managers, the paper finds that vendor properties have a stronger causal influence on success than software properties or project properties.
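
    To make the modeling step concrete, a structural equation model with these three factor groups could be specified as in the sketch below. This is a hedged illustration using the semopy Python package; the paper does not publish its tooling or questionnaire items, so the variable names and model specification are invented.

      # Sketch of an SEM relating vendor, software, and project properties to
      # offshore development success. Observed column names are hypothetical.
      import pandas as pd
      from semopy import Model

      SPEC = """
      vendor   =~ v_skill + v_communication + v_turnover
      software =~ s_size + s_novelty
      project  =~ p_schedule + p_requirements
      success ~ vendor + software + project
      """

      data = pd.read_csv("questionnaire_responses.csv")  # hypothetical survey data
      model = Model(SPEC)
      model.fit(data)
      print(model.inspect())  # path coefficients with standard errors and p-values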

  7. SOA Governance: A Critical SOA Success Factor

    DTIC Science & Technology

    2010-04-01

    Building blocks of a SOA: a Service is a software-implemented capability that is well-defined, self-contained, and does not depend on the context or state of other services. A Service Consumer is a service, application, or other software component that requires a specific service; it locates the service through a registry and initiates it.

  8. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  9. Tips for Ensuring Successful Software Implementation

    ERIC Educational Resources Information Center

    Weathers, Robert

    2013-01-01

    Implementing an enterprise-level, mission-critical software system is an infrastructure project akin to other sizable projects, such as building a school. It's costly and complex, takes a year or more to complete, requires the collaboration of many different parties, involves uncertainties, results in a long-lived asset requiring ongoing…

  10. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  11. Network Solutions.

    ERIC Educational Resources Information Center

    Vietzke, Robert; And Others

    1996-01-01

    This special section explains the latest developments in networking technologies, profiles school districts benefiting from successful implementations, and reviews new products for building networks. Highlights include ATM (asynchronous transfer mode), cable modems, networking switches, Internet screening software, file servers, network management…

  12. Achieving design reuse: a case study

    NASA Astrophysics Data System (ADS)

    Young, Peter J.; Nielsen, Jon J.; Roberts, William H.; Wilson, Greg M.

    2008-08-01

    The RSAA CICADA data acquisition and control software package uses an object-oriented approach to model astronomical instrumentation and a layered architecture for implementation. Emphasis has been placed on building reusable C++ class libraries and on the use of attribute/value tables for dynamic configuration; a sketch of the latter idea follows this record. This paper details how the approach has been successfully used in the construction of the instrument control software for the Gemini NIFS and GSAOI instruments. The software is again being used for the new RSAA SkyMapper and WiFeS instruments.
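
    CICADA itself is C++, so the following Python sketch is only a schematic of the attribute/value-table idea; the instrument attributes and defaults are invented for illustration.

      # Sketch of dynamic configuration via attribute/value tables: instrument
      # objects are assembled from tables at run time instead of hard-coded.
      class Instrument:
          def __init__(self, name, attributes):
              self.name = name
              # copy the attribute/value table onto the object dynamically
              for attribute, value in attributes.items():
                  setattr(self, attribute, value)

      # One table per instrument; supporting a new instrument means adding a
      # table, not writing new control code. Values here are hypothetical.
      CONFIG_TABLES = {
          "NIFS":  {"detector": "HAWAII-2RG", "readout_mode": "fowler", "coadds": 4},
          "GSAOI": {"detector": "HAWAII-2RG", "readout_mode": "bright", "coadds": 1},
      }

      instruments = {n: Instrument(n, t) for n, t in CONFIG_TABLES.items()}
      print(instruments["NIFS"].readout_mode)  # -> fowler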

  13. Building Information Model: advantages, tools and adoption efficiency

    NASA Astrophysics Data System (ADS)

    Abakumov, R. G.; Naumov, A. E.

    2018-03-01

    The paper expands on the definition and essence of Building Information Modeling. It describes the content of, and the effects from, applying Information Modeling at different stages of a real property item's life cycle, and analyzes the long-term and short-term advantages. The authors include an analytical review of the Revit software package in comparison with other Autodesk products with respect to features, advantages and disadvantages, cost, and payback period. A prognostic calculation of the efficiency of adopting Building Information Modeling technology is given, with examples of its successful adoption in Russia and worldwide.

  14. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data from clinical processes need to be managed in a convenient way, so that process information is reliable, valid, and viable for further analysis. However, existing tools for recording observations fall short at systematic data collection for specific workflow recordings. We present a software tool developed to facilitate the analysis of clinical process observations. The tool was successfully used in the OntoHealth project to build, store, and analyze observations of routine diabetes consultations.

  15. Agile Acceptance Test–Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software

    PubMed Central

    Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-01-01

    Background: Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.”

    Objective: We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse).

    Methods: Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite.

    Results: We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build.

    Conclusions: Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922
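
    The core idea above, acceptance tests expressed as clinician-reviewable tables of expected CDS behavior and executed automatically, can be sketched outside FitNesse as well. In the sketch below, the rule logic, table columns, and the swallow_alert_fires function are all hypothetical; the authors' actual fixtures query their EHR database through FitNesse.

      # Sketch of table-driven acceptance testing for a CDS advisory: each row
      # states one expected behavior; the suite fails if the rule under test
      # disagrees. The rule and table contents are invented examples.
      def swallow_alert_fires(department, diagnosis, route, assessment_done):
          """Stand-in for the configured CDS rule being tested."""
          return (department == "ED" and diagnosis == "suspected stroke"
                  and route == "oral" and not assessment_done)

      TEST_TABLE = [
          # department, diagnosis,          route,  assessment_done, expect_alert
          ("ED",        "suspected stroke", "oral", False,           True),
          ("ED",        "suspected stroke", "oral", True,            False),
          ("ED",        "suspected stroke", "IV",   False,           False),
          ("ICU",       "suspected stroke", "oral", False,           False),
      ]

      for dept, dx, route, done, expected in TEST_TABLE:
          actual = swallow_alert_fires(dept, dx, route, done)
          status = "PASS" if actual == expected else "FAIL"
          print(f"{status}: {dept}/{dx}/{route}/assessed={done} -> alert={actual}")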

  16. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    PubMed

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. We used test-driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the "executable requirements" are shown prior to building the CDS alert, during build, and after successful build. Automated acceptance test-driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test-driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization.

  17. Streamlining the Design-to-Build Transition with Build-Optimization Software Tools.

    PubMed

    Oberortner, Ernst; Cheng, Jan-Fang; Hillson, Nathan J; Deutsch, Samuel

    2017-03-17

    Scaling-up capabilities for the design, build, and test of synthetic biology constructs holds great promise for the development of new applications in fuels, chemical production, or cellular-behavior engineering. Construct design is an essential component in this process; however, not every designed DNA sequence can be readily manufactured, even using state-of-the-art DNA synthesis methods. Current biological computer-aided design and manufacture tools (bioCAD/CAM) do not adequately consider the limitations of DNA synthesis technologies when generating their outputs. Designed sequences that violate DNA synthesis constraints may require substantial sequence redesign or lead to price premiums and temporal delays, which adversely impact the efficiency of the DNA manufacturing process. We have developed a suite of build-optimization software tools (BOOST) to streamline the design-build transition in synthetic biology engineering workflows. BOOST incorporates knowledge of DNA synthesis success determinants into the design process to output ready-to-build sequences, preempting the need for sequence redesign. The BOOST web application is available at https://boost.jgi.doe.gov and its Application Program Interfaces (APIs) enable integration into automated, customized DNA design processes. The results presented here highlight the effectiveness of BOOST in reducing DNA synthesis costs and timelines.
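
    As a hedged illustration of what "DNA synthesis constraints" means in practice, the sketch below screens a sequence against two generic rules, overall GC content and homopolymer run length. The thresholds and the checks themselves are illustrative examples, not BOOST's actual rule set.

      # Sketch of screening a designed sequence against common synthesis
      # constraints. Real vendors and BOOST apply richer, vendor-specific rules.
      import re

      def synthesis_violations(seq, gc_min=0.25, gc_max=0.75, max_homopolymer=8):
          seq = seq.upper()
          problems = []
          gc = (seq.count("G") + seq.count("C")) / len(seq)
          if not gc_min <= gc <= gc_max:
              problems.append(f"GC content {gc:.2f} outside [{gc_min}, {gc_max}]")
          # flag any single-base run longer than max_homopolymer
          pattern = "(A{%d,}|C{%d,}|G{%d,}|T{%d,})" % ((max_homopolymer + 1,) * 4)
          run = re.search(pattern, seq)
          if run:
              problems.append(f"homopolymer run {run.group(0)} at position {run.start()}")
          return problems

      print(synthesis_violations("ATGC" * 20 + "A" * 12))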

  18. The Use of High Performance Computing (HPC) to Strengthen the Development of Army Systems

    DTIC Science & Technology

    2011-11-01

    accurately predicting the supersonic Magnus effect about spinning cones, ogive-cylinders, and boat-tailed afterbodies. This work led to the successful...successful computer model of the proposed product or system, one can then build prototypes on the computer and study the effects on the performance of...needed. The NRC report discusses the requirements for effective use of such computing power. One needs "models, algorithms, software, hardware

  19. Measurements of the LHCb software stack on the ARM architecture

    NASA Astrophysics Data System (ADS)

    Vijay Kartik, S.; Couturier, Ben; Clemencic, Marco; Neufeld, Niko

    2014-06-01

    The ARM architecture is a power-efficient design used in most mobile-device processors around the world today, since it provides reasonable compute performance per watt. The current LHCb software stack is designed (and thus expected) to build and run on machines with the x86/x86_64 architecture. This paper outlines the process of measuring the performance of the LHCb software stack on the ARM architecture - specifically, the ARMv7 architecture on Cortex-A9 processors from NVIDIA and on full-fledged ARM servers with chipsets from Calxeda - and makes comparisons with the performance on x86_64 architectures on the Intel Xeon L5520/X5650 and AMD Opteron 6272. The paper emphasises the aspects of performance per core with respect to the power drawn by the compute nodes for the given performance - this ensures a fair real-world comparison with much more 'powerful' Intel/AMD processors. The comparisons of these real workloads in the context of LHCb are also complemented with the standard synthetic benchmarks HEPSPEC and Coremark. The pitfalls and solutions for the non-trivial task of porting the source code to build for the ARMv7 instruction set are presented. The specific changes in the build process needed for ARM-specific portions of the software stack are described, to serve as pointers for further attempts taken up by other groups in this direction. Cases where architecture-specific tweaks at the assembler level (both in ROOT and the LHCb software stack) were needed for a successful compile are detailed - these cases are good indicators of where/how the software stack as well as the build system can be made more portable and multi-arch friendly. The experience gained from the tasks described in this paper is intended to i) assist in making an informed choice about ARM-based server solutions as a feasible low-power alternative to the current compute nodes, and ii) revisit the software design and build system for portability and generic improvements.

  20. Requirements Engineering in Building Climate Science Software

    NASA Astrophysics Data System (ADS)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the software team or users have control and responsibility for making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users, but gives them less control of it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.

  1. Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community

    NASA Astrophysics Data System (ADS)

    Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.

    2016-12-01

    The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less about building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is regular multi-day hackathons. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in the domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection that began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons for the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community. Over 20 scientists add >6,000 lines of code during the >1 week event. Participants grow comfortable contributing to the repository and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate that each will bring its own unique challenges.

  2. Prototyping machine vision software on the World Wide Web

    NASA Astrophysics Data System (ADS)

    Karantalis, George; Batchelor, Bruce G.

    1998-10-01

    Interactive image processing is a proven technique for analyzing industrial vision applications and building prototype systems. Several of the previous implementations have used dedicated hardware to perform the image processing, with a top layer of software providing a convenient user interface. More recently, self-contained software packages have been devised and these run on a standard computer. The advent of the Java programming language has made it possible to write platform-independent software, operating over the Internet or a company-wide intranet. Thus, there arises the possibility of designing at least some shop-floor inspection/control systems without the vision engineer ever entering the factories where they will be used. If successful, this project will have a major impact on the productivity of vision systems designers.

  3. Inductive knowledge acquisition experience with commercial tools for space shuttle main engine testing

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress in building the full system, known as Scotty, is reported, including extensions that integrate large databases with the system. Inductive knowledge acquisition has proven itself to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational data base management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful and state-of-the-art application of an expert system.

  4. Type Safe Extensible Programming

    NASA Astrophysics Data System (ADS)

    Chae, Wonseok

    2009-10-01

    Software products evolve over time. Sometimes they evolve by adding new features, and sometimes by either fixing bugs or replacing outdated implementations with new ones. When software engineers fail to anticipate such evolution during development, they will eventually be forced to re-architect or re-build from scratch. Therefore, it has been common practice to prepare for changes so that software products are extensible over their lifetimes. However, making software extensible is challenging because it is difficult to anticipate successive changes and to provide adequate abstraction mechanisms over potential changes. Such extensibility mechanisms, furthermore, should not compromise any existing functionality during extension. Software engineers would benefit from a tool that provides a way to add extensions in a reliable way. It is natural to expect programming languages to serve this role. Extensible programming is one effort to address these issues. In this thesis, we present type safe extensible programming using the MLPolyR language. MLPolyR is an ML-like functional language whose type system provides type-safe extensibility mechanisms at several levels. After presenting the language, we will show how these extensibility mechanisms can be put to good use in the context of product line engineering. Product line engineering is an emerging software engineering paradigm that aims to manage variations, which originate from successive changes in software.

  5. An Online Course of Business Statistics: The Proportion of Successful Students

    ERIC Educational Resources Information Center

    Pena-Sanchez, Rolando

    2009-01-01

    This article describes the students' academic progress in an online course of business statistics through interactive software assignments and diverse educational homework, which help these students build their own e-learning through basic competences, i.e., interpreting results and solving problems. Cross-tables were built for the categorical…
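
    The cross-tables mentioned can be reproduced in a few lines. The sketch below is a hedged illustration with invented category names and data, since the article's actual variables are not given.

      # Sketch of a cross-table of categorical outcomes, e.g. course outcome
      # by homework-completion level. Data values are invented.
      import pandas as pd

      df = pd.DataFrame({
          "homework_level": ["high", "high", "low", "low", "high", "low"],
          "outcome":        ["pass", "pass", "fail", "pass", "pass", "fail"],
      })
      print(pd.crosstab(df["homework_level"], df["outcome"], margins=True))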

  6. From BIM to GIS at the Smithsonian Institution

    NASA Astrophysics Data System (ADS)

    Günther-Diringer, Detlef

    2018-05-01

    In modern architecture and building management, BIM files (Building Information Models) are a basic prerequisite for the successful execution of construction engineering projects. The facilities department of the Smithsonian Institution maintains more than six hundred buildings. All facilities are available digitally in an ESRI ArcGIS environment, with a database connection providing information about individual rooms, their usage, and further maintenance details. These data are available organization-wide through an intranet viewer, but only as a two-dimensional representation. The goal of the project carried out was to develop a workflow from the available BIM models to the given GIS structure. The test environment comprised the BIM models of the Smithsonian museum buildings along the Washington Mall. Based on new software editions of Autodesk Revit, FME, and ArcGIS Pro, the workflow from BIM to the Smithsonian's GIS data structure was successfully developed and may be applied to the setup of the future 3D intranet viewer.

  7. Investigating interoperability of the LSST data management software stack with Astropy

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.

    2016-07-01

    The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases means that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it began and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affiliated packages.
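
    A small example of the kind of interoperability in question: catalog data produced by one stack can be handed to Astropy's core containers. The sketch below uses only public Astropy APIs; the in-memory dict stands in for a pipeline-side catalog, and the column names are invented (no actual LSST API is used).

      # Sketch of moving pipeline catalog output into Astropy data structures.
      import numpy as np
      from astropy.table import Table
      from astropy import units as u

      catalog = {
          "ra":   np.array([150.1, 150.2, 150.3]),  # hypothetical source positions
          "dec":  np.array([2.1, 2.2, 2.3]),
          "flux": np.array([12.5, 8.9, 30.1]),
      }

      t = Table(catalog)
      t["ra"].unit = u.deg
      t["dec"].unit = u.deg
      t.sort("flux")
      print(t)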

  8. Software Carpentry In The Hydrological Sciences

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that the scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice. In 2014, Software Carpentry workshops targeting earth/environmental sciences and hydrological modeling have been organized and run at the Massachusetts Institute of Technology, the US Army Corps of Engineers, the Community Surface Dynamics Modeling System Annual Meeting, and the Earth Science Information Partners Summer Meeting. In this presentation, we will share some of the successes in teaching this material, as well as discuss and present instructional material specific to hydrological modeling.

  9. An automatic speech recognition system with speaker-independent identification support

    NASA Astrophysics Data System (ADS)

    Caranica, Alexandru; Burileanu, Corneliu

    2015-02-01

    The novelty of this work lies in the application of an open source research software toolkit (CMU Sphinx) to train, build, and evaluate a speech recognition system, with speaker-independent support, for voice-controlled hardware applications. Moreover, we propose using the trained acoustic model to decode offline voice commands on embedded hardware, such as a low-cost ARMv6 SoC, the Raspberry Pi. This type of single-board computer, mainly used for educational and research activities, can serve as a proof-of-concept software and hardware stack for low-cost voice automation systems.
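
    For a sense of the software side, the sketch below shows offline keyword spotting with the community pocketsphinx Python bindings (0.1.x). This is an assumption for illustration: the paper does not name these bindings and worked with the CMU Sphinx toolchain directly, and the command phrase and threshold are invented.

      # Sketch of offline keyword spotting on an embedded board; LiveSpeech
      # streams microphone audio through the PocketSphinx decoder.
      from pocketsphinx import LiveSpeech

      # Spot a single illustrative voice command; the threshold is a typical
      # starting value, not a value from the paper.
      speech = LiveSpeech(lm=False, keyphrase="lights on", kws_threshold=1e-20)
      for phrase in speech:
          print("command detected:", phrase)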

  10. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  11. Building Project Management Communities: Exploring the Contribution of Patterns Supported by Web 2.0 Technologies

    ERIC Educational Resources Information Center

    Burd, Elizabeth L.; Hatch, Andrew; Ashurst, Colin; Jessop, Alan

    2009-01-01

    This article describes an approach whereby patterns are used to describe management issues and solutions to be used during the project management of team-based software development. The work describes how web 2.0 technologies have been employed to support the use and development of such patterns. To evaluate the success of patterns and the…

  12. Assessment of Indoor Route-finding Technology for People with Visual Impairment

    PubMed Central

    Kalia, Amy A.; Legge, Gordon E.; Roy, Rudrava; Ogale, Advait

    2010-01-01

    This study investigated navigation with route instructions generated by digital-map software and synthetic speech. Participants, either visually impaired or sighted wearing blindfolds, successfully located rooms in an unfamiliar building. Users with visual impairment demonstrated better route-finding performance when the technology provided distance information in number of steps rather than walking time or number of feet. PMID:21869851

  13. Rapid Risk Evaluation (ER2) Using MS Excel Spreadsheet: a Case Study of Fredericton (new Brunswick, Canada)

    NASA Astrophysics Data System (ADS)

    McGrath, H.; Stefanakis, E.; Nastev, M.

    2016-06-01

    Conventional knowledge of the flood hazard alone (extent and frequency) is not sufficient for informed decision-making. The public safety community needs tools and guidance to adequately undertake flood hazard risk assessment in order to estimate respective damages and social and economic losses. While many complex computer models have been developed for flood risk assessment, they require highly trained personnel to prepare the necessary input (hazard, inventory of the built environment, and vulnerabilities) and analyze model outputs. As such, tools which utilize open-source software or are built within popular desktop software programs are appealing alternatives. The recently developed Rapid Risk Evaluation (ER2) application runs scenario-based loss assessment analyses in a Microsoft Excel spreadsheet. User input is limited to a handful of intuitive drop-down menus used to describe the building type, age, occupancy, and the expected water level. Pending development of local depth-damage curves and other needed vulnerability parameters, those from the U.S. FEMA's Hazus-Flood software have been imported and are temporarily accessed in conjunction with user input to display exposure and estimated economic losses related to the structure and the content of the building. Building types and occupancies representative of those most exposed to flooding in Fredericton (New Brunswick) were introduced and test flood scenarios were run. The algorithm was successfully validated against results from the Hazus-Flood model for the same building types and flood depths.
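
    The loss computation behind such a tool reduces to interpolating a depth-damage curve at the flood depth and applying the damage fraction to the building value. The sketch below uses invented curve points and building value; the real application draws its curves from Hazus-Flood, keyed to building type and occupancy.

      # Sketch of a scenario loss estimate from a depth-damage curve.
      import numpy as np

      depth_ft  = np.array([0.0, 1.0, 2.0, 4.0, 8.0])      # water depth above first floor
      damage_pc = np.array([0.0, 0.18, 0.28, 0.43, 0.60])  # fraction of structure value

      def structure_loss(flood_depth_ft, replacement_value):
          fraction = np.interp(flood_depth_ft, depth_ft, damage_pc)
          return fraction * replacement_value

      print(structure_loss(3.0, 250_000))  # -> 88750.0 at the interpolated 35.5% damage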

  14. Design and development of Building energy simulation Software for prefabricated cabin type of industrial building (PCES)

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Li, Ri Yi

    2018-06-01

    Building energy simulation is an important supporting tool for green building design and building energy consumption assessment. At present, building energy simulation software cannot meet the needs of energy consumption analysis and cabinet-level microenvironment control design for prefabricated buildings. A thermal physical model of prefabricated buildings is proposed in this paper; based on this physical model, energy consumption calculation software for prefabricated cabin buildings (PCES) was developed. PCES supports building parameter setting, energy consumption simulation, and analysis of building thermal processes and energy consumption.
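
    As a minimal illustration of what a thermal model of a cabin module computes, the sketch below evaluates a steady-state envelope heat balance, Q = ΣUA·(Tin − Tout). The U-values and areas are invented; the paper's model is dynamic and considerably more detailed.

      # Sketch of the simplest envelope heat-loss step for a prefabricated cabin.
      ENVELOPE = [
          # element,  U (W/m^2.K),  area (m^2)  -- illustrative values only
          ("wall",    0.45,         54.0),
          ("roof",    0.30,         18.0),
          ("floor",   0.50,         18.0),
          ("window",  1.80,          4.0),
      ]

      def heat_loss_w(t_in_c, t_out_c):
          ua = sum(u * a for _, u, a in ENVELOPE)
          return ua * (t_in_c - t_out_c)

      print(f"{heat_loss_w(20.0, -10.0):.0f} W")  # design heat loss at -10 C outdoors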

  15. SUNREL Energy Simulation Software | Buildings | NREL

    Science.gov Websites

    SUNREL® is an hourly building energy simulation program that aids in the design of small energy-efficient buildings where the loads are…

  16. The Computational Infrastructure for Geodynamics as a Community of Practice

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Kellogg, L. H.

    2016-12-01

    Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.

  17. Building Geographic Information System Capacity in Local Health Departments: Lessons From a North Carolina Project

    PubMed Central

    Miranda, Marie Lynn; Silva, Jennifer M.; Overstreet Galeano, M. Alicia; Brown, Jeffrey P.; Campbell, Douglas S.; Coley, Evelyn; Cowan, Christopher S.; Harvell, Dianne; Lassiter, Jenny; Parks, Jerry L.; Sandelé, Wanda

    2005-01-01

    State government, university, and local health department (LHD) partners collaborated to build the geographic information system (GIS) capacity of 5 LHDs in North Carolina. Project elements included procuring hardware and software, conducting individualized and group training, developing data layers, guiding the project development process, coordinating participation in technical conferences, providing ongoing project consultation, and evaluating project milestones. The project provided health department personnel with the skills and resources required to use sophisticated information management systems, particularly those that address spatial dimensions of public health practice. This capacity-building project helped LHDs incorporate GIS technology into daily operations, resulting in improved time and cost efficiency. Keys to success included (1) methods training rooted in problems specific to the LHD, (2) required project identification by LHD staff with associated timelines for development, (3) ongoing technical support as staff returned to home offices after training, (4) subgrants to LHDs to ease hardware and software resource constraints, (5) networks of relationships among LHDs and other professional GIS users, and (6) senior LHD leadership who supported the professional development activities being undertaken by staff. PMID:16257950

  18. Noninvasive Test Detects Cardiovascular Disease

    NASA Technical Reports Server (NTRS)

    2007-01-01

    At NASA's Jet Propulsion Laboratory (JPL), NASA-developed Video Imaging Communication and Retrieval (VICAR) software laid the groundwork for analyzing images of all kinds. A project seeking to use imaging technology for health care diagnosis began when the imaging team considered using the VICAR software to analyze X-ray images of soft tissue. With marginal success using X-rays, the team applied the same methodology to ultrasound imagery, which was already digitally formatted. The new approach proved successful for assessing amounts of plaque build-up and arterial wall thickness, direct predictors of heart disease, and the result was a noninvasive diagnostic system with the ability to accurately predict heart health. Medical Technologies International Inc. (MTI) further developed and then submitted the technology to a vigorous review process at the FDA, which cleared the software for public use. The software, patented under the name Prowin, is being used in MTI's patented ArterioVision, a carotid intima-media thickness (CIMT) test that uses ultrasound image-capturing and analysis software to noninvasively identify the risk for the major cause of heart attack and strokes: atherosclerosis. ArterioVision provides a direct measurement of atherosclerosis by safely and painlessly measuring the thickness of the first two layers of the carotid artery wall using an ultrasound procedure and advanced image-analysis software. The technology is now in use in all 50 states and in many countries throughout the world.

  19. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction.

    PubMed

    Hill, Jon; Davis, Katie E

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) using a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  20. Build and Execute Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Qiang

    At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.
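
    The repeatable, isolated container execution at the heart of such a workflow system can be hinted at with the Docker SDK for Python. The sketch below is illustrative only; it uses the standard Docker SDK, not BEE's own API, and the image and command are invented examples.

      # Sketch of repeatable task execution in a container, the building block
      # a workflow system composes into larger pipelines.
      import docker

      client = docker.from_env()

      def run_step(image, command):
          """Run one workflow step in a fresh container and return its output."""
          return client.containers.run(image, command, remove=True).decode()

      # The same image digest yields the same software environment on any host.
      print(run_step("python:3.11-slim", ["python", "-c", "import sys; print(sys.version)"]))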

  1. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.

  2. Emile: Software-Realized Scaffolding for Science Learners Programming in Mixed Media

    NASA Astrophysics Data System (ADS)

    Guzdial, Mark Joseph

    Emile is a computer program that facilitates students using programming to create models of kinematics (physics of motion without forces) and executing these models as simulations. Emile facilitates student programming and model-building with software-realized scaffolding (SRS). Emile integrates a range of SRS and provides mechanisms to fade (or diminish) most scaffolding. By fading Emile's SRS, students can adapt support to their individual needs. Programming in Emile involves graphic and text elements (as compared with more traditional text-based programming). For example, students create graphical objects which can be dragged on the screen, and when dropped, fall as if in a gravitational field. Emile supports a simplified, HyperCard-like mixed media programming framework. Scaffolding is defined as support which enables student performance (called the immediate benefit of scaffolding) and which facilitates student learning (called the lasting benefit of scaffolding). Scaffolding provides this support through three methods: Modeling, coaching, and eliciting articulation. For example, Emile has tools to structure the programming task (modeling), menus identify the next step in the programming and model-building process (coaching), and prompts for student plans and predictions (eliciting articulation). Five students used Emile in a summer workshop (45 hours total) focusing on creating kinematics simulations and multimedia demonstrations. Evaluation of Emile's scaffolding addressed use of scaffolding and the expected immediate and lasting benefits. Emile created records of student interactions (log files) which were analyzed to determine how students used Emile's SRS and how they faded that scaffolding. Student projects and articulations about those projects were analyzed to assess success of student's model-building and programming activities. Clinical interviews were conducted before and after the workshop to determine students' conceptualizations of kinematics and programming and how they changed. The results indicate that students were successful at model-building and programming, learned physics and programming, and used and faded Emile's scaffolding over time. These results are from a small sample who were self -selected and highly-motivated. Nonetheless, this study provides a theory and operationalization for SRS, an example of a successful model-building environment, and a description of student use of mixed media programming.

  3. Real-time seismic monitoring needs of a building owner - And the solution: A cooperative effort

    USGS Publications Warehouse

    Celebi, M.; Sanli, A.; Sinclair, M.; Gallant, S.; Radulescu, D.

    2004-01-01

    A recently implemented advanced seismic monitoring system for a 24-story building facilitates recording accelerations and computing displacements and drift ratios in near-real time to measure the earthquake performance of the building. The drift ratio is related to the damage condition of the specific building. This system meets the owner's needs for rapid quantitative input to assessments and decisions on post-earthquake occupancy. The system is now successfully working and, in the absence of strong shaking to date, is producing low-amplitude data in real time for routine analyses and assessment. Studies of such data to date indicate that the configured monitoring system, with its building-specific software, can be a useful tool in the rapid assessment of buildings and other structures following an earthquake. Such systems can be used for health monitoring of a building, for assessing performance-based design and analysis procedures, for long-term assessment of structural characteristics, and for long-term damage detection.
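
    The displacement and drift computation such a system performs in near-real time can be sketched as follows. The naive double integration, synthetic records, and story height below are illustrative simplifications; production systems filter and baseline-correct the acceleration records first.

      # Sketch of an inter-story drift ratio from two floors' acceleration
      # records: double-integrate to displacement, difference the floors,
      # divide by story height. No filtering/baseline correction shown.
      import numpy as np

      def displacement(acc_m_s2, dt):
          vel = np.cumsum(acc_m_s2) * dt   # first integration: velocity
          return np.cumsum(vel) * dt       # second integration: displacement

      def peak_drift_ratio(acc_lower, acc_upper, dt, story_height_m):
          d_lower = displacement(acc_lower, dt)
          d_upper = displacement(acc_upper, dt)
          return np.max(np.abs(d_upper - d_lower)) / story_height_m

      dt = 0.01                                        # 100 Hz sampling
      t = np.arange(0, 10, dt)
      acc_lower = 0.5 * np.sin(2 * np.pi * 1.0 * t)    # synthetic records
      acc_upper = 0.8 * np.sin(2 * np.pi * 1.0 * t + 0.3)
      print(f"peak drift ratio: {peak_drift_ratio(acc_lower, acc_upper, dt, 3.5):.4f}")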

  4. Understanding and Predicting the Process of Software Maintenance Releases

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

    One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: 1) a predictive effort model developed for the FDD's software maintenance release process; 2) measurement-based lessons learned about the maintenance process in the FDD; and 3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.

  5. Requirements Engineering in Building Climate Science Software

    ERIC Educational Resources Information Center

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  6. SUNREL Related Links | Buildings | NREL

    Science.gov Websites

    SUNREL Related Links: the DOE Simulation Software Tools Directory, a directory of 301 building software tools for evaluation of energy efficiency, renewable energy, and sustainability in buildings; and the TREAT Software Program, a computer program that uses SUNREL and is designed to provide

  7. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved; deliveries instead succeed to varying degrees. With the emphasis on building low-cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission-specific versions of the TASS. Very little new software needs to be developed, mainly mission-specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  8. Reuse Adoption Guidebook. Version 02.00.05

    DTIC Science & Technology

    1993-11-01

    Oriented Domain Analysis (FODA) Feasibility Study, W. Novak, and S. Peterson. CMU/SEI-90-TR-21. Pittsburgh, Pennsylvania: Software Engineering Institute, 1990 ... (Mettala and Graham 1992). SEI has developed domain analysis techniques (Kang et al. 1990) and other reuse technology. Additionally, the SEI is in the ... continue to build on your success. Figure 2-1 illustrates the Reuse Adoption process using a Structured Analysis and Design Technique (SADT) diagram

  9. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  10. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction

    PubMed Central

    2014-01-01

    Abstract Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology, and conservation and biodiversity. No easy-to-use, fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses a well-defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree data and metadata into a single file. These data can then be manipulated according to a well-defined but flexible processing pipeline using either the GUI or a command-line tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analysis using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work. PMID:24891820
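    For a concrete flavor of one of the pipeline steps listed above, standardising names can be sketched as a synonym-map substitution over stored trees. This is a hedged illustration assuming Newick-style tree strings and an invented synonym table, not the package's actual code:

    ```python
    import re

    # Hypothetical synonym map: non-standard taxon name -> standard name.
    SYNONYMS = {
        "Felis_silvestris_catus": "Felis_catus",
        "Canis_lupus_familiaris": "Canis_familiaris",
    }

    def standardise_names(newick):
        """Replace taxon labels in a Newick string using SYNONYMS."""
        return re.sub(r"[A-Za-z_][\w.]*",
                      lambda m: SYNONYMS.get(m.group(0), m.group(0)),
                      newick)

    tree = "(Felis_silvestris_catus,(Canis_lupus_familiaris,Mus_musculus));"
    print(standardise_names(tree))
    # (Felis_catus,(Canis_familiaris,Mus_musculus));
    ```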

  11. Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostadin, Damevski

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  12. Modernized build and test infrastructure for control software at ESO: highly flexible building, testing, and automatic quality practices for telescope control software

    NASA Astrophysics Data System (ADS)

    Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.

    2016-07-01

    The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software to replace the preexisting in-house solution. A brief introduction to software quality practices is given, followed by a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adapt the new system are described, along with how they were implemented in the current software and the results obtained. An overview of how the new system may be used in future projects is also presented.

  13. Near Real-Time Event Detection & Prediction Using Intelligent Software Agents

    DTIC Science & Technology

    2006-03-01

    value was 0.06743. Multiple autoregressive integrated moving average (ARIMA) models were then built to see if the raw data, differenced data, or ... slight improvement. The best adjusted r^2 value was found to be 0.1814. Successful results were not expected from linear or ARIMA-based modelling ... appear, 2005. [63] Mora-Lopez, L., Mora, J., Morales-Bueno, R., et al. Modelling time series of climatic parameters with probabilistic finite
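    Although the snippet above is fragmentary, the modelling it describes, fitting ARIMA models to raw and differenced series, is easy to reproduce in outline. A hedged sketch using statsmodels; the data and model orders are placeholders, not the study's:

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Placeholder series standing in for the study's event data.
    series = np.cumsum(np.random.default_rng(0).normal(size=200))

    # Compare a model on the raw data (d=0) with one on first differences (d=1).
    for order in [(1, 0, 1), (1, 1, 1)]:
        result = ARIMA(series, order=order).fit()
        print(order, "AIC:", round(result.aic, 1))
    ```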

  14. Racing to Success: Using Professional 3-D Design Software to Build CO[2]-Powered Cars in Middle School Science. In the Curriculum--Technology/Education/Science

    ERIC Educational Resources Information Center

    Ogle, Thomas

    2004-01-01

    In 1995, the Farmington Board of Education (Michigan) adopted a new student profile it hoped to achieve by 2007, the year students then in kindergarten would graduate from high school. This article describes a course named "2007"--named so because the 10-week, Grade 8 course was designed to address some of the profile's key attributes:…

  15. Virtual building environments (VBE) - Applying information modeling to buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a ''place'' where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

  16. Programming Makes Software; Support Makes Users

    NASA Astrophysics Data System (ADS)

    Batcheller, A. L.

    2010-12-01

    Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project’s objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software’s reputation and showing individuals how the software could meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they found. As software matures and gains widespread use, support work often increases. In fact, such increases can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to “build” the users and practices that can take advantage of it.

  17. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    The ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories in 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build time performance, which was optimised through several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages (see the sketch below). By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on optimisation of CMT commands in general, which made them approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation reduced software build time and environment setup time significantly (by several times), increased the efficiency of multi-core computing resources utilisation, and considerably improved the software developer and user experience.
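    The package-level parallelism described above, building independent packages concurrently as soon as their dependencies finish, can be sketched generically. This illustrates the scheduling idea only, with an invented dependency graph; it is not CMT's implementation:

    ```python
    from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

    # Hypothetical package dependency graph: package -> packages it needs.
    DEPS = {"Core": [], "Event": ["Core"], "Tracking": ["Core"],
            "Reco": ["Event", "Tracking"]}

    def build(pkg):
        print("building", pkg)  # stand-in for the real build command

    def parallel_build(deps, workers=4):
        done, running = set(), {}
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while len(done) < len(deps):
                # Launch every package whose dependencies are all built.
                for pkg, reqs in deps.items():
                    if pkg not in done and pkg not in running and set(reqs) <= done:
                        running[pkg] = pool.submit(build, pkg)
                finished, _ = wait(running.values(), return_when=FIRST_COMPLETED)
                for pkg in [p for p, f in running.items() if f in finished]:
                    done.add(pkg)
                    del running[pkg]

    parallel_build(DEPS)  # Core first, then Event and Tracking in parallel, then Reco
    ```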

  18. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
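    To make the idea concrete, the packager's translation can be imagined as turning a structural description of components into makefile rules, implying stub-generation tools where needed. The specification format and helper below are purely hypothetical; the actual package specification language differs:

    ```python
    # Hypothetical component descriptions; 'rpc_stubs' marks components that
    # need remote procedure call stubs generated before compilation.
    COMPONENTS = {
        "client": {"sources": ["client.c"], "rpc_stubs": True},
        "server": {"sources": ["server.c"], "rpc_stubs": True},
    }

    def emit_makefile(components):
        rules = []
        for name, comp in components.items():
            objs = " ".join(s.replace(".c", ".o") for s in comp["sources"])
            if comp.get("rpc_stubs"):
                # The integration tool is implied automatically, as described above.
                rules.append(f"{name}_stubs.c: {name}.x\n\trpcgen {name}.x")
                objs += f" {name}_stubs.o"
            rules.append(f"{name}: {objs}\n\t$(CC) -o {name} {objs}")
        return "\n\n".join(rules)

    print(emit_makefile(COMPONENTS))
    ```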

  19. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, like TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used with the ESA PSS-05-0 Standard. Our outcomes, in general, may be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  20. Spacecraft fault tolerance: The Magellan experience

    NASA Technical Reports Server (NTRS)

    Kasuda, Rick; Packard, Donna Sexton

    1993-01-01

    Interplanetary and Earth-orbiting missions are now imposing unique fault-tolerance requirements upon spacecraft design. Mission success is the prime motivator for building spacecraft with fault-tolerant systems. The Magellan spacecraft had many such requirements imposed upon its design. Magellan met these requirements by building redundancy into all the major subsystem components and designing the onboard hardware and software with the capability to detect a fault, isolate it to a component, and issue commands to achieve a back-up configuration. This discussion is limited to fault protection, which is the autonomous capability to respond to a fault. The Magellan fault protection design is discussed, as well as the developmental and flight experiences and a summary of the lessons learned.

  1. ETICS: the international software engineering service for the grid

    NASA Astrophysics Data System (ADS)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  2. 3-D Object Recognition from Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Smith, W.; Walker, A. S.; Zhang, B.

    2011-09-01

    The market for real-time 3-D mapping includes not only traditional geospatial applications but also navigation of unmanned autonomous vehicles (UAVs). Massively parallel processes such as graphics processing unit (GPU) computing make real-time 3-D object recognition and mapping achievable. Geospatial technologies such as digital photogrammetry and GIS offer advanced capabilities to produce 2-D and 3-D static maps using UAV data. The goal is to develop real-time UAV navigation through increased automation. It is challenging for a computer to identify a 3-D object such as a car, a tree or a house, yet automatic 3-D object recognition is essential to increasing the productivity of geospatial data such as 3-D city site models. In the past three decades, researchers have used radiometric properties to identify objects in digital imagery with limited success, because these properties vary considerably from image to image. Consequently, our team has developed software that recognizes certain types of 3-D objects within 3-D point clouds. Although our software is developed for modeling, simulation and visualization, it has the potential to be valuable in robotics and UAV applications. The locations and shapes of 3-D objects such as buildings and trees are easily recognizable by a human from a brief glance at a representation of a point cloud such as terrain-shaded relief. The algorithms to extract these objects have been developed and require only the point cloud and minimal human inputs such as a set of limits on building size and a request to turn on a squaring option. The algorithms use both digital surface model (DSM) and digital elevation model (DEM), so software has also been developed to derive the latter from the former. The process continues through the following steps: identify and group 3-D object points into regions; separate buildings and houses from trees; trace region boundaries; regularize and simplify boundary polygons; construct complex roofs. Several case studies have been conducted using a variety of point densities, terrain types and building densities. The results have been encouraging. More work is required for better processing of, for example, forested areas, buildings with sides that are not at right angles or are not straight, and single trees that impinge on buildings. Further work may also be required to ensure that the buildings extracted are of fully cartographic quality. A first version will be included in production software later in 2011. In addition to the standard geospatial applications and the UAV navigation, the results have a further advantage: since LiDAR data tends to be accurately georeferenced, the building models extracted can be used to refine image metadata whenever the same buildings appear in imagery for which the GPS/IMU values are poorer than those for the LiDAR.
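    The opening steps of the pipeline described above, separating above-ground points and grouping them into candidate building regions, can be sketched with a normalized surface model. A hedged numpy/scipy illustration of the idea, with invented thresholds and names, not the production algorithms:

    ```python
    import numpy as np
    from scipy import ndimage

    def building_candidates(dsm, dem, min_height=2.5, min_cells=20):
        """Group above-ground raster cells into candidate object regions.

        dsm, dem: 2-D arrays (digital surface and elevation models).
        min_height: cells this far above ground count as objects (metres).
        min_cells: crude size limit; smaller regions are discarded.
        """
        ndsm = dsm - dem                 # height above ground
        mask = ndsm > min_height         # buildings, trees, and other objects
        labels, n = ndimage.label(mask)  # connected-component grouping
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        keep = {i + 1 for i, s in enumerate(sizes) if s >= min_cells}
        return labels, keep              # region raster and surviving region ids
    ```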

  3. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remain underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices.
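    A DES model of the kind the primer advocates can be very small. Here is a hedged single-scanner queue sketch using only the standard library; the arrival and scan times are invented for illustration:

    ```python
    import random

    def simulate(n_patients=1000, mean_interarrival=12.0, mean_scan=10.0, seed=1):
        """Single-scanner radiology queue; returns mean patient wait (minutes)."""
        random.seed(seed)
        arrival, scanner_free_at, waits = 0.0, 0.0, []
        for _ in range(n_patients):
            arrival += random.expovariate(1.0 / mean_interarrival)  # next arrival event
            start = max(arrival, scanner_free_at)                   # wait if scanner busy
            waits.append(start - arrival)
            scanner_free_at = start + random.expovariate(1.0 / mean_scan)
        return sum(waits) / len(waits)

    print("mean wait (min):", round(simulate(), 1))
    ```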

  4. Development of Automated Procedures to Generate Reference Building Models for ASHRAE Standard 90.1 and India’s Building Energy Code and Implementation in OpenStudio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, Andrew; Haves, Philip; Jegi, Subhash

    This paper describes a software system for automatically generating a reference (baseline) building energy model from the proposed (as-designed) building energy model. This system is built using the OpenStudio Software Development Kit (SDK) and is designed to operate on building energy models in the OpenStudio file format.

  5. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  6. Safety Characteristics in System Application Software for Human Rated Exploration

    NASA Technical Reports Server (NTRS)

    Mango, E. J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development.

  7. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve the problems addressed above, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g., state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
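    The abstract-to-concrete step described above can be illustrated generically: enumerate transition paths of a state machine model, then replay each path to pair inputs with expected outputs. The mode machine below is invented for illustration:

    ```python
    # Hypothetical on-board mode machine: (state, input) -> (next state, output).
    FSM = {
        ("STANDBY", "arm"):     ("ARMED", "ack_arm"),
        ("ARMED", "fire"):      ("ACTIVE", "ack_fire"),
        ("ARMED", "disarm"):    ("STANDBY", "ack_disarm"),
        ("ACTIVE", "shutdown"): ("STANDBY", "ack_shutdown"),
    }

    def abstract_tests(start, depth):
        """All input sequences of up to `depth` transitions from `start`."""
        frontier, tests = [([], start)], []
        for _ in range(depth):
            frontier = [(seq + [inp], nxt)
                        for seq, state in frontier
                        for (s, inp), (nxt, _) in FSM.items() if s == state]
            tests += frontier
        return [seq for seq, _ in tests]

    def concretize(start, inputs):
        """Turn an abstract case into (input, expected output) pairs."""
        state, pairs = start, []
        for inp in inputs:
            state, out = FSM[(state, inp)]
            pairs.append((inp, out))
        return pairs

    for case in abstract_tests("STANDBY", 2):
        print(concretize("STANDBY", case))
    ```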

  8. Integrating and Managing Bim in GIS, Software Review

    NASA Astrophysics Data System (ADS)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have increasingly leveraged these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS packages that provide tools to integrate building information in a geographic context has risen sharply. More and more GIS packages are adding tools for this purpose, and other software projects are regularly extending these tools. However, each package has its own strengths, weaknesses, and intended purpose. This paper provides a thorough review to investigate the capabilities of these packages and clarify their purposes. For this study, Autodesk Revit 2012, i.e. BIM editor software, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and then each package was used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, which formats are supported (import/export), and the way building information is imported.

  9. Transportable Payload Operations Control Center reusable software: Building blocks for quality ground data systems

    NASA Technical Reports Server (NTRS)

    Mahmot, Ron; Koslosky, John T.; Beach, Edward; Schwarz, Barbara

    1994-01-01

    The Mission Operations Division (MOD) at Goddard Space Flight Center builds Mission Operations Centers which are used by Flight Operations Teams to monitor and control satellites. Reducing system life cycle costs through software reuse has always been a priority of the MOD. The MOD's Transportable Payload Operations Control Center (TPOCC) development team established an extensive library of 14 subsystems with over 100,000 delivered source instructions of reusable, generic software components. To date, nine TPOCC-based control centers support 11 satellites and have achieved an average software reuse level of more than 75 percent. This paper shares experiences of how the TPOCC building blocks were developed and how building block developers, mission development teams, and users are all part of the process.

  10. Environmental assessment of low-energy social housing, Boatemah Walk building, Brixton

    NASA Astrophysics Data System (ADS)

    Vargas, Lidia Johansen

    Energy use from buildings represents a considerable share of overall UK energy consumption, and the resulting CO2 emissions are considered the main driver of climate change. There is a global urge for new and existing buildings to be truly effective in reducing their energy consumption. This study evaluates the in-use performance of low-energy design in social housing: Boatemah Walk is a newly built residential block of 18 flats located in Angell Town, Brixton, which benefits from various low-energy features such as a low-embodied-energy building fabric, super insulation, photovoltaic panels integrated in the roof, a rainwater recycling system, and non-toxic building materials and finishes. The new building layout and surrounding landscape positively influence community integration and safety. The evaluation has been done through observation, monitoring, interviews with tenants and the use of TAS software, throughout the year after occupation. The Boatemah Walk building has proved successful in some aspects and less successful in others. It is crucial that a demonstration project like the Boatemah Walk building considers all mechanisms necessary to monitor its efficiency, as this would provide feedback to prove the efficiency and encourage similar investments. However, during the course of the study it was found that a meter for the recycled water and export meters for the photovoltaic production were missing. This proved to be an obstacle to accurate monitoring of the building's performance. The annual heating demand in Boatemah Walk is below the national averages, which confirms the good performance of its building fabric. On hot summer days the lightweight building is, as expected, vulnerable to outside conditions. This is not a frequent occurrence; however, the effects of climate change are very likely to increase the length and temperature of such periods in the future. The tenants' energy-consuming behavior has a definitive impact, as revealed through monitoring and direct interviews. There is a wide difference between tenants in terms of their environmental concern and attitudes, which is reflected in the overall performance of the building. One of the most successful aspects of this development is probably the effect it is having on the community. The tenants are highly satisfied with the building in various aspects, and the ones who used to live in Angell Town before the regeneration have experienced a very positive change in their quality of life and a sense of pride in their community.

  11. Patient care transformation: the plan and the reality.

    PubMed

    Drexler, Diane; Malloch, Kathy

    2006-01-01

    An explosion of new hospital building has created the opportunity for nurse leaders to transform the patient care experience with evidence-based architecture, technology innovations, and new patient care delivery models. The authors share the first-year results of the creation of a hospital of the future in which staff actively participated and addressed the challenges of transforming the patient care experience. Positive results include patient satisfaction at the 99th percentile, successful integration of 63 software applications, and energized nursing staff.

  12. Strategy Guideline. Modeling Enclosure Design in Above-Grade Walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lstiburek, J.; Ueno, K.; Musunuru, S.

    2016-02-01

    The Strategy Guideline, written by the U.S. Department of Energy's research team Building Science Corporation, 1) describes how to model and interpret results of models for above-grade walls, and 2) analyzes the failure thresholds and criteria for above-grade walls. A library of above-grade walls with historically successful performance was used to calibrate WUFI (Wärme und Feuchte instationär) software models. The information is generalized for application to a broad population of houses within the limits of existing experience.

  13. A Team Building Model for Software Engineering Courses Term Projects

    ERIC Educational Resources Information Center

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  14. BigData as a Driver for Capacity Building in Astrophysics

    NASA Astrophysics Data System (ADS)

    Shastri, Prajval

    2015-08-01

    Exciting public interest in astrophysics acquires new significance in the era of Big Data. Since Big Data involves advanced technologies of both software and hardware, astrophysics with Big Data has the potential to inspire young minds with diverse inclinations - i.e., not just those attracted to physics but also those pursuing engineering careers. Digital technologies have become steadily cheaper, which can enable considerable expansion of the Big Data user pool, especially to communities that may not yet be in the astrophysics mainstream but have high potential because of access to these technologies. For success, however, capacity building at the early stages becomes key. The development of on-line pedagogical resources in astrophysics, astrostatistics, data-mining and data visualisation that are designed around the big facilities of the future can be an important effort that drives such capacity building, especially if facilitated by the IAU.

  15. An Ontology for Software Engineering Education

    ERIC Educational Resources Information Center

    Ling, Thong Chee; Jusoh, Yusmadi Yah; Adbullah, Rusli; Alwi, Nor Hayati

    2013-01-01

    Software agents communicate using ontologies. It is important to build an ontology for a specific domain such as Software Engineering Education. Building an ontology from scratch is not only hard but also incurs much time and cost. This study aims to propose an ontology through adaptation of an existing ontology which was originally built based on a…

  16. Dave Roberts | NREL

    Science.gov Websites

    Engineer in Colorado. He has expertise in building science, building energy simulation, and software development. He has managed simulation and software development projects, and served as product manager for the REM/Rate(tm) home energy

  17. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and reproducibility for scientists working in the hydrological sciences. The HashDist documentation is available from: http://hashdist.readthedocs.org/en/latest/ HashDist is currently hosted at: https://github.com/hashdist/hashdist
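    The reproducibility mechanism mentioned above, a hash that pins down exactly how each component was built, can be sketched as hashing a canonicalized build specification. This illustrates the idea only; it is not HashDist's actual hashing scheme, and the specification fields are placeholders:

    ```python
    import hashlib
    import json

    def build_hash(spec):
        """Short digest of a canonicalized build specification.

        Any change to sources, dependencies, or build parameters changes
        the hash, so built artifacts can be cached and reused safely.
        """
        canonical = json.dumps(spec, sort_keys=True, separators=(",", ":"))
        return hashlib.sha256(canonical.encode()).hexdigest()[:12]

    spec = {"name": "hdf5", "version": "1.8.13",
            "sources": ["tar.gz:deadbeef"],        # placeholder source digest
            "dependencies": ["zlib/4n3afk2"],      # placeholder dependency hashes
            "build": {"flags": ["--enable-shared"]}}
    print(build_hash(spec))
    ```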

  18. Modelface: an Application Programming Interface (API) for Homology Modeling Studies Using Modeller Software

    PubMed Central

    Sakhteman, Amirhossein; Zare, Bijan

    2016-01-01

    An interactive application, Modelface, is presented for the Modeller software, based on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building, and loop refinement. Other modules of Modeller, including energy calculation, energy minimization, and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with a small memory footprint and is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
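    For context, the core Modeller run that Modelface automates looks roughly like the classic automodel recipe below in Modeller's own Python API; the file and code names are placeholders, and the current Modeller documentation should be consulted for exact usage:

    ```python
    # Sketch of a basic comparative-modeling run of the kind Modelface drives.
    # Requires the Modeller package and license; names here are placeholders.
    from modeller import environ
    from modeller.automodel import automodel

    env = environ()
    model = automodel(env,
                      alnfile="target_template.ali",  # alignment (e.g., from Clustal)
                      knowns="template_pdb",          # template structure code
                      sequence="target_seq")          # target sequence code
    model.starting_model = 1
    model.ending_model = 5                            # build five candidate models
    model.make()                                      # run model building
    ```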

  19. Built To Last: Using Iterative Development Models for Sustainable Scientific Software Development

    NASA Astrophysics Data System (ADS)

    Jasiak, M. E.; Truslove, I.; Savoie, M.

    2013-12-01

    In scientific research, software development exists fundamentally for the results it creates. The core research must remain the focus. It seems natural to researchers, driven by grant deadlines, that every dollar invested in software development should be used to push the boundaries of problem solving. This system of values is frequently misaligned with building software in a sustainable fashion; short-term optimizations create longer-term sustainability issues. The National Snow and Ice Data Center (NSIDC) has taken bold cultural steps in using agile and lean development and management methodologies to help its researchers meet critical deadlines, while building in the necessary support structure for the code to live far beyond its original milestones. Agile and lean software development methodologies including Scrum, Kanban, Continuous Delivery and Test-Driven Development have seen widespread adoption within NSIDC. This focus on development methods is combined with an emphasis on explaining to researchers why these methods produce more desirable results for everyone, as well as promoting developers interacting with researchers. This presentation will describe NSIDC's current scientific software development model, how this addresses the short-term versus sustainability dichotomy, the lessons learned and successes realized by transitioning to this agile and lean-influenced model, and the current challenges faced by the organization.

  20. NASA's Earth Imagery Service as Open Source Software

    NASA Astrophysics Data System (ADS)

    De Cesare, C.; Alarcon, C.; Huang, T.; Roberts, J. T.; Rodriguez, J.; Cechini, M. F.; Boller, R. A.; Baynes, K.

    2016-12-01

    The NASA Global Imagery Browse Service (GIBS) is a software system that provides access to an archive of historical and near-real-time Earth imagery from NASA-supported satellite instruments. The imagery itself is open data, and is accessible via standards such as the Open Geospatial Consortium (OGC)'s Web Map Tile Service (WMTS) protocol. GIBS includes three core software projects: The Imagery Exchange (TIE), OnEarth, and the Meta Raster Format (MRF) project. These projects are developed using a variety of open source software, including: Apache HTTPD, GDAL, Mapserver, Grails, Zookeeper, Eclipse, Maven, git, and Apache Commons. TIE has recently been released for open source, and is now available on GitHub. OnEarth, MRF, and their sub-projects have been on GitHub since 2014, and the MRF project in particular receives many external contributions from the community. Our software has been successful beyond the scope of GIBS: the PO.DAAC State of the Ocean and COVERAGE visualization projects reuse components from OnEarth. The MRF source code has recently been incorporated into GDAL, which is a core library in many widely-used GIS software such as QGIS and GeoServer. This presentation will describe the challenges faced in incorporating open software and open data into GIBS, and also showcase GIBS as a platform on which scientists and the general public can build their own applications.
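    Since the imagery is served through standard protocols, retrieving it takes little code. Below is a hedged sketch of the commonly documented GIBS WMTS REST tile pattern; the layer name, date, tile matrix set, and tile indices are placeholders, so check the GIBS documentation for current endpoints:

    ```python
    from urllib.request import urlretrieve

    TEMPLATE = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
                "{layer}/default/{date}/{matrix_set}/{zoom}/{row}/{col}.jpg")

    url = TEMPLATE.format(layer="MODIS_Terra_CorrectedReflectance_TrueColor",
                          date="2016-08-01", matrix_set="250m",
                          zoom=2, row=1, col=2)
    urlretrieve(url, "tile.jpg")  # saves one imagery tile locally
    ```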

  1. Using C to build a satellite scheduling expert system: Examples from the Explorer Platform planning system

    NASA Technical Reports Server (NTRS)

    Mclean, David R.; Tuchman, Alan; Potter, William J.

    1991-01-01

    A C-based artificial intelligence (AI) development effort which is based on a software tools approach is discussed, with emphasis on reusability and maintainability of code. The discussion starts with simple examples of how list processing can easily be implemented in C and then proceeds to implementations of frames and objects which use dynamic memory allocation. The implementations of procedures which use depth-first search, constraint propagation, context switching, and a blackboard-like simulation environment are described. Techniques for managing the complexity of C-based AI software are noted, especially the object-oriented techniques of data encapsulation and incremental development. Finally, all these concepts are put together by describing the components of planning software called the Planning And Resource Reasoning (PARR) Shell. This shell has been successfully utilized to schedule services of the Tracking and Data Relay Satellite System for the Earth Radiation Budget Satellite since May 1987 and will be used for operations scheduling of the Explorer Platform in November 1991.

  2. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  3. Re-Writing the Construction History of Boughton House (northamptonshire, Uk) with the Help of DOCU-TOOLS®

    NASA Astrophysics Data System (ADS)

    Schuster, J. C.

    2017-08-01

    The tablet-based software docu-tools digitizes the documentation of buildings and simplifies construction and facility management as well as data analysis in building and construction-history research. As plan-based software, it allows 'pins' to be set to record data (images, audio, text, etc.), each data point containing a time and date stamp. Once a pin is set and information recorded, it can never be deleted from the system, creating clear, contention-free documentation. Reports on any or all recorded data can be generated immediately through various templates in order to share, document, analyze and archive the information gathered. The software both digitizes building condition assessment and simplifies the fully documented management and solving of problems and the monitoring of a building. Used both in the construction industry and for documenting and analyzing historic buildings, docu-tools is a versatile and flexible tool that has become integral to my work as a building historian working on the conservation and curating of the historic built environment in Europe. I used the software at Boughton House, Northamptonshire, UK, during a one-year research project into the construction history of the building. The details of how docu-tools was used during this project are discussed in this paper.

  4. Reconfigurable Software for Mission Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2014-01-01

    We developed software that provides flexibility to mission organizations through modularity and composability. Modularity enables removal and addition of functionality through the installation of plug-ins. Composability enables users to assemble software from pre-built reusable objects, thus reducing or eliminating the walls associated with traditional application architectures and enabling unique combinations of functionality. We have used composable objects to reduce display build time, create workflows, and build scenarios to test concepts for lunar roving operations. The software is open source and may be downloaded from https://github.com/nasa/mct.

  5. Software Solution Builds Project Consensus.

    ERIC Educational Resources Information Center

    Graue, David

    2003-01-01

    Describes the use of Autodesk Revit, a computer software system for design and documentation of buildings, in the planning of the University Center of Chicago, a large residence hall involving the cooperation of DePaul University, Columbia College, and Roosevelt University. (EV)

  6. LCG/AA build infrastructure

    NASA Astrophysics Data System (ADS)

    Hodgkins, Alex Liam; Diez, Victor; Hegner, Benedikt

    2012-12-01

    The Software Process & Infrastructure (SPI) project provides a build infrastructure for regular integration testing and release of the LCG Applications Area software stack. In the past, regular builds have been provided using a system which has been constantly growing to include more features like server-client communication, long-term build history and a summary web interface using present-day web technologies. However, the ad-hoc style of software development resulted in a setup that is hard to monitor, inflexible and difficult to expand. The new version of the infrastructure is based on the Django Python framework, which allows for a structured and modular design, facilitating later additions. Transparency in the workflows and ease of monitoring has been one of the priorities in the design. Formerly missing functionality like on-demand builds or release triggering will support the transition to a more agile development process.

  7. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
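    A model of the form described, multivariate regression from design measures to error density, can be sketched with statsmodels. The metrics and data below are invented placeholders, not the study's Ada measurements:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 21                         # one row per subsystem, matching the study's sample size
    X = rng.uniform(size=(n, 3))   # e.g., interconnectivity, visibility, reuse fraction
    error_density = 2.0 + X @ np.array([3.0, 1.5, -2.0]) + rng.normal(scale=0.5, size=n)

    model = sm.OLS(error_density, sm.add_constant(X)).fit()
    print(round(model.rsquared, 2))  # share of variation explained
    print(model.params)              # fitted coefficients
    ```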

  8. Safety Characteristics in System Application of Software for Human Rated Exploration Missions for the 8th IAASS Conference

    NASA Technical Reports Server (NTRS)

    Mango, Edward J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development. The GFAS system integrates the flight software packages of the Orion and SLS with the ground systems and launch countdown sequencers through the 'agile' software development process. A unique approach is needed to develop the GFAS project capabilities within this agile process. NASA has defined the software development process through a set of standards. The standards were written during the infancy of the so-called industry 'agile development' movement and must be tailored to adapt to the highly integrated environment of human exploration systems. Safety of the space systems and the eventual crew on board is paramount during the preparation of the exploration flight systems. A series of software safety characteristics have been incorporated into the development and certification efforts to ensure readiness for use and compatibility with the space systems. Three underlying factors in the exploration architecture require the GFAS system to be unique in its approach to ensure safety for the space systems, both the flight and the ground systems. The first is the missions themselves, which are exploratory in nature and go far beyond the comfort of low-Earth-orbit operations. The second is that the current exploration system will launch only one mission per year, and even fewer during its developmental phases. Finally, the third is the partnered approach through the use of many different prime contractors, including commercial and international partners, to design and build the exploration systems. These three factors make meeting the mission preparation and safety expectations extremely challenging. As NASA leads a team of partners in the exploration beyond Earth's influence, it is a safety imperative that the application software used to test, check out, prepare and launch the exploration systems put safety of the hardware and mission first. Software safety characteristics are built into the design and development process to enable the human-rated systems to begin their missions safely and successfully. Exploration missions beyond Earth are inherently risky; however, with solid safety approaches in both hardware and software, the boldness of these missions can be realized for all on the home planet.

  9. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    ERIC Educational Resources Information Center

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-01-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies can represent a very rich conceptual model of a given domain in the computer. This model can later be used for a number of purposes in different software applications. In…
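    To make the idea of a machine-usable conceptual model concrete, here is a toy sketch using the rdflib Python library; the control-engineering concept names and namespace are hypothetical and are not taken from the ontology described in the paper.

        # Toy ontology fragment for control engineering: classes, a
        # specialization hierarchy, and a simple query over the model.
        from rdflib import Graph, Namespace, Literal, RDF, RDFS

        CTRL = Namespace("http://example.org/control#")  # hypothetical namespace
        g = Graph()

        g.add((CTRL.Controller, RDF.type, RDFS.Class))
        g.add((CTRL.PIDController, RDFS.subClassOf, CTRL.Controller))
        g.add((CTRL.PIDController, RDFS.comment,
               Literal("Proportional-integral-derivative controller")))

        # Query the model: which classes specialize Controller?
        for cls in g.subjects(RDFS.subClassOf, CTRL.Controller):
            print(cls)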

  10. ImageParser: a tool for finite element generation from three-dimensional medical images

    PubMed Central

    Yin, HM; Sun, LZ; Wang, G; Yamada, T; Wang, J; Vannier, MW

    2004-01-01

    Background The finite element method (FEM) is a powerful mathematical tool for simulating and visualizing the mechanical deformation of tissues and organs during medical examinations or interventions. It remains a challenge to build an FEM mesh directly from a volumetric image, partly because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the image context, including neighboring tissues and organs, completes segmentation of the different tissues, and meshes the organ into elements. Results ImageParser is shown to build an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with user-defined segmentation information. PMID:15461787
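    As a sketch of the general image-to-mesh idea (not ImageParser's actual algorithm), the following Python code thresholds a 3-D volume, cleans the segmentation, and emits one 8-node hexahedral element per foreground voxel, with shared corner nodes deduplicated.

        # Build a simple voxel-based hexahedral FEM mesh from a segmented
        # 3-D image. The volume here is random stand-in data.
        import numpy as np
        from scipy import ndimage

        img = np.random.rand(8, 8, 8)          # stand-in for a CT volume
        mask = img > 0.7                        # crude threshold segmentation
        mask = ndimage.binary_opening(mask)     # remove isolated speckle

        node_id = {}
        elements = []
        for (i, j, k) in zip(*np.nonzero(mask)):
            corner_ids = []
            for di, dj, dk in [(0,0,0), (1,0,0), (1,1,0), (0,1,0),
                               (0,0,1), (1,0,1), (1,1,1), (0,1,1)]:
                key = (i + di, j + dj, k + dk)
                if key not in node_id:          # deduplicate shared corners
                    node_id[key] = len(node_id)
                corner_ids.append(node_id[key])
            elements.append(corner_ids)         # 8-node brick connectivity

        print(len(node_id), "nodes,", len(elements), "hex elements")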

  11. Feasibility of Close-Range Photogrammetric Models for Geographic Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Luke; /Rice U.

    2011-06-22

    The objective of this project was to determine the feasibility of using close-range architectural photogrammetry as an alternative three-dimensional modeling technique in order to place the digital models in a geographic information system (GIS) at SLAC. With the available equipment and Australis photogrammetry software, the creation of full and accurate models of an example building, Building 281 on the SLAC campus, was attempted. After conducting several equipment tests to determine the achievable precision, a complete photogrammetric survey was carried out. The dimensions of the resulting models were then compared against the true dimensions of the building. A complete building model could not be obtained with the current equipment and software. This failure was likely attributable to the limits of the software rather than the precision of the physical equipment. However, partial models of the building were shown to be accurate and determined to still be usable in a GIS. With further development of the photogrammetric software and survey procedure, the desired generation of a complete three-dimensional model is likely still feasible.

  12. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; and limitations and potential future work. The goals of NREL Analysis Accuracy R&D are to: (1) provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. The BESTEST-EX goals are to: (1) test software predictions of retrofit energy savings in existing homes; (2) ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software has been subjected to validation testing, including comparisons with empirical data.

  13. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  14. NASA/CARES dual-use ceramic technology spinoff applications

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1994-01-01

    NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st Century applications. Designing ceramic components to survive at higher temperatures than the capability of most metals and in severe loading environments involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics requires knowledge of material properties and the use of a probabilistic brittle material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/LIFE, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/M, MARC), resulting in an advanced integrated design tool which is adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.
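    A probabilistic brittle-material methodology of this kind typically rests on Weibull statistics. As a hedged illustration, the standard two-parameter volume-flaw form (a textbook expression, not necessarily the exact CARES formulation) gives the failure probability of a component under a stress field $\sigma(\mathbf{x})$ as

        $$P_f = 1 - \exp\!\left[-\int_V \left(\frac{\sigma(\mathbf{x})}{\sigma_0}\right)^{m} \, dV\right],$$

    where $m$ is the Weibull modulus and $\sigma_0$ is the scale parameter; tools of this class evaluate such integrals over finite element stress results.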

  15. Developing Engineering and Science Process Skills Using Design Software in an Elementary Education

    NASA Astrophysics Data System (ADS)

    Fusco, Christopher

    This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform constructions of model bridges, develop science process skills through problem based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with increasing their understanding of and appreciation for scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.

  16. Software Carpentry and the Hydrological Sciences

    NASA Astrophysics Data System (ADS)

    Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.

    2013-12-01

    Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: 1. Write programs for people, not computers. 2. Automate repetitive tasks. 3. Use the computer to record history. 4. Make incremental changes. 5. Use version control. 6. Don't repeat yourself (or others). 7. Plan for mistakes. 8. Optimize software only after it works. 9. Document design and purpose, not mechanics. 10. Collaborate. We discuss how these best practices, arising from solid foundations in research and experience, have been shown to help improve scientists' productivity and the reliability of their software.

  17. [Extraction of buildings three-dimensional information from high-resolution satellite imagery based on Barista software].

    PubMed

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi

    2010-05-01

    The demand for accurate and up-to-date spatial information on urban buildings is becoming more and more important for urban planning, environmental protection, and other applications. Today's commercial high-resolution satellite imagery offers the potential to extract three-dimensional information about urban buildings. This paper extracted the three-dimensional information of urban buildings from QuickBird imagery, and validated the precision of the extraction, based on Barista software. It was shown that the extraction of three-dimensional building information from high-resolution satellite imagery based on Barista software had the advantages of requiring little specialist expertise, broad applicability, simple operation, and high precision. One-pixel-level point positioning and height determination accuracy could be achieved if the digital elevation model (DEM) and the sensor orientation model had higher precision and the off-nadir view angle was favorable.
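    For intuition, monoscopic building-height estimation from a single high-resolution image commonly exploits relief displacement. A standard relation (illustrating the principle, not necessarily Barista's internal model) is

        $$h = \frac{d}{\tan\theta},$$

    where $d$ is the measured radial displacement between a building's rooftop and its footprint in the image and $\theta$ is the off-nadir view angle of the sensor; height accuracy thus depends directly on the view geometry and on the precision of the sensor orientation model.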

  18. BIM cost analysis of transport infrastructure projects

    NASA Astrophysics Data System (ADS)

    Volkov, Andrey; Chelyshkov, Pavel; Grossman, Y.; Khromenkova, A.

    2017-10-01

    The article describes a method for analyzing the energy costs of transport infrastructure objects using BIM software. The paper considers several options for the orientation of a building, using the SketchUp and IES VE software programs. These options allow choosing the best orientation of the building facades. Particular attention is given to the distribution of the temperature field in a cross-section of the wall, according to calculations made in the ELCUT software. Issues related to the calculation of solar radiation penetration into a building and the selection of translucent structures are also considered in the paper. The article presents data on building codes relating to the transport sector, on the basis of which the calculations were made. The authors emphasize that BIM programs should be implemented and used to optimize the thermal behavior of a building and increase its energy efficiency using climatic data.
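    The orientation comparison described above boils down to evaluating incident solar radiation on each facade. Below is a minimal Python sketch (not the SketchUp/IES VE workflow itself) comparing direct-beam irradiance on vertical walls of different orientations for one hypothetical sun position.

        # Direct-beam irradiance on a vertical facade. For a vertical
        # surface, cos(incidence) = cos(sun elevation) * cos(azimuth
        # difference between the sun and the wall normal).
        import math

        def facade_direct_irradiance(dni, sun_elev_deg, sun_azim_deg, wall_azim_deg):
            """Direct-beam irradiance (W/m^2) on a vertical wall."""
            elev = math.radians(sun_elev_deg)
            dazi = math.radians(sun_azim_deg - wall_azim_deg)
            cos_inc = math.cos(elev) * math.cos(dazi)
            return dni * max(0.0, cos_inc)      # wall is self-shaded if cos_inc < 0

        # Hypothetical noon sun: 35 deg elevation, due south (azimuth 180),
        # direct normal irradiance 800 W/m^2.
        for wall, azim in [("south", 180), ("east", 90), ("west", 270), ("north", 0)]:
            print(wall, round(facade_direct_irradiance(800, 35, 180, azim), 1), "W/m^2")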

  19. Improving Data Catalogs with Free and Open Source Software

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; Hankin, S.; O'Brien, K.

    2013-12-01

    The Global Earth Observation Integrated Data Environment (GEO-IDE) is NOAA's effort to successfully integrate data and information with partners in the national US-Global Earth Observation System (US-GEO) and the international Global Earth Observation System of Systems (GEOSS). As part of the GEO-IDE, the Unified Access Framework (UAF) is working to build momentum towards the goal of increased data integration and interoperability. The UAF project is moving towards this goal with an approach that includes leveraging well known and widely used standards, as well as free and open source software. The UAF project shares the widely held conviction that the use of data standards is a key ingredient necessary to achieve interoperability. Many community-based consensus standards fail, though, due to poor compliance. Compliance problems emerge for many reasons: because the standards evolve through versions, because documentation is ambiguous, or because individual data providers find the standard inadequate as-is to meet their special needs. In addition, minimalist use of standards will lead to a compliant service, but one of low quality. In this presentation, we will discuss the UAF effort to build a catalog cleaning tool which is designed to crawl THREDDS catalogs, analyze the data available, and then build a 'clean' catalog of data which is standards compliant and has a uniform set of data access services available. These data services include, among others, OPeNDAP, Web Coverage Service (WCS) and Web Mapping Service (WMS). We will also discuss how we are utilizing free and open source software and services to crawl and analyze catalogs and to build the clean data catalog, as well as our efforts to help data providers improve their own catalogs. We'll discuss the use of open source software such as DataNucleus, Thematic Realtime Environmental Distributed Data Services (THREDDS), ncISO and the netCDF Java Common Data Model (CDM). We'll also demonstrate how we are using free services such as Google Charts to create an easily identifiable visual metaphor which describes the quality of data catalogs. Using this rubric, in conjunction with the ncISO metadata quality rubric, will allow data providers to identify non-compliance issues in their data catalogs, thereby improving data availability to their users and to data discovery systems.
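    A minimal sketch of the crawling step is shown below: it walks one THREDDS catalog document and lists each leaf dataset together with the services the catalog advertises. The catalog URL is hypothetical; the XML namespace is the standard THREDDS InvCatalog 1.0 namespace. A real crawler would also follow catalogRef links and score each dataset's services against the compliance checks described above.

        # Crawl a THREDDS catalog and list datasets plus advertised services.
        import urllib.request
        import xml.etree.ElementTree as ET

        NS = "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"

        def crawl(url):
            root = ET.parse(urllib.request.urlopen(url)).getroot()
            services = [s.get("serviceType") for s in root.iter(f"{{{NS}}}service")]
            for ds in root.iter(f"{{{NS}}}dataset"):
                if ds.get("urlPath"):           # leaf dataset with an access path
                    print(ds.get("name"), "->", services)

        crawl("https://example.org/thredds/catalog.xml")  # hypothetical endpoint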

  20. Requirements: Towards an understanding on why software projects fail

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.

    2016-08-01

    Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky, and can even be life threatening. Projects that undermine requirements engineering suffer, or are likely to suffer, from failures, challenges and other attendant risks. The estimated cost of project failures and overruns is enormous. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment. They affect the company's image, goodwill, and revenue drive, and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering was discussed and its role in software project success was elaborated. The place of the software requirements process in relation to software project failure was explored and examined. Project success and failure factors were also discussed, with emphasis placed on requirements factors, as they play a major role in software projects' challenges, successes and failures. The paper relied on secondary data and empirical statistics to explore and examine the factors responsible for the successes, challenges and failures of software projects in large, medium and small-scale software companies.

  1. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
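    As one concrete illustration of a quantitative fitness measure (a standard example from the literature, not necessarily one of the measures developed in QUAREM), a mean-payoff value assigns to an infinite run $\rho$ with per-step rewards $w(\rho_i)$ the long-run average

        $$\mathrm{val}(\rho) = \liminf_{n \to \infty} \frac{1}{n} \sum_{i=0}^{n-1} w(\rho_i),$$

    so that two programs that are both boolean-correct can still be ranked by how well they perform over time.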

  2. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing and critiquing of knowledge models. Scientists collaborating from remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do this at a special videoconferencing facility.

  3. Ship electric propulsion simulator based on networking technology

    NASA Astrophysics Data System (ADS)

    Zheng, Huayao; Huang, Xuewu; Chen, Jutao; Lu, Binquan

    2006-11-01

    In response to recent trends in shipbuilding, a novel electric propulsion simulator (EPS) has been developed at the Marine Simulation Center of SMU. The architecture, software functions, and FCS network technology of the EPS and the integrated power system (IPS) are described. For the ship's POD propeller, a dedicated physical model was built. The POD power is supplied from the simulated 6.6 kV medium-voltage main switchboard, and its control can be exercised in local or remote mode. Through the LAN, simulated state information from the EPS is passed to the physical POD model, which reflects the actual thruster working status under different sea conditions. The software includes a vessel-propeller mathematical module, a thruster control system, integrated power distribution and emergency management, a double closed-loop control system, vessel still-water resistance and dynamics modules, and the instructor's main control software. The monitoring and control system is realized with a real-time data collection system and CAN bus technology. During construction, most devices, such as monitor panels and intelligent meters, were developed in the lab based on embedded microcomputer systems with CAN interfaces linking them to the network. They have also been used successfully in practice and should be suitable for the future demands of ship digitalization.

  4. ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it

    NASA Astrophysics Data System (ADS)

    Lecocq, Thomas; Megies, Tobias; Krischer, Lion; Sales de Andrade, Elliott; Barsch, Robert; Beyreuther, Moritz

    2016-04-01

    ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides:
    * read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface,
    * a comprehensive signal processing toolbox tuned to the needs of seismologists,
    * integrated access to all large data centers, web services and databases, and
    * convenient wrappers to third party codes like libmseed and evalresp.
    Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code often has to be translated to stable and production-ready environments. It furthermore offers many freely available high quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community, with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions, to name a few examples. Additionally it sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.
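    A minimal example of the unified interface, using ObsPy's FDSN client to fetch an hour of broadband data, filter it, and plot it; the network, station, and time window are illustrative choices.

        # Fetch, process, and plot waveform data with ObsPy.
        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("IRIS")
        t0 = UTCDateTime("2015-01-01T00:00:00")
        st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)

        st.detrend("linear")
        st.filter("bandpass", freqmin=0.1, freqmax=1.0)   # Hz
        print(st)
        st.plot()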

  5. ObsPy: A Python toolbox for seismology - Current state, applications, and ecosystem around it

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; Beyreuther, M.

    2015-12-01

    ObsPy (http://www.obspy.org) is a community-driven, open-source project offering a bridge for seismology into the scientific Python ecosystem. It provides read and write support for essentially all commonly used waveform, station, and event metadata formats with a unified interface, a comprehensive signal processing toolbox tuned to the needs of seismologists, integrated access to all large data centers, web services and databases, and convenient wrappers to third party codes like libmseed and evalresp. Python, in contrast to many other languages and tools, is simple enough to enable an exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology where research code often has to be translated to stable and production ready environments. It furthermore offers many freely available high quality scientific modules covering most needs in developing scientific software. ObsPy has been in constant development for more than 5 years and nowadays enjoys a large rate of adoption in the community with thousands of users. Successful applications include time-dependent and rotational seismology, big data processing, event relocations, and synthetic studies about attenuation kernels and full-waveform inversions to name a few examples. Additionally it sparked the development of several more specialized packages slowly building a modern seismological ecosystem around it. This contribution will give a short introduction and overview of ObsPy and highlight a number of use cases and software built around it. We will furthermore discuss the issue of sustainability of scientific software.

  6. Rapid Prototyping Integrated With Nondestructive Evaluation and Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.

    2001-01-01

    Most reverse engineering approaches involve imaging or digitizing an object then creating a computerized reconstruction that can be integrated, in three dimensions, into a particular design environment. Rapid prototyping (RP) refers to the practical ability to build high-quality physical prototypes directly from computer aided design (CAD) files. Using rapid prototyping, full-scale models or patterns can be built using a variety of materials in a fraction of the time required by more traditional prototyping techniques (refs. 1 and 2). Many software packages have been developed and are being designed to tackle the reverse engineering and rapid prototyping issues just mentioned. For example, image processing and three-dimensional reconstruction visualization software such as Velocity2 (ref. 3) are being used to carry out the construction process of three-dimensional volume models and the subsequent generation of a stereolithography file that is suitable for CAD applications. Producing three-dimensional models of objects from computed tomography (CT) scans is becoming a valuable nondestructive evaluation methodology (ref. 4). Real components can be rendered and subjected to temperature and stress tests using structural engineering software codes. For this to be achieved, accurate high-resolution images have to be obtained via CT scans and then processed, converted into a traditional file format, and translated into finite element models. Prototyping a three-dimensional volume of a composite structure by reading in a series of two-dimensional images generated via CT and by using and integrating commercial software (e.g. Velocity2, MSC/PATRAN (ref. 5), and Hypermesh (ref. 6)) is being applied successfully at the NASA Glenn Research Center. The building process from structural modeling to the analysis level is outlined in reference 7. Subsequently, a stress analysis of a composite cooling panel under combined thermomechanical loading conditions was performed to validate this process.

  7. Software Reuse Success Strategy Model: An Empirical Study of Factors Involved in the Success of Software Reuse in Information System Development

    ERIC Educational Resources Information Center

    Tran, Kiet T.

    2012-01-01

    This study examined the relationship between information technology (IT) governance and software reuse success. Software reuse has been mostly an IT problem but rarely a business one. Studies in software reuse are abundant; however, to date, none has a deep appreciation of IT governance. This study demonstrated that IT governance had a positive…

  8. [Dynamic changes of urban architecture landscape based on Barista: a case study in Tiexi District of Shenyang City].

    PubMed

    Zhang, Pei-feng; Hu, Yuan-man; He, Hong-shi; Xiong, Zai-ping; Liu, Miao

    2010-12-01

    In this paper, three-dimensional building information was extracted from a high-resolution satellite image based on Barista software. Combined with ArcGIS software, the dynamic changes of the building landscape in Tiexi District of Shenyang City during the urban renewal process were analyzed in terms of conversion contribution rate, building density, average building height, and built-up area rate. It was found that during this urban renewal process, four dominant landscape types (vacant lot, residential building, industrial building, and road) accounted for the main part of the landscape changes. The areas of vacant lot, residential building, commercial building, and road increased, while that of industrial building decreased. The building density decreased, while the average building height increased. There was an obvious regional variation in the building landscape. The building density in the industrial district was higher than that in the residential district, while the average building height showed the opposite pattern. The further from the city center, the lower the building density and average building height.

  9. Computational Infrastructure for Geodynamics (CIG)

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.

    2004-12-01

    Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.

  10. From Bridges and Rockets, Lessons for Software Systems

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    2004-01-01

    Although differences exist between building software systems and building physical structures such as bridges and rockets, enough similarities exist that software engineers can learn lessons from failures in traditional engineering disciplines. This paper draws lessons from two well-known failures, the collapse of the Tacoma Narrows Bridge in 1940 and the destruction of the space shuttle Challenger in 1986, and applies these lessons to software system development. The following specific applications are made: (1) the verification and validation of a software system should not be based on a single method, or a single style of methods; (2) the tendency to embrace the latest fad should be overcome; and (3) the introduction of software control into safety-critical systems should be done cautiously.

  11. Building Energy Management Open Source Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, Saifur

    Funded by the U.S. Department of Energy in November 2013, the Building Energy Management Open Source Software (BEMOSS) platform was engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. According to the Energy Information Administration (EIA), small- (5,000 square feet or smaller) and medium-sized (between 5,001 and 50,000 square feet) commercial buildings constitute about 95% of all commercial buildings in the U.S. These buildings typically do not have Building Automation Systems (BAS) to monitor and control building operation. While commercial BAS solutions exist, including those from Siemens, Honeywell, Johnson Controls and many more, they are not cost-effective in the context of small- and medium-sized commercial buildings, and typically work only with specific controller products from the same company. BEMOSS targets small- and medium-sized commercial buildings to address this gap.

  12. Software Management for the NOνA Experiment

    NASA Astrophysics Data System (ADS)

    Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.

    2015-12-01

    The NOvA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework, which uses the ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on 3 continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOvA software via the CernVM File System distributed file system. We also describe our recent work to use a CMake build system and Jenkins, the open source continuous integration system, for NOνASoft.

  13. Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polly, B.

    2011-09-01

    This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.

  14. Virtual Design Studio (VDS): A Software Platform for Integrated Design of Green Building Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Zhang, Jianshun; Pelken, Michael

    Executive Summary The objective of this study was to develop a “Virtual Design Studio (VDS)”: a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and a high level of sustainability. The VDS is intended to assist collaborating architects, engineers and project management team members from the early phases through the detailed building design stages. It can be used to plan design tasks and workflow, and to evaluate the potential impacts of various green building strategies on building performance, using state-of-the-art simulation tools as well as industrial/professional standards and guidelines for green building system design. Engaged in the development of the VDS was a multi-disciplinary research team that included architects, engineers, and software developers. Based on a review and analysis of how existing professional practices in building systems design operate, particularly those used in the U.S., Germany and the UK, a generic process for performance-based building design, construction and operation was proposed. It divides the whole process into five distinct stages: Assess, Define, Design, Apply, and Monitoring (ADDAM). The current VDS is focused on the first three stages. The VDS treats building design as a multi-dimensional process, involving multiple design teams, design factors, and design stages. The intersection among these three dimensions defines a specific design task in terms of “who”, “what” and “when”. It also treats building design as a multi-objective process that aims to enhance five aspects of performance for green building systems: site sustainability, materials and resource efficiency, water utilization efficiency, energy efficiency and impacts on the atmospheric environment, and IEQ. The current VDS development has been limited to energy efficiency and IEQ performance, with particular focus on evaluating thermal performance, air quality and lighting environmental quality because of their strong interaction with the energy performance of buildings. The VDS software framework contains four major functions:
    1) Design coordination: It enables users to define tasks using the Input-Process-Output flow approach, which specifies the anticipated activities (i.e., the process), required input and output information, and anticipated interactions with other tasks. It also allows task scheduling to define the workflow, and sharing of design data and information via the Internet.
    2) Modeling and simulation: It enables users to perform building simulations to predict the energy consumption and IEQ conditions at any of the design stages by using EnergyPlus and a combined heat, air, moisture and pollutant simulation (CHAMPS) model. A co-simulation method was developed to allow the use of both models at the same time step for combined energy and indoor air quality analysis.
    3) Results visualization: It enables users to display a 3-D geometric design of the building by reading BIM (building information model) files generated by design software such as SketchUp, and to display the predicted results for heat, air, moisture, pollutant and light distributions in the building.
    4) Performance evaluation: It enables users to compare the performance of a proposed building design against a reference building defined for the same type of building under the same climate conditions, and predicts the percentage improvement over the minimum requirements specified in ASHRAE Standards 55-2010, 62.1-2010 and 90.1-2010. An approach was developed to estimate the potential impact of a design factor on whole-building performance, and can thus assist the user in identifying the areas that offer the greatest payback on investment. The VDS software was developed in C++ with the conventional Model-View-Controller (MVC) software architecture. The software has been verified using a simple 3-zone case building. The application of the VDS concepts and framework for building design and performance analysis has been illustrated using a medium-sized, five-story office building that received LEED Platinum certification from the USGBC.

  15. Project based, Collaborative, Algorithmic Robotics for High School Students: Programming Self Driving Race Cars at MIT

    DTIC Science & Technology

    2017-02-19

    software systems: the students design and build robotics software towards real-world applications, without being distracted by hardware issues; (ii) it...high school students require the students to focus on building and integrating the hardware that make up the robot, at the expense of designing and...robotics programs focus on the mechanics; as a result, they do not have room for students to design and implement relatively complex software systems, as

  16. Hadoop distributed batch processing for Gaia: a success story

    NASA Astrophysics Data System (ADS)

    Riello, Marco

    2015-12-01

    The DPAC Cambridge Data Processing Centre (DPCI) is responsible for the photometric calibration of the Gaia data, including the low resolution spectra. The large data volume produced by Gaia (~26 billion transits/year), the complexity of its data stream, and the self-calibrating approach pose unique challenges for the scalability, reliability and robustness of both the software pipelines and the operations infrastructure. DPCI was the first in DPAC to realise the potential of Hadoop and Map/Reduce and to adopt them as the core technologies for its infrastructure. This has proven a winning choice, giving DPCI unmatched processing throughput and reliability within DPAC, to the point that other DPCs have started following in our footsteps. In this talk we will present the software infrastructure developed to build the distributed and scalable batch data processing system that is currently used in production at DPCI, and the excellent performance results of the system.
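    For readers unfamiliar with the paradigm, the sketch below shows the Map/Reduce shape of a per-source photometric reduction in pure Python: map transits to key/value pairs, group by key, then reduce each group independently, which is what lets the Hadoop version scale out. Field names and data are hypothetical, not DPCI's actual schema.

        # Conceptual Map/Reduce: mean flux per source from transit records.
        from collections import defaultdict

        transits = [
            {"source_id": 1, "flux": 10.1},
            {"source_id": 2, "flux": 5.3},
            {"source_id": 1, "flux": 9.9},
        ]

        # Map: emit (key, value) pairs.
        pairs = [(t["source_id"], t["flux"]) for t in transits]

        # Shuffle: group values by key.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)

        # Reduce: aggregate each group independently, hence in parallel.
        means = {sid: sum(v) / len(v) for sid, v in groups.items()}
        print(means)   # {1: 10.0, 2: 5.3}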

  17. Modular design of synthetic gene circuits with biological parts and pools.

    PubMed

    Marchisio, Mario Andrea

    2015-01-01

    Synthetic gene circuits can be designed in an electronic fashion by displaying their basic components (Standard Biological Parts and Pools of molecules) on the computer screen and connecting them with hypothetical wires. This procedure, achieved by our add-on for the software ProMoT, was successfully applied to bacterial circuits. Recently, we have extended this design methodology to eukaryotic cells. Here, highly complex components such as promoters and Pools of mRNA contain hundreds of species and reactions whose calculation demands a rule-based modeling approach. We showed how to build such complex modules via the joint employment of the software BioNetGen (rule-based modeling) and ProMoT (modularization). In this chapter, we illustrate how to utilize our computational tool for synthetic biology with the in silico implementation of a simple eukaryotic gene circuit that performs the logic AND operation.
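    For intuition, a deliberately simplified steady-state model of a transcriptional AND gate is sketched below using Hill functions: the reporter is expressed only when both inducers are present. This is a toy stand-in, not the rule-based BioNetGen/ProMoT models described in the chapter; all parameters are hypothetical.

        # Steady-state Hill-function model of a transcriptional AND gate.
        def hill(x, K=1.0, n=2):
            return x**n / (K**n + x**n)

        def and_gate_output(inducer_a, inducer_b, v_max=100.0):
            """Reporter expression (arbitrary units): high only if both inputs are high."""
            return v_max * hill(inducer_a) * hill(inducer_b)

        for a in (0.0, 10.0):
            for b in (0.0, 10.0):
                print(f"A={a:>4}, B={b:>4} -> output={and_gate_output(a, b):6.1f}")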

  18. Experimental Internet Environment Software Development

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.

    1998-01-01

    Geographically distributed project teams need an Internet based collaborative work environment or "Intranet." The Virtual Research Center (VRC) is an experimental Intranet server that combines several services such as desktop conferencing, file archives, on-line publishing, and security. Using the World Wide Web (WWW) as a shared space paradigm, the Graphical User Interface (GUI) presents users with images of a lunar colony. Each project has a wing of the colony and each wing has a conference room, library, laboratory, and mail station. In FY95, the VRC development team proved the feasibility of this shared space concept by building a prototype using a Netscape commerce server and several public domain programs. Successful demonstrations of the prototype resulted in approval for a second phase. Phase 2, documented by this report, will produce a seamlessly integrated environment by introducing new technologies such as Java and Adobe Web Links to replace less efficient interface software.

  19. The National Si-Soft Project

    NASA Astrophysics Data System (ADS)

    Chang, Chun-Yen; Trappey, Charles V.

    2003-06-01

    Taiwan's electronics industry emerged in the 1960s with the creation of a small but well planned integrated circuit (IC) packaging industry. This industry investment led to bolder investments in research, laboratories, and the island's first semiconductor foundries in the 1980s. Following the success of the emerging IC manufacturers and design houses, hundreds of service firms and related industries (software, legal services, substrate, chemical, and test firms among others) opened for business and completed Taiwan's IC manufacturing supply chain. The challenge for Taiwan's electronics industry is to take the lead in the design, manufacture, and marketing of name brand electronic products. This paper introduces the Si-Soft (silicon software) Project, a national initiative that builds on Taiwan's achievements in manufacturing (referred to as Si-Hard or silicon hardware) to launch a new wave of companies. These firms will contribute to the core underlying technology (intellectual property) used in the creation of electronic products.

  20. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  1. Cobalt: A GPU-based correlator and beamformer for LOFAR

    NASA Astrophysics Data System (ADS)

    Broekema, P. Chris; Mol, J. Jan David; Nijboer, R.; van Amesfoort, A. S.; Brentjens, M. A.; Loose, G. Marcel; Klijn, W. F. A.; Romein, J. W.

    2018-04-01

    For low-frequency radio astronomy, software correlation and beamforming on general purpose hardware is a viable alternative to custom designed hardware. LOFAR, a new-generation radio telescope centered in the Netherlands with international stations in Germany, France, Ireland, Poland, Sweden and the UK, has successfully used software real-time processors based on IBM Blue Gene technology since 2004. Since then, developments in technology have allowed us to build a system based on commercial off-the-shelf components that combines the same capabilities with lower operational cost. In this paper, we describe the design and implementation of a GPU-based correlator and beamformer with the same capabilities as the Blue Gene based systems. We focus on the design approach taken, and show the challenges faced in selecting an appropriate system. The design, implementation and verification of the software system show the value of a modern test-driven development approach. Operational experience, based on three years of operations, demonstrates that a general purpose system is a good alternative to the previous supercomputer-based system or custom-designed hardware.
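    For orientation, the core of a software FX correlator is compact enough to sketch: channelize each station's voltage stream with an FFT (the F step), then cross-multiply one spectrum with the conjugate of the other and integrate over time (the X step). The NumPy toy below handles a single baseline; the sizes and signals are illustrative, and the production GPU code differs in many details.

        # Toy FX correlator for a single baseline.
        import numpy as np

        n_spectra, n_chan = 256, 1024
        rng = np.random.default_rng(0)
        s1 = rng.standard_normal(n_spectra * n_chan)       # station 1 voltages
        s2 = np.roll(s1, 3) + 0.1 * rng.standard_normal(n_spectra * n_chan)

        # F step: block the streams and FFT each block into channels.
        S1 = np.fft.rfft(s1.reshape(n_spectra, n_chan), axis=1)
        S2 = np.fft.rfft(s2.reshape(n_spectra, n_chan), axis=1)

        # X step: cross-multiply with conjugation, integrate over time.
        visibility = (S1 * np.conj(S2)).mean(axis=0)       # one baseline, all channels
        print(visibility.shape)                            # channel phases encode the lag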

  2. NASA's Core Trajectory Sub-System Project: Using JBoss Enterprise Middleware for Building Software Systems Used to Support Spacecraft Trajectory Operations

    NASA Technical Reports Server (NTRS)

    Stensrud, Kjell C.; Hamm, Dustin

    2007-01-01

    NASA's Johnson Space Center (JSC) / Flight Design and Dynamics Division (DM) has prototyped the use of Open Source middleware technology for building its next generation spacecraft mission support system. This is part of a larger initiative to use open standards and open source software as building blocks for future mission and safety critical systems. JSC is hoping to leverage standardized enterprise architectures, such as Java EE, so that its internal software development efforts can be focused on the core aspects of their problem domain. This presentation will outline the design and implementation of the Trajectory system and the lessons learned during the exercise.

  3. Fostering successful scientific software communities

    NASA Astrophysics Data System (ADS)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also by the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that the technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and with frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent in a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  4. CrossTalk: The Journal of Defense Software Engineering. Volume 23, Number 4, July/August 2010

    DTIC Science & Technology

    2010-08-01

    Anita Carleton, Del Kellogg, and Jeff Schwalb Building Critical Systems as a Cyborg As outrageous as it may seem, adapting cybernetics to defense...software is a real possibility in building complex software systems. Ball discusses the history of cybernetics, what a "cyborg" really is, and how...What Is a Cyborg? We want both kinds of the behavior that I've talked about, with predictable systems that follow established rules and procedures

  5. Technology Solutions Case Study: Hygrothermal Performance of a Double-Stud Cellulose Wall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-06-01

    Moisture problems within the building shell can be caused by a number of factors, including excess interior moisture that is transported into the wall through air leakage and vapor drive, bulk water intrusion from leaks and wind-driven rain, capillary action at concrete-to-wood connections, and wetted building materials such as siding wetted by rain splash-back. With the increasing thickness of walls, moisture issues could increase. Several builders have successfully used “double-wall” systems to more practically achieve higher R-values in thicker framed walls. A double wall typically consists of a load-bearing external frame wall constructed with 2 × 4 framing at 16 in. on center using conventional methods. After the building is enclosed, an additional frame wall is constructed several inches inside the load-bearing wall. Several researchers have used moisture modeling software to conduct extensive analysis of these assemblies; however, little field research has been conducted to validate the results. In this project, the Building America research team Consortium for Advanced Residential Buildings monitored a double-stud assembly in climate zone 5A to determine the accuracy of moisture modeling and make recommendations to ensure durable and efficient assemblies.

  6. Sensor Suitcase Tablet Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Retrocommissioning Sensor Suitcase is targeted for use in small commercial buildings of less than 50,000 square feet of floor space that regularly receive basic services such as maintenance and repair, but do not have in-house energy management staff or building experts. The Suitcase is designed to be easy to use by building maintenance staff or other professionals, such as telecom and alarm technicians. The software on the hand-held device guides staff through entering building and system information, deploying the sensors in the proper locations, configuring the sensor hardware, and starting data collection.

  7. Software augmented buildings: Exploiting existing infrastructure to improve energy efficiency and comfort in commercial buildings

    NASA Astrophysics Data System (ADS)

    Balaji, Bharathan

    Commercial buildings consume 19% of energy in the US as of 2010, and traditionally their energy use has been optimized through improved equipment efficiency and retrofits. Beyond improved hardware and infrastructure, there exists tremendous potential to reduce energy use through better monitoring and operation. We present several applications that we developed and deployed to support our thesis that building energy use can be reduced through sensing, monitoring and optimization software that modulates the use of building subsystems, including HVAC. We focus on HVAC systems as these constitute 48-55% of building energy use. Specifically, in the case of sensing, we describe an energy apportionment system that enables us to estimate real-time zonal HVAC power consumption by analyzing existing sensor information. With this energy breakdown, we can measure the effectiveness of optimization solutions and identify inefficiencies. Central to energy efficiency improvement is the determination of human occupancy in buildings, but this information is often unavailable or expensive to obtain through wide-scale sensor deployment. We present our system that infers room-level occupancy inexpensively by leveraging existing WiFi infrastructure. Occupancy information can be used not only to directly control HVAC but also to infer the state of the building for predictive control. Building energy use is strongly influenced by human behavior, and timely feedback mechanisms can encourage energy-saving behavior. Occupants interact with HVAC through thermostats, which have been shown to be inadequate for thermal comfort. Building managers are responsible for incorporating energy efficiency measures, but our interviews reveal that they struggle to maintain efficiency due to a lack of analytical tools and contextual information. We present our software services that provide energy feedback to occupants and building managers, improve comfort with personalized control, and identify energy-wasting faults. For wide-scale deployment, such energy-saving software needs to be portable across multiple buildings. However, buildings consist of heterogeneous equipment and use inconsistent naming schemas, so developers need extensive domain knowledge to map sensor information to a standard format. To enable portability, we present an active-learning algorithm that automates mapping building sensor metadata to a standard naming schema.

  8. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining users' and developers' mailing lists, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
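
    As a language-neutral illustration of the state-machine-governed component pattern described above (IGSTK itself is C++; this sketch is not its API), the Python example below rejects any event that is not a valid transition from the current state, so the component can never enter an invalid state. The states and events are invented for illustration.

        # Generic sketch of a component whose behavior is governed by an
        # explicit state machine: any (state, event) pair not listed is rejected.
        class TrackerComponent:
            TRANSITIONS = {
                ("idle", "initialize"): "ready",
                ("ready", "start_tracking"): "tracking",
                ("tracking", "stop_tracking"): "ready",
            }

            def __init__(self):
                self.state = "idle"

            def handle(self, event):
                """Apply an event only if it is a valid transition from the current state."""
                next_state = self.TRANSITIONS.get((self.state, event))
                if next_state is None:
                    raise ValueError(f"invalid event {event!r} in state {self.state!r}")
                self.state = next_state

        tracker = TrackerComponent()
        tracker.handle("initialize")
        tracker.handle("start_tracking")   # ok; "start_tracking" from "idle" would raise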

  9. An authentication infrastructure for today and tomorrow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engert, D.E.

    1996-06-01

    The Open Software Foundation's Distributed Computing Environment (OSF/DCE) was originally designed to provide a secure environment for distributed applications. By combining it with Kerberos Version 5 from MIT, it can be extended to provide network security as well. This combination can be used to build both an inter- and intra-organizational infrastructure while providing single sign-on for the user with overall improved security. The ESnet community of the Department of Energy is building just such an infrastructure. ESnet has modified these systems to improve their interoperability, while encouraging the developers to incorporate these changes and work more closely together to continue to improve the interoperability. The success of this infrastructure depends on its flexibility to meet the needs of many applications and network security requirements. The open nature of Kerberos, combined with the vendor support of OSF/DCE, provides the infrastructure for today and tomorrow.

  10. Single-Side Two-Location Spotlight Imaging for Building Based on MIMO Through-Wall-Radar.

    PubMed

    Jia, Yong; Zhong, Xiaoling; Liu, Jiangang; Guo, Yong

    2016-09-07

    Through-wall-radar imaging is of interest for mapping the wall layout of buildings and for the detection of stationary targets within buildings. In this paper, we present an easy single-side two-location spotlight imaging method for both wall layout mapping and stationary target detection utilizing multiple-input multiple-output (MIMO) through-wall-radar. Rather than imaging the building walls directly, images of all building corners are generated to infer the wall layout indirectly, by successively deploying the MIMO through-wall-radar at two appropriate locations on only one side of the building and then carrying out spotlight imaging with two different squint views. In addition to the ease of implementation, the single-side two-location squint-view detection has two other advantages for stationary target imaging: fewer multi-path ghosts, and a smaller region of side-lobe interference from the corner images in comparison to the wall images. Based on Computer Simulation Technology (CST) electromagnetic simulation software, we provide multiple sets of validation results in which multiple binary panorama images, with clear images of all corners and stationary targets, are obtained by combining two single-location images using incoherent additive fusion and two-dimensional cell-averaging constant-false-alarm-rate (2D CA-CFAR) detection.
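
    For readers unfamiliar with the detection step, the following is a minimal sketch of 2D cell-averaging CFAR thresholding in Python: each pixel is compared against a threshold scaled from the mean of a surrounding training ring, with guard cells excluded. The window sizes and scale factor are illustrative choices, not values from the paper.

        # Minimal 2-D CA-CFAR sketch over an intensity image (numpy array).
        import numpy as np

        def ca_cfar_2d(img, guard=2, train=4, scale=3.0):
            h, w = img.shape
            detections = np.zeros_like(img, dtype=bool)
            r = guard + train                      # half-width of the full window
            for i in range(r, h - r):
                for j in range(r, w - r):
                    window = img[i - r:i + r + 1, j - r:j + r + 1].copy()
                    # zero out the guard cells and the cell under test
                    window[train:train + 2 * guard + 1,
                           train:train + 2 * guard + 1] = 0
                    n_train = window.size - (2 * guard + 1) ** 2
                    noise = window.sum() / n_train  # mean of the training ring
                    detections[i, j] = img[i, j] > scale * noise
            return detections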

  11. A method for the complete analysis of NORM building materials by γ-ray spectrometry using HPGe detectors.

    PubMed

    Quintana, B; Pedrosa, M C; Vázquez-Canelas, L; Santamaría, R; Sanjuán, M A; Puertas, F

    2018-04-01

    A methodology including software tools for analysing NORM building materials and residues by low-level gamma-ray spectrometry has been developed. It comprises deconvolution of gamma-ray spectra using the software GALEA with focus on the natural radionuclides and Monte Carlo simulations for efficiency and true coincidence summing corrections. The methodology has been tested on a range of building materials and validated against reference materials.

  12. A Scalable Software Architecture Booting and Configuring Nodes in the Whitney Commodity Computing Testbed

    NASA Technical Reports Server (NTRS)

    Fineberg, Samuel A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    The Whitney project is integrating commodity off-the-shelf PC hardware and software technology to build a parallel supercomputer with hundreds to thousands of nodes. To build such a system, one must have a scalable software model, and the installation and maintenance of the system software must be completely automated. We describe the design of an architecture for booting, installing, and configuring nodes in such a system with particular consideration given to scalability and ease of maintenance. This system has been implemented on a 40-node prototype of Whitney and is to be used on the 500 processor Whitney system to be built in 1998.

  13. BioMOL: a computer-assisted biological modeling tool for complex chemical mixtures and biological processes at the molecular level.

    PubMed Central

    Klein, Michael T; Hou, Gang; Quann, Richard J; Wei, Wei; Liao, Kai H; Yang, Raymond S H; Campain, Julie A; Mazurek, Monica A; Broadbelt, Linda J

    2002-01-01

    A chemical engineering approach for the rigorous construction, solution, and optimization of detailed kinetic models for biological processes is described. This modeling capability addresses the required technical components of detailed kinetic modeling, namely, the modeling of reactant structure and composition, the building of the reaction network, the organization of model parameters, the solution of the kinetic model, and the optimization of the model. Even though this modeling approach has enjoyed successful application in the petroleum industry, its application to biomedical research has just begun. We propose to expand the horizons on classic pharmacokinetics and physiologically based pharmacokinetics (PBPK), where human or animal bodies were often described by a few compartments, by integrating PBPK with reaction network modeling described in this article. If one draws a parallel between an oil refinery, where the application of this modeling approach has been very successful, and a human body, the individual processing units in the oil refinery may be considered equivalent to the vital organs of the human body. Even though the cell or organ may be much more complicated, the complex biochemical reaction networks in each organ may be similarly modeled and linked in much the same way as the modeling of the entire oil refinery through linkage of the individual processing units. The integrated chemical engineering software package described in this article, BioMOL, denotes the biological application of molecular-oriented lumping. BioMOL can build a detailed model in 1-1,000 CPU sec using standard desktop hardware. The models solve and optimize using standard and widely available hardware and software and can be presented in the context of a user-friendly interface. We believe this is an engineering tool with great promise in its application to complex biological reaction networks. PMID:12634134

  14. A Lossless Network for Data Acquisition

    NASA Astrophysics Data System (ADS)

    Jereczek, Grzegorz; Lehmann Miotto, Giovanna; Malone, David; Walukiewicz, Miroslaw

    2017-06-01

    The bursty many-to-one communication pattern, typical for data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. We expand the study of lossless switching in software running on commercial off-the-shelf servers, using the ATLAS experiment as a case study. In this paper, we extend the popular software switch, Open vSwitch, with a dedicated, throughput-oriented buffering mechanism for data acquisition. We compare the performance under heavy congestion on typical Ethernet switches to a commodity server acting as a switch. Our results indicate that software switches with large buffers perform significantly better. Next, we evaluate the scalability of the system when building a larger topology of interconnected software switches, exploiting the integration with software-defined networking technologies. We build an IP-only leaf-spine network consisting of eight software switches running on distinct physical servers as a demonstrator.

  15. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.

  16. Student Computer Attitudes, Experience and Perceptions about the Use of Two Software Applications in Building Engineering

    ERIC Educational Resources Information Center

    Chiner, Esther; Garcia-Vera, Victoria E.

    2017-01-01

    The purpose of this study was to examine students' computer attitudes and experience, as well as students' perceptions about the use of two specific software applications (Google Drive Spreadsheets and Arquimedes) in the Building Engineering context. The relationships among these variables were also examined. Ninety-two students took part in this…

  17. PUS Services Software Building Block Automatic Generation for Space Missions

    NASA Astrophysics Data System (ADS)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Committee for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. The current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, the automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support: the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services, with the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database. The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, and (e) the characterization of the reference avionics architecture and of the reference on-board software high-level architecture.

  18. A Global Capacity Building Vision for Societal Applications of Earth Observing Systems and Data: Key Questions and Recommendations

    NASA Technical Reports Server (NTRS)

    Hossain, Faisal; Serrat-Capdevila, Aleix; Granger, Stephanie; Thomas, Amy; Saah, David; Ganz, David; Mugo, Robinson; Murthy, M. S. R.; Ramos, Victor Hugo; Kirschbaum, Dalia; hide

    2016-01-01

    Capacity building using Earth observing (EO) systems and data (i.e., from orbital and nonorbital platforms) to enable societal applications includes the network of human, nonhuman, technical, nontechnical, hardware, and software dimensions that are necessary to successfully cross the valley [of death; see NRC (2001)] between science and research (port of departure) and societal application (port of arrival). In many parts of the world (especially where ground-based measurements are scarce or insufficient), applications of EO data still struggle for longevity or continuity for a variety of reasons, foremost among them being the lack of resilient capacity. An organization is said to have resilient capacity when it can retain and continue to build capacity in the face of unexpected shocks or stresses. Stresses can include intermittent power and limited Internet bandwidth, constant need for education on ever-increasing complexity of EO systems and data, communication challenges between the ports of departure and arrival (especially across time zones), and financial limitations and instability. Shocks may also include extreme events such as disasters and losing key staff with technical and institutional knowledge.

  19. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Software tools to build and manage software at the source code level on the Peregrine system. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.
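
    As a minimal illustration of the Python-based SCons tool mentioned above, the SConstruct file below builds a single C program; the source file name and compiler flags are illustrative, not from the NREL page.

        # SConstruct -- minimal SCons build script, run with the `scons` command.
        # SCons injects Environment() and related build APIs into this file;
        # 'hello.c' is an illustrative source file.
        env = Environment(CC='gcc', CCFLAGS='-O2')   # compiler settings for this build
        env.Program(target='hello', source=['hello.c'])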

  20. Averting Denver Airports on a Chip

    NASA Technical Reports Server (NTRS)

    Sullivan, Kevin J.

    1995-01-01

    As a result of recent advances in software engineering capabilities, we are now in a more stable environment. De-facto hardware and software standards are emerging. Work on software architecture and design patterns signals a consensus on the importance of early system-level design decisions, and agreements on the uses of certain paradigmatic software structures. We now routinely build systems that would have been risky or infeasible a few years ago. Unfortunately, technological developments threaten to destabilize software design again. Systems designed around novel computing and peripheral devices will spark ambitious new projects that will stress current software design and engineering capabilities. Micro-electro-mechanical systems (MEMS) and related technologies provide the physical basis for new systems with the potential to produce this kind of destabilizing effect. One important response to anticipated software engineering and design difficulties is carefully directed engineering-scientific research. Two specific problems meriting substantial research attention are: A lack of sufficient means to build software systems by generating, extending, specializing, and integrating large-scale reusable components; and a lack of adequate computational and analytic tools to extend and aid engineers in maintaining intellectual control over complex software designs.

  1. Economic and Environmental Assessment of a 1 MW Grid Connected Rooftop Solar PV System for Energy Efficient Building in Bangladesh

    NASA Astrophysics Data System (ADS)

    Chakraborty, Sanjib; Hosain, Rubayet; Rahman, Toufiqur; Rabbi, Ahmead Fazle

    This paper evaluates the potential of a 1 MW grid-connected rooftop solar PV system for an energy-efficient building in Bangladesh, estimated using NASA SSE solar radiation data together with the PVsyst and RETScreen simulation software. Economic and environmental viability for a ten-storied building with a roof area of 6,500 m2 in Dhaka, the capital city of Bangladesh, was assessed using the RETScreen simulation software. The yearly electricity production of the proposed system was estimated by PVsyst at 1,581 MWh, while the technical potential of grid-connected solar PV in Bangladesh was calculated as about 50,174 MW. The economic assessment determined the simple payback assuming that the generated electricity first fulfills the demand of the building and the rest of the energy is supplied to the grid. The results indicate that a rooftop solar PV system for an energy-efficient building in Dhaka city has favorable conditions for development from both economic and environmental points of view.
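
    As a sketch of the simple-payback logic the abstract describes (generation first offsets the building's own demand, with the surplus exported to the grid), consider the following; the capital cost, demand, and tariff figures are illustrative, not the paper's data, and only the 1,581 MWh generation figure comes from the abstract.

        # Simple payback = capital cost / annual savings, where savings split
        # into self-consumed energy (valued at the retail rate) and exported
        # energy (valued at the feed-in rate). All inputs below are illustrative.
        def simple_payback(capital_cost, annual_gen_kwh, annual_demand_kwh,
                           retail_rate, feed_in_rate):
            self_consumed = min(annual_gen_kwh, annual_demand_kwh)
            exported = annual_gen_kwh - self_consumed
            annual_savings = self_consumed * retail_rate + exported * feed_in_rate
            return capital_cost / annual_savings   # years

        years = simple_payback(capital_cost=1_500_000,     # USD, assumed
                               annual_gen_kwh=1_581_000,   # 1,581 MWh from the abstract
                               annual_demand_kwh=1_200_000,
                               retail_rate=0.08, feed_in_rate=0.06)
        print(f"simple payback: {years:.1f} years")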

  2. Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers.

    PubMed

    Sochat, Vanessa V; Prybol, Cameron J; Kurtzer, Gregory M

    2017-01-01

    Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we will review Singularity Hub's primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency, reproducibility metric and performance and interpretability, and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building, and deploying scientific containers.
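
    The reproducibility metrics described above are based on content hashes of container contents. A minimal sketch of that style of comparison (not the actual singularity-python implementation) hashes every file under two unpacked container trees and computes a Jaccard similarity:

        # Hash all files in two directory trees and compare the hash sets;
        # identical trees score 1.0, disjoint trees score 0.0.
        import hashlib
        from pathlib import Path

        def content_hashes(root):
            hashes = set()
            for path in Path(root).rglob("*"):
                if path.is_file():
                    hashes.add(hashlib.sha256(path.read_bytes()).hexdigest())
            return hashes

        def similarity(root_a, root_b):
            a, b = content_hashes(root_a), content_hashes(root_b)
            return len(a & b) / len(a | b) if (a or b) else 1.0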

  3. Enhancing reproducibility in scientific computing: Metrics and registry for Singularity containers

    PubMed Central

    Prybol, Cameron J.; Kurtzer, Gregory M.

    2017-01-01

    Here we present Singularity Hub, a framework to build and deploy Singularity containers for mobility of compute, and the singularity-python software with novel metrics for assessing reproducibility of such containers. Singularity containers make it possible for scientists and developers to package reproducible software, and Singularity Hub adds automation to this workflow by building, capturing metadata for, visualizing, and serving containers programmatically. Our novel metrics, based on custom filters of content hashes of container contents, allow for comparison of an entire container, including operating system, custom software, and metadata. First we will review Singularity Hub’s primary use cases and how the infrastructure has been designed to support modern, common workflows. Next, we conduct three analyses to demonstrate build consistency, reproducibility metric and performance and interpretability, and potential for discovery. This is the first effort to demonstrate a rigorous assessment of measurable similarity between containers and operating systems. We provide these capabilities within Singularity Hub, as well as the source software singularity-python that provides the underlying functionality. Singularity Hub is available at https://singularity-hub.org, and we are excited to provide it as an openly available platform for building, and deploying scientific containers. PMID:29186161

  4. 78 FR 31916 - Increasing Market and Planning Efficiency Through Improved Software; Supplemental Agenda Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    ... Market and Planning Efficiency Through Improved Software; Supplemental Agenda Notice Take notice that... for increasing real-time and day-ahead market efficiency through improved software. A detailed agenda..., the software industry, government, research centers and academia and is intended to build on the...

  5. Test-driven programming

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2013-12-01

    This paper presents some possibilities for implementing test-driven development as a programming method. It offers a different point of view on the creation of advanced programming techniques: build the tests before the program source, together with all necessary software tools and modules. This nontraditional approach, which eases the programmer's work by building the tests first, is a preferable way of software development. It allows comparatively simple programming (applied with different object-oriented programming languages such as JAVA, XML, PYTHON, etc.) and is a predictable way to develop software tools and to help create better software that is also easier to maintain. Test-driven programming is able to replace more complicated casual paradigms used by many programmers.
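
    A tiny Python example of the test-first workflow described above: the test below is written before the function it exercises, and the implementation is then written to make it pass. The function and test are invented for illustration.

        # Test-first: TestSlope was written before slope() existed; slope()
        # was then implemented to satisfy it.
        import unittest

        def slope(p1, p2):
            """Implementation written after (and driven by) the test."""
            (x1, y1), (x2, y2) = p1, p2
            return (y2 - y1) / (x2 - x1)

        class TestSlope(unittest.TestCase):
            def test_unit_slope(self):
                self.assertEqual(slope((0, 0), (2, 2)), 1.0)

        if __name__ == "__main__":
            unittest.main()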

  6. Evaluation of comprehensive environmental effect about coastal zone development activities in Liaoning Province and management advice.

    PubMed

    Wang, Wei-Wei; Cai, Yue-Yin; Sun, Yong-Guang; Ma, Hong-Wei

    2015-07-01

    Using the spatial analysis functions of ArcGIS software, the present study built an environmental impact evaluation index system for coastal zone development in Liaoning Province. Its factors included the current state of environmental quality, the environmental impact of marine development, and marine environmental disasters. Weighted factor analysis and a comprehensive index method were utilized. The comprehensive environmental effect of coastal development in Liaoning Province was then evaluated successfully. The results showed that the environmental effects of development activity were most serious along the Zhao Jiatun coast in the north of Zhimao Bay and the coast of Mianhua Island in Dalian Bay.

  7. Third-Party Software's Trust Quagmire.

    PubMed

    Voas, J; Hurlburt, G

    2015-12-01

    Current software development has trended toward the idea of integrating independent software sub-functions to create more complete software systems. Software sub-functions are often not homegrown; instead they are developed by unknown third-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions carry plausible concerns in terms of quality, origins, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.

  8. Direct-coupled microcomputer-based building emulator for building energy management and control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lam, H.N.

    1999-07-01

    In this paper, the development and implementation of a direct-coupled building emulator for a building energy management and control system (EMCS) is presented. The building emulator consists of a microcomputer and a computer model of an air-conditioning system implemented in a modular dynamic simulation software package for direct-coupling to an EMCS, without using analog-to-digital and digital-to-analog converters. The building emulator can be used to simulate in real time the behavior of the air-conditioning system under a given operating environment and subject to a given usage pattern. Software modules for data communication, graphical display, dynamic data exchange, and synchronization of simulation outputs with real time have been developed to achieve direct digital data transfer between the building emulator and a commercial EMCS. Based on the tests conducted, the validity of the building emulator has been established and the proportional-plus-integral control function of the EMCS assessed.
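
    The synchronization of simulation outputs with real time can be sketched generically as pacing a fixed-step model against the wall clock. The code below is an illustration under that assumption, not the emulator's actual implementation; step_model() is a placeholder for the air-conditioning system model.

        # Advance a fixed-step simulation so that each step of dt simulated
        # seconds is emitted dt wall-clock seconds apart.
        import time

        def run_realtime(step_model, dt=1.0, steps=60):
            """Advance the model every dt seconds of wall-clock time."""
            start = time.monotonic()
            for k in range(steps):
                output = step_model(k * dt)       # simulate up to time k*dt
                wake = start + (k + 1) * dt
                delay = wake - time.monotonic()
                if delay > 0:
                    time.sleep(delay)             # wait out the rest of the step
                # a negative delay means the model step overran real time
                yield output

        # usage: for y in run_realtime(lambda t: {"sim_time": t}): print(y)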

  9. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  10. User systems guidelines for software projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abrahamson, L.

    1986-04-01

    This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)

  11. AF-GEOSPACE Version 2.1

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Madden, D.; Tautz, M.; Roth, C.

    2004-05-01

    AF-GEOSpace is a graphics-intensive software program with space environment models and applications developed and distributed by the Space Weather Center of Excellence at AFRL. A review of current (Version 2.0) and planned (Version 2.1) AF-GEOSpace capabilities will be given. A wide range of physical domains is represented enabling the software to address such things as solar disturbance propagation, radiation belt configuration, and ionospheric auroral particle precipitation and scintillation. The software is currently being used to aid with the design, operation, and simulation of a wide variety of communications, navigation, and surveillance systems. Building on the success of previous releases, AF-GEOSpace has become a platform for the rapid prototyping of automated operational and simulation space weather visualization products and helps with a variety of tasks, including: orbit specification for radiation hazard avoidance; satellite design assessment and post-event anomaly analysis; solar disturbance effects forecasting; frequency and antenna management for radar and HF communications; determination of link outage regions for active ionospheric conditions; scientific model validation and comparison, physics research, and education. Version 2.0 provided a simplified graphical user interface, improved science and application modules, and significantly enhanced graphical performance. Common input data archive sets, application modules, and 1-D, 2-D, and 3-D visualization tools are provided to all models. Dynamic capabilities permit multiple environments to be generated at user-specified time intervals while animation tools enable displays such as satellite orbits and environment data together as a function of time. Building on the existing Version 2.0 software architecture, AF-GEOSpace Version 2.1 is currently under development and will include a host of new modules to provide, for example, geosynchronous charged particle fluxes, neutral atmosphere densities, cosmic ray cutoff maps, low-altitude trapped proton belt specification, and meteor shower/storm fluxes with spacecraft impact probabilities. AF-GEOSpace Version 2.1 is being developed for Windows NT/2000/XP and Linux systems.

  12. AF-GEOSpace Version 2.1 Release

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Madden, D.; Perry, K. L.; Tautz, M.; Roth, C.

    2006-05-01

    AF-GEOSpace Version 2.1 is a graphics-intensive software program with space environment models and applications developed recently by the Space Weather Center of Excellence at AFRL. A review of new and planned AF-GEOSpace capabilities will be given. The software covers a wide range of physical domains and addresses such topics as solar disturbance propagation, geomagnetic field and radiation belt configurations, auroral particle precipitation, and ionospheric scintillation. Building on the success of previous releases, AF-GEOSpace has become a platform for the rapid prototyping of automated operational and simulation space weather visualization products and helps with a variety of tasks, including: orbit specification for radiation hazard avoidance; satellite design assessment and post-event anomaly analysis; solar disturbance effects forecasting; determination of link outage regions for active ionospheric conditions; satellite magnetic conjugate studies, scientific model validation and comparison, physics research, and education. Previously, Version 2.0 provided a simplified graphical user interface, improved science and application modules, significantly enhanced graphical performance, common input data archive sets, and 1-D, 2-D, and 3-D visualization tools for all models. Dynamic capabilities permit multiple environments to be generated at user-specified time intervals while animation tools enable the display of satellite orbits and environment data together as a function of time. Building on the Version 2.0 software architecture, AF-GEOSpace Version 2.1 includes a host of new modules providing, for example, plasma sheet charged particle fluxes, neutral atmosphere densities, 3-D cosmic ray cutoff maps, low-altitude trapped proton belt flux specification, DMSP particle data displays, satellite magnetic field footprint mapping determination, and meteor sky maps and shower/storm fluxes with spacecraft impact probabilities. AF-GEOSpace Version 2.1 was developed for Windows XP and Linux systems. To receive a copy of the AF-GEOSpace 2.1 software, please submit requests via e-mail to the first author.

  13. An Overview of Biological Macromolecule Crystallization

    PubMed Central

    Krauss, Irene Russo; Merlino, Antonello; Vergara, Alessandro; Sica, Filomena

    2013-01-01

    The elucidation of the three dimensional structure of biological macromolecules has provided an important contribution to our current understanding of many basic mechanisms involved in life processes. This enormous impact largely results from the ability of X-ray crystallography to provide accurate structural details at atomic resolution that are a prerequisite for a deeper insight on the way in which bio-macromolecules interact with each other to build up supramolecular nano-machines capable of performing specialized biological functions. With the advent of high-energy synchrotron sources and the development of sophisticated software to solve X-ray and neutron crystal structures of large molecules, the crystallization step has become even more the bottleneck of a successful structure determination. This review introduces the general aspects of protein crystallization, summarizes conventional and innovative crystallization methods and focuses on the new strategies utilized to improve the success rate of experiments and increase crystal diffraction quality. PMID:23727935

  14. The ALICE Software Release Validation cluster

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Krzewicki, M.

    2015-12-01

    One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service; in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e., CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We will show how Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any snapshot of the operating system in time: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long-Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.

  15. Building Diagnostic Market Deployment - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, S.; Gayeski, N.

    2012-04-30

    Operational faults are pervasive across the commercial buildings sector, wasting energy and increasing energy costs by up to about 30% (Mills 2009, Liu et al. 2003, Claridge et al. 2000, Katipamula and Brambley 2008, and Brambley and Katipamula 2009). Automated fault detection and diagnostic (AFDD) tools provide capabilities essential for detecting and correcting these problems and eliminating the associated energy waste and costs. The U.S. Department of Energy's (DOE) Building Technology Program (BTP) has previously invested in developing and testing of such diagnostic tools for whole-building (and major system) energy use, air handlers, chillers, cooling towers, chilled-water distribution systems, and boilers. These diagnostic processes can be used to make commercial buildings more energy efficient. The work described in this report was done as part of a Cooperative Research and Development Agreement (CRADA) between the U.S. Department of Energy's Pacific Northwest National Laboratory (PNNL) and KGS Building LLC (KGS). PNNL and KGS both believe that the widespread adoption of AFDD tools will result in significant reduction to energy and peak energy consumption. The report provides an introduction and summary of the various tasks performed under the CRADA. The CRADA project had three major focus areas: (1) Technical Assistance for Whole Building Energy Diagnostician (WBE) Commercialization, (2) Market Transfer of the Outdoor Air/Economizer Diagnostician (OAE), and (3) Development and Deployment of Automated Diagnostics to Improve Large Commercial Building Operations. PNNL has previously developed two diagnostic tools: (1) whole building energy (WBE) diagnostician and (2) outdoor air/economizer (OAE) diagnostician. WBE diagnostician is currently licensed non-exclusively to one company. As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite, Clockworks. PNNL also provided validation data sets and the WBE software tool to validate the KGS implementation. OAE diagnostician automatically detects and diagnoses problems with outdoor air ventilation and economizer operation for air handling units (AHUs) in commercial buildings using data available from building automation systems (BASs). As part of this CRADA, PNNL developed implementation documentation and provided technical support to KGS to implement the tool into their software suite. PNNL also provided validation data sets and the OAE software tool to validate the KGS implementation. Finally, as part of this CRADA project, PNNL developed new processes to automate parts of the re-tuning process and transferred those processes to KGS for integration into their software product. The transfer of DOE-funded technologies will transform the commercial buildings sector by making buildings more energy efficient and also reducing the carbon footprint from the buildings. As part of the CRADA with PNNL, KGS implemented the whole building energy diagnostician, a portion of the outdoor air economizer diagnostician, and a number of measures that automate the identification of re-tuning measures.

  16. Building the Core Architecture of a Multiagent System Product Line: With an example from a future NASA Mission

    NASA Technical Reports Server (NTRS)

    Pena, Joaquin; Hinchey, Michael G.; Ruiz-Cortes, Antonio

    2006-01-01

    The field of Software Product Lines (SPL) emphasizes building a core architecture for a family of software products from which concrete products can be derived rapidly. This helps to reduce time-to-market, costs, etc., and can result in improved software quality and safety. Current AOSE methodologies are concerned with developing a single Multiagent System. We propose an initial approach to developing the core architecture of a Multiagent Systems Product Line (MAS-PL), exemplifying our approach with reference to a concept NASA mission based on multiagent technology.

  17. Teaching Model Building to High School Students: Theory and Reality.

    ERIC Educational Resources Information Center

    Roberts, Nancy; Barclay, Tim

    1988-01-01

    Builds on a National Science Foundation (NSF) microcomputer based laboratory project to introduce system dynamics into the precollege setting. Focuses on providing students with powerful and investigatory theory building tools. Discusses developed hardware, software, and curriculum materials used to introduce model building and simulations into…

  18. Luigi Gentile Polese | NREL

    Science.gov Websites

    Works on software development of next-generation whole-building energy modeling, analysis, and simulation tools; previously held technical positions in networking protocol specifications, call control software, and requirements.

  19. Home | BEopt

    Science.gov Websites

    The BEopt (Building Energy Optimization) software from NREL (National Renewable Energy Laboratory) provides capabilities to evaluate residential building designs and identify optimal designs. The sequential search optimization technique used by BEopt finds minimum-cost building designs at different levels of energy savings.

  20. RETScreen Plus Software Tutorial

    NASA Technical Reports Server (NTRS)

    Ganoe, Rene D.; Stackhouse, Paul W., Jr.; DeYoung, Russell J.

    2014-01-01

    Greater emphasis is being placed on reducing both the carbon footprint and the energy cost of buildings. A building's energy usage depends upon many factors; one of the most important is the local weather and climate conditions to which its electrical, heating, and air conditioning systems must respond. Incorporating renewable energy systems, including solar systems, to supplement energy supplies and increase energy efficiency is important for saving costs and reducing emissions. Retrofitting technologies to buildings also requires knowledge of building performance in its current state and potential future climate state, projection of potential savings with capital investment, and then monitoring of the performance once the improvements are made. RETScreen Plus is a performance analysis software module that supplies the needed functions of monitoring current building performance, targeting projected energy efficiency improvements, and verifying improvements once completed. This tutorial defines the functions of RETScreen Plus and outlines the general procedure for monitoring and reporting building energy performance.

  1. Seqcrawler: biological data indexing and browsing platform.

    PubMed

    Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien

    2012-07-24

    Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and meta-data in biological banks, and it can scale to face the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their data, there is a lack of free and open source solutions to browse one's own set of data with a flexible query system, able to scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their meta-data but also to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault-tolerant architecture (high availability). It has also been successfully integrated with software to add extra meta-data from BLAST results to enhance users' result analysis. Seqcrawler provides a complete open source search and store solution for labs or platforms needing to manage large amounts of data/meta-data with a flexible and customizable web interface. All components (search engine, visualization, and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface.
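
    The abstract notes that all components share a data system queryable through a simple HTTP interface. A sketch of such a query in Python follows; the endpoint path and parameter names are hypothetical, not Seqcrawler's documented API.

        # Query a search endpoint over HTTP and return the decoded JSON hits.
        import requests

        def search(base_url, query, rows=10):
            resp = requests.get(f"{base_url}/search",
                                params={"q": query, "rows": rows}, timeout=30)
            resp.raise_for_status()
            return resp.json()

        # hypothetical usage against a local instance:
        # hits = search("http://localhost:8080", "organism:human AND keyword:kinase")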

  2. The Generalizability of Private Sector Research on Software Project Management in Two USAF Organizations: An Exploratory Study

    DTIC Science & Technology

    2003-03-01

    private sector. Researchers have also identified software acquisitions as one of the major differences between the private sector and public sector MIS. This indicates that the elements for a successful software project in the public sector may be different from those in the private sector. Private sector project success depends on many elements. Three of them are user interaction with the project’s development, critical success factors, and how the project manager prioritizes the traditional success criteria.

  3. Selecting Really Excellent Software for Young Adults.

    ERIC Educational Resources Information Center

    Polly, Jean Armour

    1985-01-01

    This article discusses criteria of a good computer software package to aid the public librarian in the building, weeding, and maintenance of a software collection for young adults. Highlights include manuals or documentation; bells, whistles, and color; and the true test of time. (EJS)

  4. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  5. Successful aging in Spanish older adults: the role of psychosocial resources.

    PubMed

    Dumitrache, Cristina G; Rubio, Laura; Cordón-Pozo, Eulogio

    2018-05-25

    Background: Psychological and social resources such as extraversion, optimism, social support, or social networks contribute to adaptation and to successful aging. Building on assumptions derived from successful aging and from the developmental adaptation models, this study aims to analyze the joint impact of different psychosocial resources, such as personality, social relations, health, and socio-demographic characteristics, on life satisfaction in a group of people aged 65 years and older from Spain. A cross-sectional survey using non-proportional quota sampling was carried out. The sample comprised 406 community-dwelling older adults (M = 74.88, SD = 6.75). In order to collect the data, face-to-face interviews were individually conducted. A structural equation model (SEM) was estimated using the PLS software. The results of the SEM model showed that, within this sample, psychosocial variables explain 47.4% of the variance in life satisfaction. Social relations and personality, specifically optimism, were strongly related to life satisfaction, while health status and socio-demographic characteristics were modestly associated with life satisfaction. Findings support the view that psychosocial resources are important for successful aging and therefore should be included in successful aging models. Furthermore, interventions aimed at fostering successful aging should take into account the role of psychosocial variables.

  6. TASS - The Amateur Sky Survey

    NASA Astrophysics Data System (ADS)

    Droege, T. F.; Albertson, C.; Gombert, G.; Gutzwiller, M.; Molhant, N. W.; Johnson, H.; Skvarc, J.; Wickersham, R. J.; Richmond, M. W.; Rybski, P.; Henden, A.; Beser, N.; Pittinger, M.; Kluga, B.

    1997-05-01

    As a non-astronomer watching Shoemaker/Levy 9 crash into Jupiter through postings on sci.astro, it occurred to me that it might be fun to build a comet finding machine. After wild speculations on how such a device might be built - I considered a 26" x 40" fresnel lens and a string of pin diodes -- postings to sci.astro brought me down to earth. I quickly made contact with both professionals and amateurs and found that there was interesting science to be done with an all sky survey. After several prototype drift scan cameras were built using various CCDs, I determined the real problem was software. How does one get the software written for an all sky survey? Willie Sutton could tell you, "Go where the programmers are." Our strategy has been to build a bunch of drift scan cameras and just give them away (without software) to programmers found on the Internet. This author reports more success by this technique than when he had a business and hired and paid programmers at a cost of a million or so a year. To date, 22 drift scan cameras have been constructed. Most of these are operated as triplets spaced 15 degrees apart in Right Ascension and with I, V, I filters. The cameras use 135mm fl, f.2.8 camera lenses for a plate scale of 14 arc seconds per pixel and reach magnitude 13. With 512 pixels across the drift scan direction and running through the night, a triplet will collect 200 Mb of data on three overlapping areas of 3 x 120 degrees each. To date four of the triplets and one single have taken data. Production has started on 25 second generation cameras using 2k x 2k devices and a barn door mount.

  7. NASA's Space Launch System: Systems Engineering Approach for Affordability and Mission Success

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    NASA is working toward the first launch of a new, unmatched capability for deep space exploration, with launch readiness planned for 2018. The initial Block 1 configuration of the Space Launch System will more than double the mass and volume to Low Earth Orbit (LEO) of any launch vehicle currently in operation - with a path to evolve to the greatest capability ever developed. The program formally began in 2011. The vehicle successfully passed Preliminary Design Review (PDR) in 2013, Key Decision Point C (KDPC) in 2014 and Critical Design Review (CDR) in October 2015 - nearly 40 years since the last CDR of a NASA human-rated rocket. Every major SLS element has completed components of test and flight hardware. Flight software has completed several development cycles. RS-25 hotfire testing at NASA Stennis Space Center (SSC) has successfully demonstrated the space shuttle-heritage engine can perform to SLS requirements and environments. The five-segment solid rocket booster design has successfully completed two full-size motor firing tests in Utah. Stage and component test facilities at Stennis and NASA Marshall Space Flight Center are nearing completion. Launch and test facilities, as well as transportation and other ground support equipment are largely complete at NASA's Kennedy, Stennis and Marshall field centers. Work is also underway on the more powerful Block 1B variant with successful completion of the Exploration Upper Stage (EUS) PDR in January 2017. NASA's approach is to develop this heavy lift launch vehicle with limited resources by building on existing subsystem designs and existing hardware where available. The systems engineering and integration (SE&I) of existing and new designs introduces unique challenges and opportunities. The SLS approach was designed with three objectives in mind: 1) Design the vehicle around the capability of existing systems; 2) Reduce work hours for non-hardware/software activities; 3) Increase the probability of mission success by focusing effort on more critical activities.

  8. SRA Real Math Building Blocks PreK. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2007

    2007-01-01

    "SRA Real Math Building Blocks PreK" (also referred to as "Building Blocks for Math") is a supplemental mathematics curriculum designed to develop preschool children's early mathematical knowledge through various individual and small- and large-group activities. It uses "Building Blocks for Math PreK" software,…

  9. Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition

    DTIC Science & Technology

    2013-06-01

    Defines a model view definition for exchanging building information models (BIM) at the coordinated design stage of building construction, building on a standard for exchanging Building Information Modeling (BIM) data which defines hundreds of classes for common use in software. Keywords: specifications, Construction Operations Building information exchange (COBie), Building Information Modeling (BIM)

  10. Rapid Deployment of Optimal Control for Building HVAC Systems Using Innovative Software Tools and a Hybrid Heuristic/Model Based Control Approach

    DTIC Science & Technology

    2017-03-21

    ESTCP project EW-201409 aimed at demonstrating the benefits of innovative software technology for building HVAC systems. These benefits included reduced system energy use and cost as well as improved comfort. This document has been cleared for public release; Distribution Statement A (distribution is unlimited).

  11. Defense Facility Condition: Revised Guidance Needed to Improve Oversight of Assessments and Ratings

    DTIC Science & Technology

    2016-06-01

    are to implement the standardized process in part by assessing the condition of buildings, pavement, and rail using the same set of software tools...facility to current standards; costs for labor, equipment, materials, and currency exchange rates overseas; costs for project planning and design...example, the services are to assess the condition of buildings, pavement, and rail using Sustainment Management System software tools developed by the

  12. Teaching and Assessment of Mathematical Principles for Software Correctness Using a Reasoning Concept Inventory

    ERIC Educational Resources Information Center

    Drachova-Strang, Svetlana V.

    2013-01-01

    As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for…

  13. Automatic Generation of Just-in-Time Online Assessments from Software Design Models

    ERIC Educational Resources Information Center

    Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.

    2009-01-01

    Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…

  14. Accuracy Assessment of a Complex Building 3d Model Reconstructed from Images Acquired with a Low-Cost Uas

    NASA Astrophysics Data System (ADS)

    Oniga, E.; Chirilă, C.; Stătescu, F.

    2017-02-01

    Nowadays, Unmanned Aerial Systems (UASs) are a widely used technique for image acquisition aimed at creating building 3D models, providing a high number of images at very high resolution, or video sequences, in a very short time. Since low-cost UASs are preferred, the accuracy of a building 3D model created using these platforms must be evaluated. For this purpose, the dean's office building of the Faculty of "Hydrotechnical Engineering, Geodesy and Environmental Engineering" of Iasi, Romania, was chosen: a complex-shaped building whose roof is formed of two hyperbolic paraboloids. Seven points were placed on the ground around the building, three of them used as GCPs and the remaining four as check points (CPs) for accuracy assessment. Additionally, the coordinates of 10 natural CPs representing the building's characteristic points were measured with a Leica TCR 405 total station. The building 3D model was created as a point cloud automatically generated from the digital images acquired with the low-cost UAS, using the image matching algorithm and different software packages: 3DF Zephyr, Visual SfM, PhotoModeler Scanner, and Drone2Map for ArcGIS. Except for the PhotoModeler Scanner software, the interior and exterior orientation parameters were determined simultaneously by solving a self-calibrating bundle adjustment. Based on the UAS point clouds automatically generated by the above-mentioned software and on GNSS data, the parameters of the east-side hyperbolic paraboloid were calculated using the least-squares method and statistical blunder detection. Then, in order to assess the accuracy of the building 3D model, several comparisons were made for the facades and the roof against reference data considered to have minimal errors: a TLS mesh for the facades and a GNSS mesh for the roof. Finally, the front facade of the building was created in 3D based on its characteristic points using the PhotoModeler Scanner software, resulting in a CAD (Computer Aided Design) model. The results showed the high potential of using low-cost UASs for building 3D model creation; moreover, if the building 3D model is created based on the building's characteristic points, the accuracy is significantly improved.

  15. Software For Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    SPLICER computer program is a genetic-algorithm software tool used to solve search and optimization problems. It provides the underlying framework and structure for building a genetic-algorithm application program. Written in Think C.
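
    To make the framework-versus-application split concrete, here is a minimal generational genetic algorithm in Python. This is an illustrative sketch only: SPLICER itself is written in Think C, and none of its actual interfaces are reproduced here.

      import random

      def genetic_algorithm(fitness, n_bits=20, pop_size=50, generations=100,
                            crossover_rate=0.9, mutation_rate=0.01):
          # Minimal generational GA over bit strings: tournament selection,
          # one-point crossover, bit-flip mutation.
          pop = [[random.randint(0, 1) for _ in range(n_bits)]
                 for _ in range(pop_size)]
          best = max(pop, key=fitness)
          for _ in range(generations):
              def tournament():
                  a, b = random.sample(pop, 2)
                  return a if fitness(a) >= fitness(b) else b
              children = []
              while len(children) < pop_size:
                  p1, p2 = tournament()[:], tournament()[:]
                  if random.random() < crossover_rate:
                      cut = random.randint(1, n_bits - 1)
                      p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
                  for child in (p1, p2):
                      children.append([bit ^ 1 if random.random() < mutation_rate
                                       else bit for bit in child])
              pop = children[:pop_size]
              best = max(pop + [best], key=fitness)
          return best

      # Example: maximize the number of 1 bits ("one-max")
      print(genetic_algorithm(fitness=sum))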

  16. Automatic building information model query generation

    DOE PAGES

    Jiang, Yufei; Yu, Nan; Ming, Jiang; ...

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, as it can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. Finally, through a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.
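
    The abstract does not show QueryGenerator's input or output formats, but the core idea, compiling a declarative view definition into executable query code, can be sketched hypothetically. The MVD dictionary layout, entity names and attributes below are illustrative assumptions:

      # Hypothetical sketch: compile a declarative "model view definition"
      # (which IFC entity types and attributes a partial model needs) into
      # an executable predicate over model objects.
      def compile_mvd_query(mvd):
          def query(element):
              rules = mvd.get(element.get("type"))
              if rules is None:
                  return False  # entity type not part of this view
              return all(element.get(attr) == want for attr, want in rules.items())
          return query

      airflow_view = {"IfcSpace": {"Storey": "Level 2"},
                      "IfcDoor": {}, "IfcWindow": {}}
      wanted = compile_mvd_query(airflow_view)
      model = [{"type": "IfcSpace", "Storey": "Level 2", "Name": "Office 201"},
               {"type": "IfcBeam", "Name": "B-17"}]
      print([e["Name"] for e in model if wanted(e)])  # -> ['Office 201']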

  17. Automatic building information model query generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yufei; Yu, Nan; Ming, Jiang

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, as it can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. Finally, through a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.

  18. Custom software development for use in a clinical laboratory

    PubMed Central

    Sinard, John H.; Gershkovich, Peter

    2012-01-01

    In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care. PMID:23372985

  19. Custom software development for use in a clinical laboratory.

    PubMed

    Sinard, John H; Gershkovich, Peter

    2012-01-01

    In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care.

  20. Evan Weaver | NREL

    Science.gov Websites

    Evan Weaver, Researcher III-Software Engineering. At NREL, he works as a software engineer developing whole-building energy modeling tools. Prior to joining NREL, he worked in the biomedical industry as a software engineer, specializing in graphical user interfaces.

  1. S-Cube: Enabling the Next Generation of Software Services

    NASA Astrophysics Data System (ADS)

    Metzger, Andreas; Pohl, Klaus

    The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories as well as methods and tools for future service-based systems.

  2. Descriptions of Free and Freeware Software in the Mathematics Teaching

    NASA Astrophysics Data System (ADS)

    Antunes de Macedo, Josue; Neves de Almeida, Samara; Voelzke, Marcos Rincon

    2016-05-01

    This paper presents the analysis and cataloging of free and freeware mathematical software available on the internet, a brief explanation of each, and the types of licenses for use in teaching and learning. The methodology is based on qualitative research. Among the different types of software found, Winmat stands out in algebra; it works with linear algebra, matrices and linear systems. In geometry there is GeoGebra, which can be used in the study of functions, plane and spatial geometry, algebra and calculus. For graphing, one can cite Graph and Graphequation. With the Graphmatica software it is possible to build various graphs of mathematical equations on the same screen, representing cartesian equations, inequalities and parametric functions, among others. Winplot allows the user to build graphs of functions and mathematical equations in two and three dimensions. Thus, this work aims to present to teachers some free mathematics software that can be used in the classroom.

  3. CHARIS (Contribution to High Asia Runoff from Ice and Snow) Lessons Learned in Capacity-Building for Hydrological Sciences with Asian Partner Communities

    NASA Astrophysics Data System (ADS)

    Brodzik, M. J.; Armstrong, R. L.; Armstrong, B. R.; Barrett, A. P.; Fetterer, F. M.; Hill, A. F.; Hughes, H.; Khalsa, S. J. S.; Racoviteanu, A.; Raup, B. H.; Rittger, K.; Williams, M. W.; Wilson, A. M.

    2016-12-01

    Funded by USAID and based at the University of Colorado, the Contribution to High Asia Runoff from Ice & Snow (CHARIS) project has among its objectives both scientific and capacity-building goals. We are systematically assessing the role of glaciers and seasonal snow in the freshwater resources of High Asia to better forecast future availability and vulnerability of water resources in the region. We are collaborating with Asian partner institutions in eight nations across High Asia (Bhutan, Nepal, India, Pakistan, Afghanistan, Kazakhstan, Kyrgyzstan and Tajikistan). Our capacity-building activities include data-sharing, training, supporting field work and education and infrastructure development, which includes creating the only water-chemistry laboratory of its kind in Bhutan. We have also derived reciprocal benefits from our partners, learning from their specialized local knowledge and obtaining access to otherwise unavailable in situ data. Our presentation will share lessons learned in our annual training workshops with our Asian collaborators, at which we have interspersed remote sensing and hydrological modelling lectures with GIS and python programming, and hands-on applications using remote sensing data. Our challenges have included technological issues such as: power incompatibilities, reliable shipping methods to remote locations, bandwidth limitations to transferring large remote sensing data sets, cost of proprietary software, choosing among free software alternatives, and negotiating the formats and jargon of remote sensing data to get to the science as quickly as possible. We will describe successes and failures in training methods we have used, what we look for in training venue facilities, and how our approach has changed in response to student evaluations and partner feedback.

  4. Sivasathya Pradha Balamurugan | NREL

    Science.gov Websites

    Sivasathya Pradha Balamurugan, Researcher II-Software Engineering (SivasathyaPradha.Balamurugan@nrel.gov | 303-275-3883), joined NREL in 2017. Her research is focused on developing and supporting software for energy management in buildings. Her background is in software development, applied cryptography, and hardware.

  5. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  6. Development and Analysis of a Bi-Directional Tidal Turbine

    DTIC Science & Technology

    2012-03-01

    In the present study, the commercial CFD software ANSYS CFX was utilized to build a turbine map for a bi-directional turbine created for this purpose. The basic turbine map was developed for a 25-blade bi-axial turbine. The simulation definition was created using ANSYS CFX-Pre.

  7. Ben Polly | NREL

    Science.gov Websites

    Ben Polly joined NREL in 2009. He specializes in the development, validation, and application of energy analysis approaches for individual buildings and collections of buildings, and has developed and applied software for building energy analysis.

  8. Building high-quality assay libraries for targeted analysis of SWATH MS data.

    PubMed

    Schubert, Olga T; Gillet, Ludovic C; Collins, Ben C; Navarro, Pedro; Rosenberger, George; Wolski, Witold E; Lam, Henry; Amodei, Dario; Mallick, Parag; MacLean, Brendan; Aebersold, Ruedi

    2015-03-01

    Targeted proteomics by selected/multiple reaction monitoring (S/MRM) or, on a larger scale, by SWATH (sequential window acquisition of all theoretical spectra) MS (mass spectrometry) typically relies on spectral reference libraries for peptide identification. Quality and coverage of these libraries are therefore of crucial importance for the performance of the methods. Here we present a detailed protocol that has been successfully used to build high-quality, extensive reference libraries supporting targeted proteomics by SWATH MS. We describe each step of the process, including data acquisition by discovery proteomics, assertion of peptide-spectrum matches (PSMs), generation of consensus spectra and compilation of MS coordinates that uniquely define each targeted peptide. Crucial steps such as false discovery rate (FDR) control, retention time normalization and handling of post-translationally modified peptides are detailed. Finally, we show how to use the library to extract SWATH data with the open-source software Skyline. The protocol takes 2-3 d to complete, depending on the extent of the library and the computational resources available.
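
    As a small illustration of the retention-time normalization step: one common approach, assumed here for illustration (the protocol itself is the authoritative procedure), is a linear fit of the measured retention times of spiked-in reference peptides against their library-normalized values:

      import numpy as np

      def rt_normalizer(measured_rt, reference_irt):
          # Fit a linear mapping from measured retention times of reference
          # peptides to their normalized (iRT-style) values; return a
          # converter for any measured retention time.
          slope, intercept = np.polyfit(measured_rt, reference_irt, deg=1)
          return lambda rt: slope * np.asarray(rt) + intercept

      # Hypothetical reference peptides: measured minutes -> normalized units
      to_irt = rt_normalizer(measured_rt=[12.1, 25.4, 38.0, 51.2],
                             reference_irt=[0.0, 25.0, 50.0, 75.0])
      print(to_irt(30.0))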

  9. Simulation of three-phase induction motor drives using indirect field oriented control in PSIM environment

    NASA Astrophysics Data System (ADS)

    Aziri, Hasif; Patakor, Fizatul Aini; Sulaiman, Marizan; Salleh, Zulhisyam

    2017-09-01

    This paper presents the simulation of three-phase induction motor drives using Indirect Field Oriented Control (IFOC) in the PSIM environment. The asynchronous machine is well known for the natural limitations arising from the high nonlinearity and complexity of its motor model. In order to address these problems, IFOC is applied to control instantaneous electrical quantities such as the torque and flux components. As FOC controls the stator current, represented as a vector, the flux component is aligned with the d coordinate while the torque component is aligned with the q coordinate. Five levels of the incremental system are gradually built up to verify and test the software modules in the system. All system build levels were verified and successfully tested in the PSIM environment. The corresponding five build levels were simulated in PSIM, which is user-friendly for simulation studies, in order to explore the performance of the speed responses of the IFOC algorithm for three-phase induction motor drives.
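
    The coordinate alignment at the heart of field-oriented control is easy to show in code. Here is a minimal sketch of the Clarke and Park transforms that rotate measured phase currents into the d-q frame; it illustrates the textbook transforms only and is not taken from the paper's PSIM schematic:

      import math

      def clarke(ia, ib, ic):
          # Amplitude-invariant Clarke transform: phase currents -> alpha/beta.
          alpha = (2.0 * ia - ib - ic) / 3.0
          beta = (ib - ic) / math.sqrt(3.0)
          return alpha, beta

      def park(alpha, beta, theta):
          # Park transform: stationary alpha/beta -> d-q frame rotating at the
          # rotor flux angle theta.
          d = alpha * math.cos(theta) + beta * math.sin(theta)
          q = -alpha * math.sin(theta) + beta * math.cos(theta)
          return d, q

      # With theta tracking the rotor flux angle, i_d commands flux and
      # i_q commands torque.
      print(park(*clarke(1.0, -0.5, -0.5), theta=0.0))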

  10. Advanced information processing system: Local system services

    NASA Technical Reports Server (NTRS)

    Burkhardt, Laura; Alger, Linda; Whittredge, Roy; Stasiowski, Peter

    1989-01-01

    The Advanced Information Processing System (AIPS) is a multi-computer architecture composed of hardware and software building blocks that can be configured to meet a broad range of application requirements. The hardware building blocks are fault-tolerant, general-purpose computers, fault- and damage-tolerant networks (both computer and input/output), and interfaces between the networks and the computers. The software building blocks are the major software functions: local system services, input/output, system services, inter-computer system services, and the system manager. The foundation of the local system services is an operating system with the functions required for a traditional real-time multi-tasking computer, such as task scheduling, inter-task communication, memory management, interrupt handling, and time maintenance. Resting on this foundation are the redundancy management functions necessary in a redundant computer and the status reporting functions required for an operator interface. The functional requirements, functional design and detailed specifications for all the local system services are documented.

  11. Toward a Progress Indicator for Machine Learning Model Building and Data Mining Algorithm Execution: A Position Paper.

    PubMed

    Luo, Gang

    2017-12-01

    For user-friendliness, many software systems offer progress indicators for long-duration tasks. A typical progress indicator continuously estimates the remaining task execution time as well as the portion of the task that has been finished. Building a machine learning model often takes a long time, but no existing machine learning software supplies a non-trivial progress indicator. Similarly, running a data mining algorithm often takes a long time, but no existing data mining software provides a nontrivial progress indicator. In this article, we consider the problem of offering progress indicators for machine learning model building and data mining algorithm execution. We discuss the goals and challenges intrinsic to this problem. Then we describe an initial framework for implementing such progress indicators and two advanced, potential uses of them, with the goal of inspiring future research on this topic.
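
    The paper positions itself as a framework proposal rather than an implementation, but the simplest instance of the idea can be sketched directly. The linear-extrapolation estimator below is an assumption chosen for brevity; the proposed framework anticipates far more sophisticated estimators:

      import time

      class ProgressIndicator:
          # Re-estimate remaining run time by extrapolating the observed
          # rate of progress: the simplest estimator one could plug into
          # the framework the paper proposes.
          def __init__(self):
              self.start = time.monotonic()

          def report(self, fraction_done):
              if fraction_done <= 0.0:
                  return None  # no basis for an estimate yet
              elapsed = time.monotonic() - self.start
              remaining = elapsed * (1.0 - fraction_done) / fraction_done
              print(f"{fraction_done:6.1%} done, ~{remaining:.1f}s remaining")
              return remaining

      # e.g., inside a model-building loop over n_epochs:
      #   indicator = ProgressIndicator()
      #   for epoch in range(n_epochs):
      #       train_one_epoch()                       # hypothetical helper
      #       indicator.report((epoch + 1) / n_epochs)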

  12. Toward a Progress Indicator for Machine Learning Model Building and Data Mining Algorithm Execution: A Position Paper

    PubMed Central

    Luo, Gang

    2017-01-01

    For user-friendliness, many software systems offer progress indicators for long-duration tasks. A typical progress indicator continuously estimates the remaining task execution time as well as the portion of the task that has been finished. Building a machine learning model often takes a long time, but no existing machine learning software supplies a non-trivial progress indicator. Similarly, running a data mining algorithm often takes a long time, but no existing data mining software provides a nontrivial progress indicator. In this article, we consider the problem of offering progress indicators for machine learning model building and data mining algorithm execution. We discuss the goals and challenges intrinsic to this problem. Then we describe an initial framework for implementing such progress indicators and two advanced, potential uses of them, with the goal of inspiring future research on this topic. PMID:29177022

  13. Stepwise approach to establishing multiple outreach laboratory information system-electronic medical record interfaces.

    PubMed

    Pantanowitz, Liron; Labranche, Wayne; Lareau, William

    2010-05-26

    Clinical laboratory outreach business is changing as more physician practices adopt an electronic medical record (EMR). Physician connectivity with the laboratory information system (LIS) is consequently becoming more important. However, there are no reports available to assist the informatician with establishing and maintaining outreach LIS-EMR connectivity. A four-stage scheme is presented that was successfully employed to establish unidirectional and bidirectional interfaces with multiple physician EMRs. This approach involves planning (step 1), followed by interface building (step 2) with subsequent testing (step 3), and finally ongoing maintenance (step 4). The role of organized project management, software as a service (SaaS), and alternate solutions for outreach connectivity is discussed.

  14. Stepwise approach to establishing multiple outreach laboratory information system-electronic medical record interfaces

    PubMed Central

    Pantanowitz, Liron; LaBranche, Wayne; Lareau, William

    2010-01-01

    Clinical laboratory outreach business is changing as more physician practices adopt an electronic medical record (EMR). Physician connectivity with the laboratory information system (LIS) is consequently becoming more important. However, there are no reports available to assist the informatician with establishing and maintaining outreach LIS–EMR connectivity. A four-stage scheme is presented that was successfully employed to establish unidirectional and bidirectional interfaces with multiple physician EMRs. This approach involves planning (step 1), followed by interface building (step 2) with subsequent testing (step 3), and finally ongoing maintenance (step 4). The role of organized project management, software as a service (SaaS), and alternate solutions for outreach connectivity is discussed. PMID:20805958

  15. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost effective means to assure safe software within a safe system.

  16. Eric Bonnema | NREL

    Science.gov Websites

    Eric Bonnema contributes to research efforts for commercial buildings. His work includes commercial-sector whole-building energy simulation, scientific computing, and software configuration.

  17. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
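
    Of the three plot types SEE IT automates, the carpet plot is the least standard. A minimal sketch of one, using matplotlib and synthetic stand-in data (the tool's own code is not shown in the abstract), could look like this:

      import numpy as np
      import matplotlib.pyplot as plt

      def carpet_plot(hourly_values, title):
          # Carpet plot: days on the x axis, hour of day on the y axis,
          # colour encoding the measured or simulated value.
          days = len(hourly_values) // 24
          grid = np.reshape(hourly_values[:days * 24], (days, 24)).T
          plt.imshow(grid, aspect="auto", origin="lower")
          plt.xlabel("Day"); plt.ylabel("Hour of day")
          plt.title(title); plt.colorbar(label="kWh")
          plt.show()

      # Synthetic stand-in for a measured electricity series (one year, hourly)
      hours = np.arange(8760)
      demo = 50 + 30 * np.sin(2 * np.pi * (hours % 24) / 24)
      carpet_plot(demo, "Electricity use (synthetic demo)")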

  18. Campus-Based Practices for Promoting Student Success: Software Solutions. Research Brief

    ERIC Educational Resources Information Center

    Horn, Aaron S.; Reinert, Leah; Reis, Michael

    2015-01-01

    Colleges and universities are increasingly adopting various software solutions to raise degree completion rates and lower costs (Ferguson, 2012; Vendituoli, 2014; Yanosky, 2014). Student success software, also known as Integrated Planning and Advising Services (IPAS), appears to be in high demand among both students and faculty (Dahlstrom &…

  19. Computers and Young Children. Storyboard Software: Flannel Boards in the Computer Age.

    ERIC Educational Resources Information Center

    Shade, Daniel D.

    1995-01-01

    Describes storyboard software as computer programs with which children can build a story using visuals. Notes the importance of such programs for preliterate or nonreading children. Describes a new storyboard program, "Wiggins in Storyland," and its features. Lists recommended storyboard software programs, with publishers and compatible…

  20. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  1. Methods and Software for Building Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Daehn, Ralph M.

    1985-01-01

    This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…

  2. Moving Secure Software Assurance into Higher Education: A Roadmap for Change

    DTIC Science & Technology

    2011-06-02

    Summarized, the issue is this: software defects are currently a fact of life, and they are avenues of security vulnerabilities that cyber criminals, terrorists, or hostile nations can exploit. The entire industry needs to change the way we build systems and decrease the number of defects.

  3. Building Software Development Capacity to Advance the State of Educational Technology

    ERIC Educational Resources Information Center

    Luterbach, Kenneth J.

    2013-01-01

    Educational technologists may advance the state of the field by increasing capacity to develop software tools and instructional applications. Presently, few academic programs in educational technology require even a single computer programming course. Further, the educational technologists who develop software generally work independently or in…

  4. Eric Wilson | NREL

    Science.gov Websites

    Eric Wilson's work includes developing an analysis framework and data visualization for national residential building stock models, and developing multifamily modeling capabilities for the BEopt building energy optimization software.

  5. Hardware and software improvements to a low-cost horizontal parallax holographic video monitor.

    PubMed

    Henrie, Andrew; Codling, Jesse R; Gneiting, Scott; Christensen, Justin B; Awerkamp, Parker; Burdette, Mark J; Smalley, Daniel E

    2018-01-01

    Displays capable of true holographic video have been prohibitively expensive and difficult to build. With this paper, we present a suite of modularized hardware components and software tools needed to build a HoloMonitor with basic "hacker-space" equipment, highlighting improvements that have enabled the total materials cost to fall to $820, well below that of other holographic displays. It is our hope that the current level of simplicity, development, design flexibility, and documentation will enable the lay engineer, programmer, and scientist to relatively easily replicate, modify, and build upon our designs, bringing true holographic video to the masses.

  6. Leveraging the Unified Access Framework: A Tale of an Integrated Ocean Data Prototype

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Kern, K.; Smith, B.; Schweitzer, R.; Simons, R.; Mendelssohn, R.; Diggs, S. C.; Belbeoch, M.; Hankin, S.

    2014-12-01

    The Tropical Pacific Observing System (TPOS) has been functioning and capturing measurements since the mid 1990s during the very successful Tropical Ocean Global Atmosphere (TOGA) project. Unfortunately, in the current environment, some 20 years after the end of the TOGA project, sustaining the observing system is proving difficult. With the many advances in methods of observing the ocean, a group of scientists is taking a fresh look at what the Tropical Pacific Observing System requires for sustainability. This includes utilizing a wide variety of observing system platforms, including Argo floats, unmanned drifters, moorings, ships, etc. This variety of platforms measuring ocean data also provides a significant challenge in terms of integrated data management. It is recognized that data and information management is crucial to the success and impact of any observing system. In order to be successful, it is also crucial to avoid building stovepipes for data management. To that end, NOAA's Observing System Monitoring Center (OSMC) has been tasked to create a testbed of integrated real time and delayed mode observations for the Tropical Pacific region in support of the TPOS. The observing networks included in the prototype are: Argo floats, OceanSites moorings, drifting buoys, hydrographic surveys, underway carbon observations and, of course, real time ocean measurements. In this presentation, we will discuss how the OSMC project is building the integrated data prototype using existing free and open source software. We will explore how we are leveraging successful data management frameworks pioneered by efforts such as NOAA's Unified Access Framework project. We will also show examples of how conforming to well known conventions and standards allows for discoverability, usability and interoperability of data.

  7. Ry Horsey | NREL

    Science.gov Websites

    Ry Horsey, Software Developer - Commercial Buildings Energy Modeling, works in the field of commercial building energy modeling. He is particularly interested in tools to support large-scale commercial building energy modeling.

  8. Software-engineering challenges of building and deploying reusable problem solvers.

    PubMed

    O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A

    2009-11-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
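
    The task-method decomposition the authors highlight can be illustrated compactly. In the hypothetical Python sketch below, a task names what must be done and a method encodes one reusable way to do it, possibly by posting subtasks; the registry, task names and toy domain knowledge are all invented for illustration and are not any PSM library's API:

      REGISTRY = {}

      def method(task_name):
          # Register a function as one reusable method for a named task.
          def register(fn):
              REGISTRY.setdefault(task_name, []).append(fn)
              return fn
          return register

      def solve(task_name, **knowledge):
          # Try the registered methods for a task until one yields a result.
          for m in REGISTRY.get(task_name, []):
              result = m(solve, **knowledge)  # methods may solve subtasks
              if result is not None:
                  return result
          raise LookupError(f"no applicable method for task {task_name!r}")

      @method("diagnose")
      def classify_then_rank(solve, **k):
          return sorted(solve("classify", **k))[:1]

      @method("classify")
      def rule_based_classifier(solve, **k):
          return list(k["symptoms"])  # placeholder for real domain knowledge

      print(solve("diagnose", symptoms=["fever", "cough"]))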

  9. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace VM images and as open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  10. Software-engineering challenges of building and deploying reusable problem solvers

    PubMed Central

    O’CONNOR, MARTIN J.; NYULAS, CSONGOR; TU, SAMSON; BUCKERIDGE, DAVID L.; OKHMATOVSKAIA, ANNA; MUSEN, MARK A.

    2012-01-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task–method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach. PMID:23565031

  11. Matriarch: A Python Library for Materials Architecture.

    PubMed

    Giesa, Tristan; Jagadeesan, Ravi; Spivak, David I; Buehler, Markus J

    2015-10-12

    Biological materials, such as proteins, often have a hierarchical structure ranging from basic building blocks at the nanoscale (e.g., amino acids) to assembled structures at the macroscale (e.g., fibers). Current software for materials engineering allows the user to specify polypeptide chains and simple secondary structures prior to molecular dynamics simulation, but is not flexible in terms of the geometric arrangement of unequilibrated structures. Given some knowledge of a larger-scale structure, instructing the software to create it can be very difficult and time-intensive. To this end, the present paper reports a mathematical language, using category theory, to describe the architecture of a material, i.e., its set of building blocks and instructions for combining them. While this framework applies to any hierarchical material, here we concentrate on proteins. We implement this mathematical language as an open-source Python library called Matriarch. It is a domain-specific language that gives the user the ability to create almost arbitrary structures with arbitrary amino acid sequences and, from them, generate Protein Data Bank (PDB) files. In this way, Matriarch is more powerful than commercial software now available. Matriarch can be used in tandem with molecular dynamics simulations and helps engineers design and modify biologically inspired materials based on their desired functionality. As a case study, we use our software to alter both building blocks and building instructions for tropocollagen, and determine their effect on its structure and mechanical properties.
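
    The "building blocks plus combination instructions" idea can be sketched generically. The functions below are hypothetical illustrations of the concept, deliberately omit all geometry, and are not Matriarch's actual API:

      def chain(seq):
          # A building block: one peptide chain given by its sequence.
          return [seq]

      def attach(block_a, block_b):
          # Combination instruction: join two blocks end-to-end, chain by chain.
          return [a + b for a, b in zip(block_a, block_b)]

      def bundle(*blocks):
          # Combination instruction: place blocks side by side as separate
          # chains (e.g., a triple helix).
          return [c for block in blocks for c in block]

      gxy = chain("GPO" * 5)                      # collagen-like Gly-X-Y repeat
      tropocollagen_like = bundle(gxy, gxy, gxy)  # three chains side by side
      print(tropocollagen_like)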

  12. Gate-to-gate Life-Cycle Inventory of Hardboard Production in North America

    Treesearch

    Richard Bergman

    2014-01-01

    Whole-building life-cycle assessments (LCAs) populated by life-cycle inventory (LCI) data are incorporated into environmental footprint software tools used by building professionals and codes for establishing green building certification. However, LCI data on some wood building products are still needed to help fill gaps in the data and thus provide a more complete picture...

  13. The US Army Corps of Engineers Roadmap for Life-Cycle Building Information Modeling (BIM)

    DTIC Science & Technology

    2012-11-01

    Building Information Modeling (BIM) technology has rapidly gained acceptance throughout the planning, architecture, engineering, and construction communities, and the Industry Foundation Class (IFC) definitions are used to create vendor-neutral data exchanges for use in BIM software tools.

  14. Validation of Tendril TrueHome Using Software-to-Software Comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maguire, Jeffrey B; Horowitz, Scott G; Moore, Nathan

    This study performed comparative evaluation of EnergyPlus version 8.6 and Tendril TrueHome, two physics-based home energy simulation models, to identify differences in energy consumption predictions between the two programs and resolve discrepancies between them. EnergyPlus is considered a benchmark, best-in-class software tool for building energy simulation. This exercise sought to improve both software tools through additional evaluation/scrutiny.
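
    The abstract does not name the comparison statistics used. In software-to-software comparison of building energy models, normalized mean bias error (NMBE) and the coefficient of variation of RMSE (CV-RMSE) are customary choices, sketched here under that assumption:

      import numpy as np

      def nmbe(reference, candidate):
          # Normalized mean bias error (%): overall over/under-prediction.
          r, c = np.asarray(reference, float), np.asarray(candidate, float)
          return 100.0 * (c - r).sum() / (len(r) * r.mean())

      def cv_rmse(reference, candidate):
          # Coefficient of variation of RMSE (%): penalizes scatter.
          r, c = np.asarray(reference, float), np.asarray(candidate, float)
          return 100.0 * np.sqrt(((c - r) ** 2).mean()) / r.mean()

      # e.g., hourly heating energy predicted by the two engines
      eplus = [12.0, 14.5, 13.2, 15.8]
      truehome = [11.6, 14.9, 13.0, 16.4]
      print(nmbe(eplus, truehome), cv_rmse(eplus, truehome))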

  15. Toward Reusable Graphics Components in Ada

    DTIC Science & Technology

    1993-03-01

    Alternatives for obtaining well-engineered reusable software components were examined and analyzed. Chapter 4 describes detailed design and implementation strategies for building a well-engineered reusable set of components in Ada.

  16. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software, and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
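
    PD5 itself is a C++ library whose class interfaces are not shown in the abstract. As a generic taste of the kind of building block such a library provides, here are two standard primer checks, GC content and the Wallace-rule melting temperature, in Python:

      def gc_content(primer):
          # Fraction of G and C bases in the oligo.
          p = primer.upper()
          return (p.count("G") + p.count("C")) / len(p)

      def wallace_tm(primer):
          # Wallace-rule melting temperature for short oligos:
          # Tm = 2(A+T) + 4(G+C) degrees Celsius.
          p = primer.upper()
          return (2 * (p.count("A") + p.count("T"))
                  + 4 * (p.count("G") + p.count("C")))

      candidate = "ATGCGTACGTTAGCCTAGGA"
      print(f"GC = {gc_content(candidate):.0%}, Tm ~ {wallace_tm(candidate)} C")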

  17. An Analysis of Open Source Security Software Products Downloads

    ERIC Educational Resources Information Center

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  18. CIME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foucar, James G.; Salinger, Andrew G.; Deakin, Michael

    CIME is the software infrastructure for configuring, building, running, and testing an Earth system model. It can be developed and tested as stand-alone software, but its main role is to be integrated into the CESM and ACME Earth system models.

  19. Software Reviews.

    ERIC Educational Resources Information Center

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  20. Key ingredients needed when building large data processing systems for scientists

    NASA Technical Reports Server (NTRS)

    Miller, K. C.

    2002-01-01

    Why is building a large science software system so painful? Weren't teams of software engineers supposed to make life easier for scientists? Does it sometimes feel as if it would be easier to write the million lines of code in Fortran 77 yourself? The cause of this dissatisfaction is that many of the needs of the science customer remain hidden in discussions with software engineers until after a system has already been built. In fact, many of the hidden needs of the science customer conflict with stated needs and are therefore very difficult to meet unless they are addressed from the outset in a system's architectural requirements. What's missing is the consideration of a small set of key software properties in initial agreements about the requirements, the design and the cost of the system.

  1. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  2. Software is a Product...Not

    DTIC Science & Technology

    1992-09-01

    We can better understand the process if we consider software as a service, not a product. Let me expand on this statement. I do not believe we must do any of the software-building activities differently. Instead, from the perspective of scheduling, budgeting, and delivering software, we should use the service model.

  3. Usefulness of Cone-Beam Computed Tomography and Automatic Vessel Detection Software in Emergency Transarterial Embolization.

    PubMed

    Carrafiello, Gianpaolo; Ierardi, Anna Maria; Duka, Ejona; Radaelli, Alessandro; Floridi, Chiara; Bacuzzi, Alessandro; de Bucourt, Maximilian; De Marchi, Giuseppe

    2016-04-01

    This study was designed to evaluate the utility of dual phase cone beam computed tomography (DP-CBCT) and automatic vessel detection (AVD) software to guide transarterial embolization (TAE) of angiographically challenging arterial bleedings in emergency settings. Twenty patients with an arterial bleeding at computed tomography angiography and an inconclusive identification of the bleeding vessel at the initial 2D angiographic series were included. The accuracy of DP-CBCT and of the AVD software was defined as the ability to detect the bleeding site and the culprit arterial bleeder, respectively. Technical success was defined as the correct positioning of the microcatheter using AVD software. Clinical success was defined as successful embolization. The total volume of iodinated contrast medium and the overall procedure time were registered. The bleeding site was not detected by the initial angiogram in 20% of cases, while impossibility to identify the bleeding vessel was the reason for inclusion in the remaining cases. The bleeding site was detected by DP-CBCT in 19 of 20 (95%) patients; in one case CBCT-CT fusion was required. The AVD software identified the culprit arterial branch in 18 of 20 (90%) cases. In two cases, vessel tracking required manual marking of the candidate arterial bleeder. Technical success was 95%. Successful embolization was achieved in all patients. The mean contrast volume injected per patient was 77.5 ml, and the mean overall procedural time was 50 min. The use of C-arm CBCT and AVD software during TAE of angiographically challenging arterial bleedings is feasible and may facilitate successful embolization. Staff training in CBCT imaging and software manipulation is necessary.

  4. Home | Simulation Research

    Science.gov Websites

    The Simulation Research Group specializes in the research, development and deployment of software that supports the design and operation of building energy and control systems, including the Spawn of EnergyPlus next-generation simulation engine and tools for OpenBuildingControl to support control design, deployment and verification.

  5. Human factors for capacity building: lessons learned from the OpenMRS implementers network.

    PubMed

    Seebregts, C J; Mamlin, B W; Biondich, P G; Fraser, H S F; Wolfe, B A; Jazayeri, D; Miranda, J; Blaya, J; Sinha, C; Bailey, C T; Kanter, A S

    2010-01-01

    The overall objective of this project was to investigate ways to strengthen the OpenMRS community by (i) developing capacity and implementing a network focusing specifically on the needs of OpenMRS implementers, (ii) strengthening community-driven aspects of OpenMRS and providing a dedicated forum for implementation-specific issues, and; (iii) providing regional support for OpenMRS implementations as well as mentorship and training. The methods used included (i) face-to-face networking using meetings and workshops; (ii) online collaboration tools, peer support and mentorship programmes; (iii) capacity and community development programmes, and; (iv) community outreach programmes. The community-driven approach, combined with a few simple interventions, has been a key factor in the growth and success of the OpenMRS Implementers Network. It has contributed to implementations in at least twenty-three different countries using basic online tools; and provided mentorship and peer support through an annual meeting, workshops and an internship program. The OpenMRS Implementers Network has formed collaborations with several other open source networks and is evolving regional OpenMRS Centres of Excellence to provide localized support for OpenMRS development and implementation. These initiatives are increasing the range of functionality and sustainability of open source software in the health domain, resulting in improved adoption and enterprise-readiness. Social organization and capacity development activities are important in growing a successful community-driven open source software model.

  6. Building Your Own Web Course: The Case for Off-the-Shelf Component Software.

    ERIC Educational Resources Information Center

    Kaplan, Howard

    1998-01-01

    Compares the features, advantages, and disadvantages of two major software options available for designing web courses: (1) component, off-the shelf software that allows for creation of audio slide lectures, course materials, discussion forums, animations, synchronous chat groups, quiz creators, and electronic mail, and (2) integrated packages…

  7. Development of a PC-based ground support system for a small satellite instrument

    NASA Astrophysics Data System (ADS)

    Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.

    1993-11-01

    The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.

  8. Methodology to Assess No Touch Audit Software Using Simulated Building Utility Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheung, Howard; Braun, James E.; Langner, M. Rois

    This report describes a methodology developed for assessing the performance of no touch building audit tools and presents results for an available tool. Building audits are conducted in many commercial buildings to reduce building energy costs and improve building operation. Because the audits typically require significant input obtained by building engineers, they are usually only affordable for larger commercial building owners. In an effort to help small building and business owners gain the benefits of an audit at a lower cost, no touch building audit tools have been developed to remotely analyze a building's energy consumption.

  9. PNNL’s Building Operations Control Center

    ScienceCinema

    Belew, Shan

    2018-01-16

    PNNL's Building Operations Control Center (BOCC) video provides an overview of the center, its capabilities, and its objectives. The BOCC was relocated to PNNL's new 3820 Systems Engineering Building in 2015. Although a key focus of the BOCC is on monitoring and improving the operations of PNNL buildings, the center's state-of-the-art computational, software and visualization resources also have provided a platform for PNNL buildings-related research projects.

  10. SCIL Executive Summaries.

    ERIC Educational Resources Information Center

    Samuels, Alan R.; And Others

    1987-01-01

    These five papers by speakers at the Small Computers in Libraries 1987 conference include: "Acquiring and Using Shareware in Building Small Scale Automated Information systems" (Samuels); "A Software Lending Collection" (Talab); "Providing Subject Access to Microcomputer Software" (Mitchell); "Interfacing Vendor…

  11. Mapping Ad Hoc Communications Network of a Large Number Fixed-Wing UAV Swarm

    DTIC Science & Technology

    2017-03-01

    …partitioned sub-swarms. The work covered in this thesis is to build a model of the NPS swarm's communication network in the ns-3 simulation software and use… (Naval Postgraduate School thesis, Monterey, California.)

  12. Scaling Retro-Commissioning to Small Commercial Buildings: A Turnkey Automated Hardware-Software Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guanjing; Granderson, J.; Brambley, Michael R.

    2015-07-01

    In the United States, small commercial buildings represent 51% of the total floor space of all commercial buildings and consume nearly 3 quadrillion Btu (3.2 quintillion joules) of site energy annually, presenting an enormous opportunity for energy savings. Retro-commissioning (RCx), the process through which professional energy service providers identify and correct operational problems, has proven to be a cost-effective means to achieve median energy savings of 16%. However, retro-commissioning is not typically conducted at scale throughout the commercial stock. Very few small commercial buildings are retro-commissioned because utility expenses are relatively modest, margins are tighter, and capital for improvements is limited. In addition, small buildings do not have in-house staff with the expertise to identify improvement opportunities. In response, a turnkey hardware-software solution was developed to enable cost-effective, monitoring-based RCx of small commercial buildings. This highly tailored solution enables non-commissioning providers to identify energy and comfort problems, as well as associated cost impacts and remedies. It also facilitates scale by offering energy service providers the means to streamline their existing processes and reduce costs by more than half. The turnkey RCx sensor suitcase consists of two primary components: a suitcase of sensors for short-term building data collection that guides users through the process of deploying and retrieving their data, and a software application that automates analysis of the sensor data, identifies problems, and generates recommendations. This paper presents the design and testing of prototype models, including descriptions of the hardware design, analysis algorithms, performance testing, and plans for dissemination.
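
    To make the automated-analysis idea concrete, below is a minimal sketch of one rule such software could apply: flagging air-handler operation outside scheduled occupancy from logged sensor data. The schedule, data, and rule are hypothetical illustrations, not the sensor suitcase's actual algorithms.

      # One hypothetical retro-commissioning rule: flag hours when the air
      # handler runs outside scheduled occupancy. Schedule and logged data
      # are invented for illustration; this is not the product's algorithm.
      OCCUPIED_HOURS = range(7, 18)   # assumed 07:00-18:00 occupancy

      # (hour of day, fan running?) pairs from short-term sensor logging
      log = [(5, True), (6, False), (9, True), (20, True), (22, True)]

      wasted = [hour for hour, fan_on in log
                if fan_on and hour not in OCCUPIED_HOURS]
      if wasted:
          print("Recommendation: HVAC operating outside occupancy at hours",
                wasted)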

  13. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
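
    As a simple illustration of the kind of reuse-level measurement involved, the sketch below computes a size-weighted fraction of reused code. It is an invented Python example, not one of the metrics actually derived in the project, and the component data are hypothetical.

      # Illustrative reuse-level metric: size-weighted fraction of reused
      # code. Hypothetical data; not a metric from the NASA project itself.
      def reuse_level(components):
          """components: list of (lines_of_code, reused_flag) tuples."""
          total = sum(loc for loc, _ in components)
          reused = sum(loc for loc, is_reused in components if is_reused)
          return reused / total if total else 0.0

      system = [(1200, True),   # vendor math library, reused verbatim
                (800, True),    # in-house component reused from elsewhere
                (2500, False)]  # newly written application code

      print(f"Reuse level: {reuse_level(system):.2%}")  # -> 44.44%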

  14. Development of Efficient Authoring Software for e-Learning Contents

    NASA Astrophysics Data System (ADS)

    Kozono, Kazutake; Teramoto, Akemi; Akiyama, Hidenori

    Content creation is an important problem in e-Learning systems. e-Learning contents should include figures and voice media for a high-level educational effect. However, the use of figures and voice considerably complicates the operation of authoring software. A new authoring software package, which can build e-Learning contents efficiently, has been developed to solve this problem. This paper reports the development results of the authoring software.

  15. Safe Software for Space Applications: Building on the DO-178 Experience

    NASA Astrophysics Data System (ADS)

    Dorsey, Cheryl A.; Dorsey, Timothy A.

    2013-09-01

    DO-178, Software Considerations in Airborne Systems and Equipment Certification, is the well-known international standard dealing with the assurance of software used in airborne systems [1,2]. Insights into the DO-178 experiences, strengths and weaknesses can benefit the international space community. As DO-178 is an excellent standard for safe software development when used appropriately, this paper provides lessons learned and suggestions for using it effectively.

  16. Rapid Building Assessment Project

    DTIC Science & Technology

    2014-05-01

    …ongoing management of commercial energy efficiency. No other company offers all of these proven services on a seamless, integrated Software-as-a-Service… FirstFuel has added a suite of additional Software-as-a-Service analytics capabilities to support the entire energy efficiency lifecycle… In this document, we refer to the service-side software as “BUILDER” and the client software as “BuilderRED,” following the Army…

  17. Reliability Engineering for Service Oriented Architectures

    DTIC Science & Technology

    2013-02-01

    Glossary excerpts: “Ecosystem” (in software, a set of applications and/or services that gradually build up over time…); “Foreign” (in an SOA context, any SOA, service or software which the owners of the calling software do not have control of…); “System Mode” (many systems exhibit different modes of operation, e.g. the cockpit…). Abbreviations defined include CORBA (Common Object Request Broker Architecture), ESB (Enterprise Service Bus), SOA (Service Oriented Architecture) and SRE (Software Reliability Engineering).

  18. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package, referred to as SCAD, is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis is directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned chiefly as the selection and integration of appropriate building blocks. Phase one development activities have included: selection of the hardware which will be used with SCAD; determination of the scope of SCAD; preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and establishment of a strategy for binding the software modules into an easy-to-use tool kit.

  19. WinHPC System Programming | High-Performance Computing | NREL

    Science.gov Websites

    Describes how to build and run MPI (Message Passing Interface) applications on NREL's WinHPC system, including where the MPI header (mpi.h) and library (msmpi.lib) are located, how to build from the command line, and how to open the Intel C++ build environment (Start > Intel Software Development Tools > Intel C++ Compiler Professional… > C++ Build Environment).

  20. Advancing Knowledge-Building Discourse through Judgments of Promising Ideas

    ERIC Educational Resources Information Center

    Chen, Bodong; Scardamalia, Marlene; Bereiter, Carl

    2015-01-01

    Evaluating promisingness of ideas is an important but underdeveloped aspect of knowledge building. The goal of this research was to examine the extent to which Grade 3 students could make promisingness judgments to facilitate knowledge-building discourse. A Promising Ideas Tool was added to Knowledge Forum software to better support…

  1. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  2. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  3. Simulation-based Testing of Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozmen, Ozgur; Nutaro, James J.; Sanyal, Jibonananda

    It is impossible to adequately test complex software by examining its operation in a physical prototype of the monitored system. Adequate test coverage can require millions of test cases, and the cost of equipment prototypes combined with the real-time constraints of testing with them makes it infeasible to sample more than a small number of these tests. Model-based testing seeks to avoid this problem by allowing for large numbers of relatively inexpensive virtual prototypes that operate in simulation time at a speed limited only by the available computing resources. In this report, we describe how a computer system emulator can be used as part of a model-based testing environment; specifically, we show that a complete software stack - including operating system and application software - can be deployed within a simulated environment, and that these simulations can proceed as fast as possible. To illustrate this approach to model-based testing, we describe how it is being used to test several building control systems that act to coordinate air conditioning loads for the purpose of reducing peak demand. These tests involve the use of ADEVS (A Discrete Event System Simulator) and QEMU (Quick Emulator) to host the operational software within the simulation, and a building model developed in the Modelica programming language using the Buildings Library and packaged as an FMU (Functional Mock-up Unit) that serves as the virtual test environment.
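
    For a sense of how an FMU-packaged building model can serve as a virtual test environment, the sketch below drives one from Python with the open-source fmpy package. This is not the ADEVS/QEMU tooling described in the report; the file name and output variable are hypothetical.

      # Driving an FMU-packaged model as a virtual test environment using
      # the open-source fmpy package (not the report's ADEVS/QEMU stack).
      # 'building.fmu' and 'zoneTemp' are hypothetical placeholders.
      from fmpy import simulate_fmu

      result = simulate_fmu(
          'building.fmu',        # Functional Mock-up Unit, e.g. from Modelica
          start_time=0.0,
          stop_time=3600.0,      # one simulated hour
          output=['zoneTemp'],   # hypothetical model output variable
      )

      for row in result[:5]:     # structured array of (time, zoneTemp) rows
          print(row)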

  4. An Approach for On-Board Software Building Blocks Cooperation and Interfaces Definition

    NASA Astrophysics Data System (ADS)

    Pascucci, Dario; Campolo, Giovanni; Candia, Sante; Lisio, Giovanni

    2010-08-01

    This paper provides an insight into the avionics SW architecture developed by Thales Alenia Space Italy (TAS-I) to structure the OBSW as a set of self-standing and re-usable building blocks. It initially describes the underlying framework for building-block cooperation, which is based on ECSS-E-70 packet forwarding (for service requests to a building block) and standard parameter exchange for data communication. Subsequently it discusses the high level of flexibility and scalability of the resulting architecture, reporting as an example an implementation of the Failure Detection, Isolation and Recovery (FDIR) function which exploits the proposed architecture. The presented approach evolves from the avionics SW architecture developed in the scope of the PRIMA project (Multi-Purpose Italian Re-configurable Platform) and has been adopted for the Sentinel-1 Avionic Software (ASW).

  5. Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2015-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the capabilities of HUBzero's end-to-end application service and virtualized computing framework. Funded by the NSF Data Infrastructure Building Blocks (DIBBs) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization, and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.

  6. RASTtk: A modular and extensible implementation of the RAST algorithm for building custom annotation pipelines and annotating batches of genomes

    DOE PAGES

    Brettin, Thomas; Davis, James J.; Disz, Terry; ...

    2015-02-10

    The RAST (Rapid Annotation using Subsystem Technology) annotation engine was built in 2008 to annotate bacterial and archaeal genomes. It works by offering a standard software pipeline for identifying genomic features (i.e., protein-encoding genes and RNA) and annotating their functions. Recently, in order to make RAST a more useful research tool and to keep pace with advancements in bioinformatics, it has become desirable to build a version of RAST that is both customizable and extensible. In this paper, we describe the RAST tool kit (RASTtk), a modular version of RAST that enables researchers to build custom annotation pipelines. RASTtk offers a choice of software for identifying and annotating genomic features as well as the ability to add custom features to an annotation job. RASTtk also accommodates the batch submission of genomes and the ability to customize annotation protocols for batch submissions. This is the first major software restructuring of RAST since its inception.

  7. Healthy Air: Signs of Potential Problems in the Workplace

    MedlinePlus

    …The U.S. Environmental Protection Agency has developed free software to help building professionals identify, solve and prevent…

  8. Building a Library Web Server on a Budget.

    ERIC Educational Resources Information Center

    Orr, Giles

    1998-01-01

    Presents a method for libraries with limited budgets to create reliable Web servers with existing hardware and free software available via the Internet. Discusses staff, hardware and software requirements, and security; outlines the assembly process. (PEN)

  9. Object-oriented software design in semiautomatic building extraction

    NASA Astrophysics Data System (ADS)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

    Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.

  10. The Ettention software package.

    PubMed

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem of "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building blocks for tomographic reconstruction algorithms. The well-known block-iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphics processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing.
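
    The Kaczmarz method at the core of the block-iterative reconstruction sweeps over the rows of the projection system Ax = b, projecting the current estimate onto each row's hyperplane. Below is a minimal NumPy sketch of the basic row-action update, not Ettention's optimized GPU implementation:

      import numpy as np

      def kaczmarz(A, b, sweeps=50):
          """Basic Kaczmarz row-action solver for A x = b.

          Each update projects x onto the hyperplane of row i:
              x <- x + ((b_i - a_i . x) / ||a_i||^2) * a_i
          A plain sequential sketch, not Ettention's block-iterative
          GPU implementation.
          """
          x = np.zeros(A.shape[1])
          for _ in range(sweeps):
              for i in range(A.shape[0]):
                  a_i = A[i]
                  x += (b[i] - a_i @ x) / (a_i @ a_i) * a_i
          return x

      A = np.array([[3.0, 1.0], [1.0, 2.0]])
      b = np.array([9.0, 8.0])
      print(kaczmarz(A, b))  # converges toward the solution [2, 3]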

  11. Empirical Evaluation of Hunk Metrics as Bug Predictors

    NASA Astrophysics Data System (ADS)

    Ferzund, Javed; Ahsan, Syed Nadeem; Wotawa, Franz

    Reducing the number of bugs is a crucial issue during software development and maintenance. Software process and product metrics are good indicators of software complexity. These metrics have been used to build bug predictor models to help developers maintain the quality of software. In this paper we empirically evaluate the use of hunk metrics as predictors of bugs. We present a technique for bug prediction that works at the smallest units of code change, called hunks. We build bug prediction models using random forests, an efficient machine-learning classifier. Hunk metrics are used to train the classifier, and each hunk metric is evaluated for its bug prediction capabilities. Our classifier can classify individual hunks as buggy or bug-free with 86% accuracy, 83% buggy-hunk precision and 77% buggy-hunk recall. We find that history-based and change-level hunk metrics are better predictors of bugs than code-level hunk metrics.
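
    A minimal sketch of the classification setup described, a random forest trained on per-hunk metrics, is given below with scikit-learn. The feature names and data are hypothetical stand-ins, not the paper's dataset or exact feature set.

      # Hunk-level bug prediction with a random forest, as described above.
      # Features and data are hypothetical stand-ins for the paper's hunk
      # metrics. Requires scikit-learn.
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import precision_score, recall_score

      # One row per hunk: e.g. lines added, lines deleted, prior fixes
      # touching the file (a history-based metric).
      X = [[10, 2, 5], [3, 1, 0], [25, 12, 7], [1, 0, 0], [8, 4, 3], [2, 2, 1]]
      y = [1, 0, 1, 0, 1, 0]   # 1 = buggy hunk, 0 = bug-free

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33,
                                                random_state=0)
      clf = RandomForestClassifier(n_estimators=100, random_state=0)
      clf.fit(X_tr, y_tr)

      pred = clf.predict(X_te)
      print("precision:", precision_score(y_te, pred, zero_division=0))
      print("recall:   ", recall_score(y_te, pred, zero_division=0))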

  12. Student computer attitudes, experience and perceptions about the use of two software applications in Building Engineering

    NASA Astrophysics Data System (ADS)

    Chiner, Esther; Garcia-Vera, Victoria E.

    2017-11-01

    The purpose of this study was to examine students' computer attitudes and experience, as well as students' perceptions of the use of two specific software applications (Google Drive Spreadsheets and Arquimedes) in the Building Engineering context. The relationships among these variables were also examined. Ninety-two students took part in this study. Results suggest that students hold favourable computer attitudes. Moreover, a significant positive relationship was found between students' attitudes and their computer experience. Findings also show that students find the Arquimedes software more useful and of higher output quality than Google Drive Spreadsheets, while the latter is perceived to be easier to use. Regarding the relationship between students' attitudes towards the use of computers and their perceptions of the use of both software applications, a significant positive relationship was found only in the case of Arquimedes. Findings are discussed in terms of their implications for practice and further research.

  13. Usefulness of Cone-Beam Computed Tomography and Automatic Vessel Detection Software in Emergency Transarterial Embolization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrafiello, Gianpaolo, E-mail: gcarraf@gmail.com; Ierardi, Anna Maria, E-mail: amierardi@yahoo.it; Duka, Ejona, E-mail: ejonaduka@hotmail.com

    Background: This study was designed to evaluate the utility of dual-phase cone-beam computed tomography (DP-CBCT) and automatic vessel detection (AVD) software to guide transarterial embolization (TAE) of angiographically challenging arterial bleedings in emergency settings. Methods: Twenty patients with an arterial bleeding at computed tomography angiography and an inconclusive identification of the bleeding vessel at the initial 2D angiographic series were included. Accuracy of DP-CBCT and AVD software were defined as the ability to detect the bleeding site and the culprit arterial bleeder, respectively. Technical success was defined as the correct positioning of the microcatheter using AVD software. Clinical success was defined as successful embolization. The total volume of iodinated contrast medium and the overall procedure time were registered. Results: The bleeding site was not detected by the initial angiogram in 20% of cases, while impossibility to identify the bleeding vessel was the reason for inclusion in the remaining cases. The bleeding site was detected by DP-CBCT in 19 of 20 (95%) patients; in one case CBCT-CT fusion was required. AVD software identified the culprit arterial branch in 18 of 20 (90%) cases. In two cases, vessel tracking required manual marking of the candidate arterial bleeder. Technical success was 95%. Successful embolization was achieved in all patients. Mean contrast volume injected per patient was 77.5 ml, and mean overall procedure time was 50 min. Conclusions: C-arm CBCT and AVD software during TAE of angiographically challenging arterial bleedings is feasible and may facilitate successful embolization. Staff training in CBCT imaging and software manipulation is necessary.

  14. Electricity Markets, Smart Grids and Smart Buildings

    NASA Astrophysics Data System (ADS)

    Falcey, Jonathan M.

    A smart grid is an electricity network that accommodates two-way power flows, and utilizes two-way communications and increased measurement, in order to provide more information to customers and aid in the development of a more efficient electricity market. The current electrical network is outdated and has many shortcomings relating to power flows, inefficient electricity markets, generation/supply balance, a lack of information for the consumer and insufficient consumer interaction with electricity markets. Many of these challenges can be addressed with a smart grid, but there remain significant barriers to the implementation of a smart grid. This paper proposes a novel method for the development of a smart grid utilizing a bottom-up approach (starting with smart buildings/campuses) with the goal of providing the framework and infrastructure necessary for a smart grid, instead of the more traditional approach (installing many smart meters and hoping a smart grid emerges). This novel approach involves combining deterministic and statistical methods in order to accurately estimate building electricity use down to the device level. It provides model users with a cheaper alternative to energy audits and extensive sensor networks (the current methods of quantifying electrical use at this level), which increases their ability to modify energy consumption and respond to price signals. The results of this method are promising, but they are still preliminary. As a result, there is still room for improvement. On days when there were no missing or inaccurate data, this approach has an R2 of about 0.84, sometimes as high as 0.94, when compared to measured results. However, there were many days where missing data brought overall accuracy down significantly. In addition, the development and implementation of the calibration process is still underway and some functional additions must be made in order to maximize accuracy. The calibration process must be completed before a reliable accuracy can be determined. While this work shows that a combination of deterministic and statistical methods can accurately forecast building energy usage, the ability to produce accurate results is heavily dependent upon software availability, accurate data and the proper calibration of the model. Creating the software required for a smart building model is time-consuming and expensive. Bad or missing data have significant negative impacts on the accuracy of the results and can be caused by a hodgepodge of equipment and communication protocols. Proper calibration of the model is essential to ensure that the device-level estimations are sufficiently accurate. Any building model which is to be successful at creating a smart building must be able to overcome these challenges.
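
    The R2 figures quoted are the standard coefficient of determination between modeled and measured consumption. For reference, a minimal computation on hypothetical data, not the thesis's dataset:

      import numpy as np

      def r_squared(measured, modeled):
          """Coefficient of determination between two load series."""
          measured, modeled = np.asarray(measured), np.asarray(modeled)
          ss_res = np.sum((measured - modeled) ** 2)
          ss_tot = np.sum((measured - measured.mean()) ** 2)
          return 1.0 - ss_res / ss_tot

      # Hypothetical hourly building loads in kW, not data from the study.
      measured = [52.0, 48.5, 61.2, 70.3, 66.8]
      modeled  = [50.1, 49.0, 59.8, 72.0, 65.5]
      print(f"R^2 = {r_squared(measured, modeled):.3f}")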

  15. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Ravindra; Reilly, James T.; Wang, Jianhui

    Electric distribution utilities encounter many challenges to successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS. It takes a strategic approach to implementing DMS from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, installation, commissioning, and system integration issues and solutions. It identifies the risks that are associated with implementation and suggests strategies for utilities to use to mitigate them or avoid them altogether. Attention is given to common barriers to successful DMS implementation. This report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  16. A Technological Review of the Instrumented Footwear for Rehabilitation with a Focus on Parkinson’s Disease Patients

    PubMed Central

    Maculewicz, Justyna; Kofoed, Lise Busk; Serafin, Stefania

    2016-01-01

    In this review article, we summarize systems for gait rehabilitation based on instrumented footwear and present a context of their usage in Parkinson’s disease (PD) patients’ auditory and haptic rehabilitation. We focus on the needs of PD patients, but since only a few systems were made with this purpose, we go through several applications used in different scenarios when gait detection and rehabilitation are considered. We present developments of the designs, possible improvements, and software challenges and requirements. We conclude that in order to build successful systems for PD patients’ gait rehabilitation, technological solutions from several studies have to be applied and combined with knowledge from auditory and haptic cueing. PMID:26834696

  17. The Consolidated Planning and Scheduling System for Space Transportation and Space Station operations - Successful development experience

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda S.; Willoughby, John K.; Gardner, Jo A.; Shinkle, Gerald L.

    1993-01-01

    In 1992, NASA made the decision to evolve a Consolidated Planning System (CPS) by adding the Space Transportation System (STS) requirements to the Space Station Freedom (SSF) planning software. This paper describes this evolutionary process, which began with a series of six-month design-build-test cycles, using a domain-independent architecture and a set of developmental tools known as the Advanced Scheduling Environment. It is shown that, during these tests, the CPS could be used at multiple organizational levels of planning and for integrating schedules from geographically distributed (including international) planning environments. The potential for using the CPS for other planning and scheduling tasks in the SSF program is being currently examined.

  18. search GenBank: interactive orchestration and ad-hoc choreography of Web services in the exploration of the biomedical resources of the National Center For Biotechnology Information

    PubMed Central

    2013-01-01

    Background: Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult to collect, manage and process all of these entries in one place by third-party software developers without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. Results: We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can use one of three exploration paths: simple data searching based on the specified user’s query, advanced data searching based on the specified user’s query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases. Conclusions: search GenBank extends the standard capabilities of the NCBI Entrez search engine in querying biomedical databases. The possibility of creating and saving macros in search GenBank is a unique feature and has great potential. The potential will further grow in the future with the increasing density of networks of relationships between data stored in particular databases. search GenBank is available for public use at http://sgb.biotools.pl/. PMID:23452691

  19. search GenBank: interactive orchestration and ad-hoc choreography of Web services in the exploration of the biomedical resources of the National Center For Biotechnology Information.

    PubMed

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Siążnik, Artur

    2013-03-01

    Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult to collect, manage and process all of these entries in one place by third-party software developers without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can use one of three exploration paths: simple data searching based on the specified user's query, advanced data searching based on the specified user's query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases. search GenBank extends the standard capabilities of the NCBI Entrez search engine in querying biomedical databases. The possibility of creating and saving macros in search GenBank is a unique feature and has great potential. The potential will further grow in the future with the increasing density of networks of relationships between data stored in particular databases. search GenBank is available for public use at http://sgb.biotools.pl/.
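
    A minimal sketch of the kind of eUtils orchestration described, an ESearch call to find GenBank identifiers followed by an EFetch call to retrieve the records, is shown below. The endpoints are NCBI's public eUtils services; the query term is a hypothetical example, and this is not search GenBank's own code.

      # Orchestrating two NCBI eUtils calls: ESearch to find GenBank
      # identifiers, then EFetch to retrieve the records. The query term is
      # a hypothetical example; this is not search GenBank's own code.
      import requests

      EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

      search = requests.get(f"{EUTILS}/esearch.fcgi", params={
          "db": "nucleotide",
          "term": "BRCA1[Gene] AND human[Organism]",  # hypothetical query
          "retmax": 3,
          "retmode": "json",
      }).json()
      ids = search["esearchresult"]["idlist"]

      records = requests.get(f"{EUTILS}/efetch.fcgi", params={
          "db": "nucleotide",
          "id": ",".join(ids),
          "rettype": "gb",      # GenBank flat-file format
          "retmode": "text",
      }).text
      print(records[:500])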

  20. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    ERIC Educational Resources Information Center

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  1. Development and evaluation of a Fault-Tolerant Multiprocessor (FTMP) computer. Volume 3: FTMP test and evaluation

    NASA Technical Reports Server (NTRS)

    Lala, J. H.; Smith, T. B., III

    1983-01-01

    The experimental test and evaluation of the Fault-Tolerant Multiprocessor (FTMP) is described. Major objectives of this exercise include expanding the validation envelope, building confidence in the system, revealing any weaknesses in the architectural concepts and in their execution in hardware and software, and, in general, stressing the hardware and software. To this end, pin-level faults were injected into one LRU of the FTMP and the FTMP response was measured in terms of fault detection, isolation, and recovery times. A total of 21,055 stuck-at-0, stuck-at-1 and invert-signal faults were injected in the CPU, memory, bus interface circuits, Bus Guardian Units, and voters and error latches. Of these, 17,418 were detected. At least 80 percent of the undetected faults are estimated to be on unused pins. The multiprocessor identified all detected faults correctly and recovered successfully in each case. Total recovery time for all faults averaged a little over one second. This can be reduced to half a second by including appropriate self-tests.

  2. Leveraging Modeling Approaches: Reaction Networks and Rules

    PubMed Central

    Blinov, Michael L.; Moraru, Ion I.

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349

  3. Leveraging modeling approaches: reaction networks and rules.

    PubMed

    Blinov, Michael L; Moraru, Ion I

    2012-01-01

    We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high-resolution and/or high-throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatiotemporal distribution of molecules and complexes, their interaction kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks - the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks.
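
    The combinatorial blow-up that rule-based specification avoids can be illustrated directly: a protein with n independent modification sites has 2^n distinct species, so explicit enumeration scales exponentially while one rule per site suffices. A tiny sketch with invented site names, not tied to any particular rule-based tool:

      # A protein with n independent phosphorylation sites has 2**n species:
      # traditional specification enumerates all of them, while a rule-based
      # model needs only one rule per site. Site names are invented.
      from itertools import product

      sites = ["S15", "T42", "Y88"]

      # Traditional approach: enumerate every species explicitly.
      species = [",".join(s + ("~P" if p else "")
                          for s, p in zip(sites, states))
                 for states in product([False, True], repeat=len(sites))]
      print(len(species), "explicit species")   # 8 species for 3 sites

      # Rule-based approach: one phosphorylation rule per site covers all 8.
      rules = [f"phosphorylate({s})" for s in sites]
      print(len(rules), "rules:", rules)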

  4. Software Issues at the User Interface

    DTIC Science & Technology

    1991-05-01

    We review software issues that are critical to the successful integration of parallel computers into mainstream scientific computing. Clearly a compiler is the most important software tool available to a… The development of an optimizing compiler of this quality, addressing communication instructions as well as computational instructions, is a major… (Department of Computer Science, University of Colorado, Boulder, CO 80309.)

  5. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Describes software available on NREL's Peregrine high-performance computing system, including development tools for build automation, version control, and high-level or specialized scripting, and the toolchains available to build applications from source code.

  6. Data and Tools | Integrated Energy Solutions | NREL

    Science.gov Websites

    Describes NREL data and tools for integrated energy solutions, including eQUEST for detailed analysis of state-of-the-art building design, open-source software tools to support whole-building energy modeling and advanced daylight analysis, and BESTEST-EX…

  7. Cradle-to-Gate Life-Cycle Inventory of Hardboard and Engineered Wood Siding and Trim Produced in North America

    Treesearch

    Richard D. Bergman

    2015-01-01

    Developing wood product LCI data helps construct product LCAs that are then incorporated into whole-building LCAs in environmental footprint software such as the Athena Impact Estimator for Buildings (ASMI 2015). Conducting whole-building LCAs provides points that go toward green building certification in rating systems such as LEED v4, Green Globes, and…

  8. Energy savings modelling of re-tuning energy conservation measures in large office buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernandez, Nick; Katipamula, Srinivas; Wang, Weimin

    Today, many large commercial buildings use sophisticated building automation systems (BASs) to manage a wide range of building equipment. While the capabilities of BASs have increased over time, many buildings still do not fully use the BAS's capabilities and are not properly commissioned, operated or maintained, which leads to inefficient operation, increased energy use, and reduced equipment lifetimes. This paper investigates the energy savings potential of several common HVAC system re-tuning measures on a typical large office building, using the Department of Energy's building energy modeling software, EnergyPlus. The baseline prototype model uses roughly as much energy as an average large office building in the existing building stock, but does not utilize any re-tuning measures. Individual re-tuning measures simulated against this baseline include automatic schedule adjustments, damper minimum flow adjustments, and thermostat adjustments, as well as dynamic resets (set points that change continuously with building and/or outdoor conditions) of static pressure, supply-air temperature, condenser water temperature, chilled and hot water temperature, and chilled and hot water differential pressure set points. Six combinations of these individual measures were formulated, each designed to conform to limitations on the implementation of certain individual measures that might exist in typical buildings. All the individual measures and combinations were simulated in 16 climate locations representative of specific U.S. climate zones. The modeling results suggest that the most effective energy savings measures are those that affect the demand side of the building (air systems and schedules). Many of the demand-side individual measures were capable of reducing annual total HVAC system energy consumption by over 20% in most cities that were modeled. Supply-side measures affecting HVAC plant conditions were only modestly successful (less than 5% annual HVAC energy savings for most cities for all measures). Combining many of the re-tuning measures revealed deep savings potential: some of the more aggressive combinations yielded 35-75% reductions in annual HVAC energy consumption, depending on climate and building vintage.
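
    As one concrete example of the dynamic resets modeled, a supply-air temperature reset typically interpolates the set point linearly against outdoor-air temperature between configured limits. The sketch below is a generic reset curve with hypothetical limits, not the study's actual EnergyPlus inputs:

      def sat_reset(oat_c, oat_low=10.0, oat_high=21.0,
                    sat_max=16.0, sat_min=13.0):
          """Generic supply-air temperature (SAT) reset against outdoor-air
          temperature (OAT), in deg C: warmer supply air when it is cold
          outside, cooler when hot, linear in between. The limits here are
          hypothetical, not the study's EnergyPlus inputs.
          """
          if oat_c <= oat_low:
              return sat_max
          if oat_c >= oat_high:
              return sat_min
          frac = (oat_c - oat_low) / (oat_high - oat_low)
          return sat_max - frac * (sat_max - sat_min)

      for oat in (5.0, 15.0, 25.0):
          print(f"OAT {oat:4.1f} C -> SAT set point {sat_reset(oat):.1f} C")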

  9. NREL Announces Third Round of Start-Ups to Participate in the Wells Fargo

    Science.gov Websites

    Announces the third round of start-ups selected to participate in the Wells Fargo program supporting innovative commercial building technologies that provide scalable solutions to reduce the energy impact of commercial buildings; one Round 3 participant, offering a … kit for commercial buildings, was referred to the program by the University of Colorado Boulder.

  10. Efficient and Robust Optimization for Building Energy Simulation

    PubMed Central

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-01-01

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell’s Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell’s method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds considerable computational benefits result from replacing the Powell’s Hybrid solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell’s Hybrid method presently used in HVACSIM+. PMID:27325907

  11. Efficient and Robust Optimization for Building Energy Simulation.

    PubMed

    Pourarian, Shokouh; Kearsley, Anthony; Wen, Jin; Pertzborn, Amanda

    2016-06-15

    Efficiently, robustly and accurately solving large sets of structured, non-linear algebraic and differential equations is one of the most computationally expensive steps in the dynamic simulation of building energy systems. Here, the efficiency, robustness and accuracy of two commonly employed solution methods are compared. The comparison is conducted using the HVACSIM+ software package, a component-based building system simulation tool. The HVACSIM+ software presently employs Powell's Hybrid method to solve systems of nonlinear algebraic equations that model the dynamics of energy states and interactions within buildings. It is shown here that Powell's method does not always converge to a solution. Since a myriad of other numerical methods are available, the question arises as to which method is most appropriate for building energy simulation. This paper finds considerable computational benefits result from replacing the Powell's Hybrid solver in HVACSIM+ with a solver more appropriate for the challenges particular to numerical simulations of buildings. Evidence is provided that a variant of the Levenberg-Marquardt solver has superior accuracy and robustness compared to Powell's Hybrid method presently used in HVACSIM+.
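
    For a sense of the comparison, SciPy wraps both MINPACK solver families discussed: scipy.optimize.root with method='hybr' (Powell's hybrid method) and method='lm' (a Levenberg-Marquardt variant). A minimal sketch on a toy nonlinear system, not the HVACSIM+ building equations:

      # Comparing the two solver families on a toy 2x2 nonlinear system
      # (not the HVACSIM+ equations). SciPy exposes MINPACK's Powell hybrid
      # method as method='hybr' and a Levenberg-Marquardt variant as 'lm'.
      import numpy as np
      from scipy.optimize import root

      def residuals(x):
          """Toy system: x0^2 + x1^2 = 4 and x0 * x1 = 1."""
          return [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0]

      x0 = np.array([1.0, 1.0])
      for method in ("hybr", "lm"):
          sol = root(residuals, x0, method=method)
          print(f"{method}: success={sol.success}, x={sol.x}, "
                f"|residual|={np.linalg.norm(sol.fun):.2e}")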

  12. Lucid Provides Energy Reduction Software to DC Government

    EPA Pesticide Factsheets

    Earlier this year, Lucid, a 2014 U.S. EPA SBIR award recipient, announced that the D.C. government was using Lucid's BuildingOS® software to reach an energy reduction target of 20 percent in 20 months.

  13. How Building Heroes Helps Make Winning Camp Directors.

    ERIC Educational Resources Information Center

    Huether, Richard J.

    1991-01-01

    For camps to be successful, management should empower camp staff to be heroes (leaders). This is based on three rules: campers look for heroes; heroes build successful camps; and successful directors build heroes. Being a hero implies being the best one can be and always attempting to improve the example communicated to others. (LP)

  14. Strategies for Relationship and Trust Building by Successful Superintendents: A Case Study

    ERIC Educational Resources Information Center

    Huang, Leann L.

    2012-01-01

    The purpose of this study was to identify strategies and behaviors that successful superintendents used to build strong relationships and trust with their boards within their entry period. The three research questions were developed to guide this study: 1. What strategies and behaviors were successful superintendents using to build strong…

  15. A Model for Sustainable Building Energy Efficiency Retrofit (BEER) Using Energy Performance Contracting (EPC) Mechanism for Hotel Buildings in China

    NASA Astrophysics Data System (ADS)

    Xu, Pengpeng

    Hotel buildings are one of the high-energy-consuming building types, and retrofitting hotel buildings is an untapped solution to help cut carbon emissions, contributing towards sustainable development. Energy Performance Contracting (EPC) has been promulgated as a market mechanism for the delivery of energy efficiency projects. The EPC mechanism was introduced into China relatively recently, and it has not been implemented successfully in building energy efficiency retrofit projects. The aim of this research is to develop a model for achieving the sustainability of Building Energy Efficiency Retrofit (BEER) in hotel buildings under the Energy Performance Contracting (EPC) mechanism. The objectives include: • To identify a set of Key Performance Indicators (KPIs) for measuring the sustainability of BEER in hotel buildings; • To identify Critical Success Factors (CSFs) under the EPC mechanism that have a strong correlation with sustainable BEER projects; • To develop a model explaining the relationships between the CSFs and the sustainability performance of BEER in hotel buildings. Literature reviews revealed the essence of sustainable BEER and EPC, which helped to develop a conceptual framework for analyzing sustainable BEER under the EPC mechanism in hotel buildings. 11 potential KPIs for sustainable BEER and 28 success factors of EPC were selected based on the developed framework. A questionnaire survey was conducted to ascertain the importance of the selected performance indicators and success factors. Fuzzy set theory was adopted in identifying the KPIs: six KPIs were identified from the 11 selected performance indicators. Through the questionnaire survey, 21 of the 28 success factors were identified as Critical Success Factors (CSFs). Using the factor analysis technique, the 21 identified CSFs were grouped into six clusters to help explain the project success of sustainable BEER. Finally, an AHP/ANP approach was used in this research to develop a model examining the interrelationships among the identified CSFs, KPIs, and sustainability dimensions of BEER. The findings indicate that the success of sustainable BEER in hotel buildings under the EPC mechanism is mainly decided by the project objectives control mechanism, available technology, the organizing capacity of the team leader, trust among partners, accurate M&V, and team workers' technical skills.

  16. Secure Video Surveillance System Acquisition Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2009-12-04

    The SVSS Acquisition Software collects and displays video images from two cameras through a VPN and stores the images on a collection controller. The software is configured to allow a user to enter a time window to display up to 2 1/2 hours of video for review. The software collects images from the cameras at a rate of 1 image per second and automatically deletes images older than 3 hours. The software operates in a Linux environment and can be run in a virtual machine on Windows XP. The Sandia software integrates the different COTS software packages to build the video review system.
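
    A minimal sketch of the age-based retention policy described (delete images older than 3 hours) is given below; the directory path is a hypothetical placeholder, and this is not the Sandia code itself.

      # Age-based cleanup: delete images older than 3 hours from the
      # collection directory. The path is a hypothetical placeholder;
      # this is not the SVSS code itself.
      import os
      import time

      IMAGE_DIR = "/var/svss/images"   # hypothetical collection directory
      MAX_AGE_S = 3 * 3600             # retain 3 hours of 1 fps images

      now = time.time()
      for name in os.listdir(IMAGE_DIR):
          path = os.path.join(IMAGE_DIR, name)
          if now - os.path.getmtime(path) > MAX_AGE_S:
              os.remove(path)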

  17. Energy performance evaluation of AAC

    NASA Astrophysics Data System (ADS)

    Aybek, Hulya

    The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase, when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials, and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software to enhance the viability of the estimates concerning energy efficiency and the achievement of LEED credits. The primary objective was accomplished by conducting several simulation models to determine the relative energy efficiency of wood-framed, metal-framed, and Aerated Autoclaved Concrete (AAC) wall structures for both commercial and residential buildings.

  18. The software architecture to control the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Oya, I.; Füßling, M.; Antonino, P. O.; Conforti, V.; Hagge, L.; Melkumyan, D.; Morgenstern, A.; Tosti, G.; Schwanke, U.; Schwarz, J.; Wegner, P.; Colomé, J.; Lyard, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) project is an initiative to build two large arrays of Cherenkov gamma-ray telescopes. CTA will be deployed as two installations, one in the northern and the other in the southern hemisphere, containing dozens of telescopes of different sizes. CTA is a big step forward in the field of ground-based gamma-ray astronomy, not only because of the expected scientific return, but also due to the order-of-magnitude larger scale of the instrument to be controlled. The performance requirements associated with such a large and distributed astronomical installation require a thoughtful analysis to determine the best software solutions. The array control and data acquisition (ACTL) work-package within the CTA initiative will deliver the software to control and acquire the data from the CTA instrumentation. In this contribution we present the current status of the formal ACTL system decomposition into software building blocks and the relationships among them. The system is modelled via the Systems Modelling Language (SysML) formalism. To cope with the complexity of the system, this architecture model is sub-divided into different perspectives. The relationships with the stakeholders and external systems are used to create the first perspective, the context of the ACTL software system. Use cases are employed to describe the interaction of those external elements with the ACTL system and are traced to a hierarchy of functionalities (abstract system functions) describing the internal structure of the ACTL system. These functions are then traced to fully specified logical elements (software components), the deployment of which as technical elements is also described. This modelling approach allows us to decompose the ACTL software into elements to be created and to describe the flow of information within the system, providing us with a clear way to identify sub-system interdependencies. This architectural approach allows us to build the ACTL system model and trace requirements to deliverables (source code, documentation, etc.), and permits the implementation of a flexible use-case-driven software development approach thanks to the traceability from use cases to the logical software elements. The Alma Common Software (ACS) container/component framework, used for the control of the Atacama Large Millimeter/submillimeter Array (ALMA), is the basis for the ACTL software and as such it is considered an integral part of the software architecture.

  19. A General Water Resources Regulation Software System in China

    NASA Astrophysics Data System (ADS)

    LEI, X.

    2017-12-01

    To avoid iterative re-development of core modules in normal and emergency water resources regulation, and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common requirements for water resources regulation and emergency management. It provides a customizable and extensible software framework, open to secondary development, for the three-level platform "MWR-Basin-Province". Meanwhile, this general software system enables business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve national decision-making capability for water resources regulation. Four main modules are involved in the general software system: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-build water resources regulation decision-making systems; 2) a complete set of model bases and model computing software released in the form of cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, together with a model management system to calibrate and configure model parameters; 4) a database that satisfies the business and functional requirements of the general software, providing technical support for building basin or regional water resources regulation models.

  20. Browndye: A Software Package for Brownian Dynamics

    PubMed Central

    McCammon, J. Andrew

    2010-01-01

    A new software package, Browndye, is presented for simulating the diffusional encounter of two large biological molecules. It can be used to estimate second-order rate constants and encounter probabilities, and to explore reaction trajectories. Browndye builds upon previous knowledge and algorithms from software packages such as UHBD, SDA, and Macrodox, while implementing algorithms that scale to larger systems. PMID:21132109
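
    The following is a minimal, hedged sketch of the kind of Brownian dynamics encounter simulation such packages perform, in the spirit of the Northrup-Allison-McCammon scheme: force-free trajectories of the relative separation start on a "b-surface" and are scored as encounters if they reach a reaction radius before escaping past a "q-surface". All radii, the diffusion coefficient, and the time step are invented for illustration; the real package handles interaction forces and far richer reaction criteria.

    ```python
    import math
    import random

    D = 1e-5         # relative diffusion coefficient, cm^2/s (assumed)
    DT = 1e-11       # time step, s (assumed; must keep steps small vs. geometry)
    B = 100e-8       # starting b-surface radius, cm (100 angstroms)
    Q = 500e-8       # outer escape radius, cm
    R_REACT = 50e-8  # reaction (encounter) radius, cm

    def one_trajectory():
        # Free 3D diffusion of the relative coordinate, starting on the b-surface.
        x, y, z = B, 0.0, 0.0
        sigma = math.sqrt(2.0 * D * DT)
        while True:
            x += random.gauss(0.0, sigma)
            y += random.gauss(0.0, sigma)
            z += random.gauss(0.0, sigma)
            r = math.sqrt(x * x + y * y + z * z)
            if r <= R_REACT:
                return True   # encounter
            if r >= Q:
                return False  # escaped

    # For force-free diffusion the analytic encounter probability is
    # (1/B - 1/Q) / (1/R_REACT - 1/Q) ~ 0.44 with these numbers.
    trials = 200
    beta = sum(one_trajectory() for _ in range(trials)) / trials
    print(f"encounter probability from b-surface: {beta:.3f}")
    ```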

  1. A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.

    ERIC Educational Resources Information Center

    Sproule, Susan; Archer, Norm

    2000-01-01

    Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents to support e-commerce buying applications. (Contains 35…

  2. The COSPAR Capacity Building Initiative - past, present, future, and highlights

    NASA Astrophysics Data System (ADS)

    Gabriel, Carlos; Mendez, Mariano; D'Amicis, Raffaella; Santolik, Ondrej; Mathieu, Pierre-Philippe; Smith, Randall

    At the time of the COSPAR General Assembly in Moscow, the 21st workshop of the Programme for Capacity Building will have taken place. The programme started in 2001 with the aims of: i) increasing the knowledge and use of public archives of space data in developing countries, ii) providing highly practical instruction in the use of these archives and the associated publicly available software, and iii) fostering personal links between participants and the experienced scientists who lecture during the workshops and supervise the projects carried out by the students. Workshops in many space disciplines have been successfully held so far (X-ray, Gamma-ray and Space Optical and UV Astronomy, Magnetospheric Physics, Space Oceanography, Remote Sensing and Planetary Science) in thirteen countries (Argentina, Brazil, China, Egypt, India, Indonesia, Malaysia, Morocco, Romania, Russia, South Africa, Thailand and Uruguay). An associated Fellowship Programme is helping former participants of these workshops to build on the skills gained at them. We will summarize the past and discuss the present and future of the Programme, including highlights like the most recent one: the identification of a transient magnetar (the 9th object of this class so far discovered) in the vicinity of a supernova by one of our students, during the CB workshop on high-energy Astrophysics in Xuyi, China, in September 2013.

  3. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
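
    A hedged sketch of the modular idea described above: each stage of the hydrological cycle sits behind a common interface so implementations can be swapped when assembling a model. The class names and the toy process equations are illustrative assumptions, not the package's actual API.

    ```python
    from dataclasses import dataclass
    from typing import Protocol

    class Stage(Protocol):
        def step(self, state: dict, forcing: dict) -> None: ...

    class HamonPET:
        def step(self, state, forcing):
            # Toy potential-evapotranspiration placeholder driven by temperature.
            state["pet"] = max(0.0, 0.3 * forcing["temp_c"])

    class BucketSoilMoisture:
        capacity = 150.0  # mm of soil storage, assumed
        def step(self, state, forcing):
            storage = state.get("soil", 0.0) + forcing["precip_mm"] - state.get("pet", 0.0)
            state["runoff"] = max(0.0, storage - self.capacity)
            state["soil"] = min(max(storage, 0.0), self.capacity)

    @dataclass
    class Model:
        stages: list  # interchangeable components, one per process
        def run(self, forcings):
            state = {}
            for forcing in forcings:
                for stage in self.stages:
                    stage.step(state, forcing)
                yield state["runoff"]

    # Assemble a model by mixing stage implementations, then run it.
    model = Model(stages=[HamonPET(), BucketSoilMoisture()])
    print(list(model.run([{"temp_c": 15.0, "precip_mm": 12.0}] * 3)))
    ```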

  4. Building Interactive Simulations in Web Pages without Programming.

    PubMed

    Mailen Kootsey, J; McAuley, Grant; Bernal, Julie

    2005-01-01

    A software system is described for building interactive simulations and other numerical calculations in Web pages. The system is based on a new Java-based software architecture named NumberLinX (NLX) that isolates each function required to build the simulation, so that a library of reusable objects could be assembled. The NLX objects are integrated into a commercial Web design program for coding-free page construction. The model description is entered through a wizard-like utility program that also functions as a model editor. The complete system permits very rapid construction of interactive simulations without coding. A wide range of applications is possible with the system beyond interactive calculations, including remote data collection and processing and collaboration over a network.

  5. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
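
    A minimal sketch of the 'real-time systems-analysis object' idea: an entity whose time-behavior is captured by a set of states plus state-transition rules. The class and the valve example are illustrative assumptions, not notation taken from the methodology itself.

    ```python
    class AnalysisObject:
        # An entity with time-behavior defined by states and transition rules.
        def __init__(self, name, initial, transitions):
            self.name = name
            self.state = initial
            # transitions: {(current_state, event): next_state}
            self.transitions = transitions

        def on_event(self, event):
            key = (self.state, event)
            if key in self.transitions:
                self.state = self.transitions[key]
            return self.state

    valve = AnalysisObject(
        "coolant_valve",
        initial="closed",
        transitions={("closed", "open_cmd"): "opening",
                     ("opening", "limit_switch"): "open",
                     ("open", "close_cmd"): "closed"},
    )
    print(valve.on_event("open_cmd"))      # -> opening
    print(valve.on_event("limit_switch"))  # -> open
    ```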

  6. Acceptance test procedure bldg. 271-U remote monitoring of project W-059 B-Plant canyon exhaust system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCDANIEL, K.S.

    1999-09-01

    The test procedure provides for verifying indications and alarms associated with the B Plant Canyon Ventilation System as they are displayed on a remote monitoring workstation located in building 271-U. The system application software was installed by PLCS Plus under contract from B&W Hanford Company. The application software was installed on an existing operator workstation in building 271-U which is owned and operated by Bechtel Hanford Inc.

  7. Investigating the Acquisition of Software Systems that Rely on Open Architecture and Open Source Software

    DTIC Science & Technology

    2010-03-01

    associated with certain software systems [Breaux and Anton 2008]. With this basis to build on, it is now possible to analyze the alignment of ... Kazman, R. (2003). Software Architecture in Practice, 2nd Edition, Addison-Wesley Professional, New York. ... Breaux, T.D. and Anton, A.I. (2008) ... calculus for license rights and obligations in license and context models. Using them, we calculate rights and obligations for specific systems, identify ...

  8. The HARNESS Workbench: Unified and Adaptive Access to Diverse HPC Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sunderam, Vaidy S.

    2012-03-20

    The primary goal of the Harness WorkBench (HWB) project is to investigate innovative software environments that will help enhance the overall productivity of applications science on diverse HPC platforms. Two complementary frameworks were designed: one, a virtualized command toolkit for application building, deployment, and execution that provides a common view across diverse HPC systems, in particular the DOE leadership computing platforms (Cray, IBM, SGI, and clusters); and two, a unified runtime environment that consolidates access to runtime services via an adaptive framework for execution-time and post-processing activities. A prototype of the first was developed based on the concept of a 'system-call virtual machine' (SCVM), to enhance the portability of the HPC application deployment process across heterogeneous high-end machines. The SCVM approach to portable builds is based on the insertion of toolkit-interpretable directives into original application build scripts. Modifications resulting from these directives preserve the semantics of the original build instruction flow. The execution of the build script is controlled by our toolkit, which intercepts build script commands in a manner transparent to the end-user. We have applied this approach to a scientific production code (Gamess-US) on the Cray-XT5 machine. The second facet, termed Unibus, aims to facilitate the provisioning and aggregation of multifaceted resources from the perspectives of resource providers and end-users. To achieve that, Unibus proposes a Capability Model and mediators (resource drivers) to virtualize access to diverse resources, and soft and successive conditioning to enable automatic and user-transparent resource provisioning. A proof-of-concept implementation has demonstrated the viability of this approach on high-end machines, grid systems and computing clouds.

  9. Strategies/Behaviors That Successful Superintendents Use to Build Strong Relationships and Trust during Their Entry Period

    ERIC Educational Resources Information Center

    Green, C. K.

    2012-01-01

    The purpose of the study was to identify strategies/behaviors that successful superintendents used to build strong relationships and trust with their school boards within their entry period. The following research questions guided the study: (1) What strategies/behaviors are successful superintendents using to build strong relationships and trust…

  10. Seismic vulnerability assessment to earthquake at urban scale: A case of Mostaganem city in Algeria

    PubMed Central

    Benanane, Abdelkader; Boutaraa, Zohra

    2018-01-01

    The focus of this study was the seismic vulnerability assessment of the buildings constituting Mostaganem city in Algeria. Situated 320 km to the west of Algiers, Mostaganem city encompasses a valuable cultural and architectural built heritage. The city has suffered several moderate earthquakes in recent years; this has led to extensive structural damage to old structures, especially unreinforced historical buildings. This study was divided into two essential steps, the first step being to establish fragility curves based on a non-linear static pushover analysis for each typology and height of buildings. Twenty-seven pushover analyses were performed by means of SAP2000 software (three analyses for each type of building). The second step was to adopt the US HAZUS software and to modify it to suit the typical setting and parameters of the city of Mostaganem. A seismic vulnerability analysis of Mostaganem city was conducted using HAZUS software after inputting the new parameters of the fragility curves established within the first step. The results indicated that the number of poor-quality buildings expected to be totally destroyed under a 5.5 Mw earthquake scenario could reach more than 28 buildings. Three percent of unreinforced masonry (URM) buildings were completely damaged and 10% were extensively damaged. Of the concrete frame buildings, 6% were extensively damaged and 19% were moderately damaged. By year of construction, 6% of both concrete frame and URM buildings built before 1980 are estimated to collapse. Buildings constructed between 1980 and 1999 are more resistant; 8% of those structures were extensively damaged and 18% were moderately damaged. Only 10% of buildings constructed after 1999 were moderately damaged. The results also show that the main hospital of the city, built before 1960, will be extensively damaged during an earthquake of 5.5 Mw. The number of human casualties could reach several hundred: 10.5% of residents of URM buildings are injured or dead. Compared with the URM buildings, concrete frame buildings have lower casualty rates of 1.5% and 0.5% for those built before and after 1980, respectively. It was concluded that Mostaganem city belongs to the seismically vulnerable zones of Algeria; in this regard, an action plan is needed for the rehabilitation of old constructions. In addition, the effectiveness of establishing and introducing new and appropriate fragility curves was demonstrated.
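
    For readers unfamiliar with the fragility-curve machinery used here (and in HAZUS generally), the sketch below evaluates the standard lognormal form P = Phi(ln(sd / sd_median) / beta), the probability of reaching or exceeding a damage state at spectral displacement sd. The medians and dispersions are invented for illustration and are not the curves derived for Mostaganem's building typologies.

    ```python
    import math

    def normal_cdf(z):
        # Standard normal CDF via the error function.
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def fragility(sd, sd_median, beta):
        # Lognormal fragility: P(damage state reached or exceeded | demand sd).
        return normal_cdf(math.log(sd / sd_median) / beta)

    # Assumed medians (cm) and dispersions for four damage states of one typology.
    STATES = {"slight": (1.2, 0.8), "moderate": (2.4, 0.8),
              "extensive": (6.0, 0.9), "complete": (14.0, 1.0)}

    sd_demand = 5.0  # cm, assumed spectral displacement demand
    for state, (median, beta) in STATES.items():
        print(f"P(>= {state:9}) = {fragility(sd_demand, median, beta):.2f}")
    ```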

  11. Determining position inside building via laser rangefinder and handheld computer

    DOEpatents

    Ramsey, Jr James L. [Albuquerque, NM; Finley, Patrick [Albuquerque, NM; Melton, Brad [Albuquerque, NM

    2010-01-12

    An apparatus, computer software, and a method of determining position inside a building comprising selecting on a PDA at least two walls of a room in a digitized map of a building or a portion of a building, pointing and firing a laser rangefinder at corresponding physical walls, transmitting collected range information to the PDA, and computing on the PDA a position of the laser rangefinder within the room.
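
    A hedged sketch of the geometric core of such a position fix, under the simplifying assumption that the two selected walls are perpendicular and axis-aligned in the floor-plan frame, so a range shot perpendicular to each wall fixes one coordinate. The actual patent works from the wall equations in the digitized map and the aim direction of each shot.

    ```python
    def position_from_ranges(range_to_west_wall, range_to_south_wall,
                             west_wall_x=0.0, south_wall_y=0.0):
        # With axis-aligned perpendicular walls, each perpendicular range
        # measurement fixes one coordinate of the rangefinder's position.
        x = west_wall_x + range_to_west_wall
        y = south_wall_y + range_to_south_wall
        return (x, y)

    print(position_from_ranges(3.2, 5.7))  # -> (3.2, 5.7) meters inside the room
    ```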

  12. Autotune Calibrates Models to Building Use Data

    ScienceCinema

    None

    2018-01-16

    Models of existing buildings are currently unreliable unless calibrated manually by a skilled professional. Autotune, as the name implies, automates this process by calibrating the model of an existing building to measured data, and is now available as open source software. This enables private businesses to incorporate Autotune into their products so that their customers can more effectively estimate cost savings of reduced energy consumption measures in existing buildings.
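
    A hedged sketch of the calibration idea: search a parameter space so that simulated energy use matches measured data. The toy surrogate 'simulate' function, the parameters, and the plain random search are stand-ins; the real tool drives full building-simulation models with more sophisticated (e.g., evolutionary) optimization.

    ```python
    import random

    MEASURED = [520.0, 480.0, 610.0]  # monthly kWh, assumed measurements

    def simulate(infiltration_ach, cop):
        # Toy surrogate standing in for a full building energy simulation.
        return [base * (1.0 + 0.2 * infiltration_ach) / cop
                for base in (900.0, 830.0, 1050.0)]

    def error(params):
        # Sum of squared differences between simulated and measured energy use.
        return sum((s - m) ** 2 for s, m in zip(simulate(*params), MEASURED))

    best, best_err = None, float("inf")
    for _ in range(5000):  # plain random search; real tools use smarter optimizers
        candidate = (random.uniform(0.1, 1.0), random.uniform(1.5, 4.0))
        e = error(candidate)
        if e < best_err:
            best, best_err = candidate, e

    print(f"calibrated infiltration={best[0]:.2f} ACH, COP={best[1]:.2f}")
    ```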

  13. Assigning Robust Default Values in Building Performance Simulation Software for Improved Decision-Making in the Initial Stages of Building Design.

    PubMed

    Hiyama, Kyosuke

    2015-01-01

    Applying data mining techniques on a database of BIM models could provide valuable insights into key design patterns implicitly present in these BIM models. The architectural designer would then be able to use previous data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has proposed a method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window-to-wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered to determine the ideal default values.

  14. Assigning Robust Default Values in Building Performance Simulation Software for Improved Decision-Making in the Initial Stages of Building Design

    PubMed Central

    2015-01-01

    Applying data mining techniques on a database of BIM models could provide valuable insights into key design patterns implicitly present in these BIM models. The architectural designer would then be able to use previous data from existing building projects as default values in building performance simulation software for the early phases of building design. The author has proposed a method to minimize the magnitude of the variation in these default values in subsequent design stages. This approach maintains the accuracy of the simulation results in the initial stages of building design. In this study, a more convincing argument is presented to demonstrate the significance of the new method. The variation in the ideal default values for different building design conditions is assessed first. Next, the influence of each condition on these variations is investigated. The space depth is found to have a large impact on the ideal default value of the window-to-wall ratio. In addition, the presence or absence of lighting control and natural ventilation has a significant influence on the ideal default value. These effects can be used to identify the types of building conditions that should be considered to determine the ideal default values. PMID:26090512

  15. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally-intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building in safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, William Eugene

    These slides describe different strategies for installing Python software. Although I am a big fan of Python software development, robust software installation remains a challenge. This talk describes several different installation scenarios. The Good: the user has administrative privileges - installing on Windows with an installer executable, installing with a Linux application utility, installing a Python package from the PyPI repository, and installing a Python package from source. The Bad: the user does not have administrative privileges - using a virtual environment to isolate package installations, and using an installer executable on Windows with a virtual environment. The Ugly: the user needs to install an extension package from source - installing a Python extension package from source, and PyCoinInstall - managing builds for Python extension packages. The last item, PyCoinInstall, refers to a utility being developed for the COIN-OR software, which is used within the operations research community. COIN-OR includes a variety of Python and C++ software packages, and this script uses a simple plug-in system to support the management of package builds and installation.
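
    For the "no administrative privileges" scenario, here is a minimal Python sketch of the virtual-environment approach mentioned in the slides; the environment path and package name are arbitrary examples.

    ```python
    import subprocess
    import venv

    # Create an isolated environment with its own pip, avoiding the
    # system-wide site-packages entirely (no admin rights needed).
    ENV_DIR = "myenv"
    venv.create(ENV_DIR, with_pip=True)

    # Install a package using the environment's own pip. On Windows the
    # executables live under Scripts\ instead of bin/.
    subprocess.check_call([f"{ENV_DIR}/bin/pip", "install", "requests"])
    ```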

  17. Software for Secondary-School Learning About Robotics

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Truong, Dat; Hodgson, Terry R.

    2005-01-01

    The ROVer Ranch is an interactive computer program designed to help secondary-school students learn about space-program robotics and related basic scientific concepts by involving the students in simplified design and programming tasks that exercise skills in mathematics and science. The tasks involve building simulated robots and then observing how they behave. The program furnishes (1) programming tools that a student can use to assemble and program a simulated robot and (2) a virtual three-dimensional mission simulator for testing the robot. First, the ROVer Ranch presents fundamental information about robotics, mission goals, and facts about the mission environment. On the basis of this information, and using the aforementioned tools, the student builds a simulated robot to accomplish its mission, selecting parts from such subsystems as propulsion, navigation, and scientific instruments. Once the robot is built, it is programmed and then placed in a three-dimensional simulated environment. Success or failure in the simulation depends on the planning and design of the robot. Data and results of the mission are available in a summary log once the mission is concluded.

  18. Loggers and Forest Fragmentation: Behavioral Models of Road Building in the Amazon Basin

    NASA Technical Reports Server (NTRS)

    Arima, Eugenio Y.; Walker, Robert T.; Perz, Stephen G.; Caldas, Marcellus

    2005-01-01

    Although a large literature now exists on the drivers of tropical deforestation, less is known about its spatial manifestation. This is a critical shortcoming in our knowledge base since the spatial pattern of land-cover change and forest fragmentation, in particular, strongly affect biodiversity. The purpose of this article is to consider emergent patterns of road networks, the initial proximate cause of fragmentation in tropical forest frontiers. Specifically, we address the road-building processes of loggers who are very active in the Amazon landscape. To this end, we develop an explanation of road expansions, using a positive approach combining a theoretical model of economic behavior with geographic information systems (GIS) software in order to mimic the spatial decisions of road builders. We simulate two types of road extensions commonly found in the Amazon basin in a region showing the fishbone pattern of fragmentation. Although our simulation results are only partially successful, they call attention to the role of multiple agents in the landscape, the importance of legal and institutional constraints on economic behavior, and the power of GIS as a research tool.

  19. Santa Fé: building a virtual city to develop a family health game.

    PubMed

    Tubelo, Rodrigo; Dahmer, Alessandra; Pinheiro, Luciana; Pinto, Maria E

    2013-01-01

    The current tendency in health education is the use of new technologies like Virtual Reality. The UNASUS-UFCSPA specialization course in family health was developed for health professionals who work in primary health care (PHC), in order to reach the whole Brazilian territory. Moodle is the platform where virtual activities are posted and evaluated. Santa Fé is a virtual city created in SketchUp Pro, which aims to host specific clinical cases involving matters of medicine, nursing and dentistry. The eAdventure software was the tool used for the development of a game, offering the student interaction with the virtual city and the clinical cases, from the perspective of learning through an entertainment method while evaluating the individual performance of the students. The building of the city in SketchUp Pro was successful and low-cost. eAdventure was an efficient and intuitive tool; its use did not require deep specific knowledge of technology, hardware with high-speed processing, or high-speed broadband internet.

  20. User News. Volume 17, Number 1 -- Spring 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This is a newsletter for users of the DOE-2, PowerDOE, SPARK, and BLAST building energy simulation programs. The topics for the Spring 1996 issue include the SPARK simulation environment, DOE-2 validation, listing of free fenestration software from LBNL, Web sites for building energy efficiency, the heat balance method of calculating building heating and cooling loads.

  1. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  2. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  3. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  4. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  5. 48 CFR 227.7203-6 - Contract clauses.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Software and Computer Software Documentation 227.7203-6 Contract clauses. (a)(1) Use the clause at 252.227-7014, Rights in Noncommercial Computer Software and Noncommercial Computer Software Documentation, in solicitations and contracts when the successful offeror(s) will be required to deliver computer software or...

  6. ObsPy: A Python toolbox for seismology - Sustainability, New Features, and Applications

    NASA Astrophysics Data System (ADS)

    Krischer, L.; Megies, T.; Sales de Andrade, E.; Barsch, R.; MacCarthy, J.

    2016-12-01

    ObsPy (https://www.obspy.org) is a community-driven, open-source project dedicated to offering a bridge for seismology into the scientific Python ecosystem. Among other things, it provides: read and write support, through a unified interface, for essentially every commonly used data format in seismology, covering waveform data as well as station and event meta-information; a signal processing toolbox tuned to the specific needs of seismologists; integrated access to the largest data centers, web services, and databases; and wrappers around third-party codes like libmseed and evalresp. Using ObsPy enables users to take advantage of the vast scientific ecosystem that has developed around Python. In contrast to many other programming languages and tools, Python is simple enough to enable the exploratory and interactive coding style desired by many scientists. At the same time it is a full-fledged programming language usable by software engineers to build complex and large programs. This combination makes it very suitable for use in seismology, where research code must often be translated into stable, production-ready environments, especially in the age of big data. ObsPy has seen constant development for more than six years and enjoys a large rate of adoption in the seismological community, with thousands of users. Successful applications include time-dependent and rotational seismology, big-data processing, event relocations, and synthetic studies of attenuation kernels and full-waveform inversions, to name a few examples. Additionally, it has sparked the development of several more specialized packages, slowly building a modern seismological ecosystem around it. We will present a short overview of the capabilities of ObsPy and point out several representative use cases and more specialized software built around ObsPy. Additionally, we will discuss new and upcoming features, as well as the sustainability of open-source scientific software.
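
    A short example of the unified interface described above: fetch an hour of waveform data from the IRIS FDSN web service, filter it, and plot it. The network/station codes and time window are arbitrary, commonly used public examples.

    ```python
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    # Integrated data-center access: request an hour of broadband vertical
    # data from station ANMO via the IRIS FDSN web service.
    client = Client("IRIS")
    t0 = UTCDateTime("2016-01-01T00:00:00")
    stream = client.get_waveforms(network="IU", station="ANMO",
                                  location="00", channel="BHZ",
                                  starttime=t0, endtime=t0 + 3600)

    # Signal processing with the unified Stream interface.
    stream.detrend("demean")
    stream.filter("bandpass", freqmin=0.1, freqmax=1.0)
    stream.plot()
    ```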

  7. LARGE BUILDING HVAC SIMULATION

    EPA Science Inventory

    The report discusses the monitoring and collection of data relating to indoor pressures and radon concentrations under several test conditions in a large school building in Bartow, Florida. The Florida Solar Energy Center (FSEC) used an integrated computational software, FSEC 3.0...

  8. pyam: Python Implementation of YaM

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan

    2012-01-01

    pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple-platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code-base is large. One side benefit is the enforcement of a strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open-source SVN software version control package. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.

  9. Assessment Environment for Complex Systems Software Guide

    NASA Technical Reports Server (NTRS)

    2013-01-01

    This Software Guide (SG) describes the software developed to test the Assessment Environment for Complex Systems (AECS) by the West Virginia High Technology Consortium (WVHTC) Foundation's Mission Systems Group (MSG) for the National Aeronautics and Space Administration (NASA) Aeronautics Research Mission Directorate (ARMD). This software is referred to as the AECS Test Project throughout the remainder of this document. AECS provides a framework for developing, simulating, testing, and analyzing modern avionics systems within an Integrated Modular Avionics (IMA) architecture. The purpose of the AECS Test Project is twofold. First, it provides a means to test the AECS hardware and system developed by MSG. Second, it provides an example project upon which future AECS research may be based. This Software Guide fully describes building, installing, and executing the AECS Test Project as well as its architecture and design. The design of the AECS hardware is described in the AECS Hardware Guide. Instructions on how to configure, build and use the AECS are described in the User's Guide. Sample AECS software, developed by the WVHTC Foundation, is presented in the AECS Software Guide. The AECS Hardware Guide, AECS User's Guide, and AECS Software Guide are authored by MSG. The requirements set forth for AECS are presented in the Statement of Work for the Assessment Environment for Complex Systems authored by NASA Dryden Flight Research Center (DFRC). The intended audience for this document includes software engineers, hardware engineers, project managers, and quality assurance personnel from WVHTC Foundation (the suppliers of the software), NASA (the customer), and future researchers (users of the software). Readers are assumed to have general knowledge in the field of real-time, embedded computer software development.

  10. Software quality in 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  11. Factors that Impact Software Project Success in Offshore Information Technology (IT) Companies

    ERIC Educational Resources Information Center

    Edara, Venkatarao

    2011-01-01

    Information technology (IT) projects are unsuccessful at a rate of 65% to 75% per year, in spite of employing the latest technologies and training employees. Although many studies have been conducted on project successes in U.S. companies, there is a lack of research studying the impact of various factors on software project success in offshore IT…

  12. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    ERIC Educational Resources Information Center

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  13. Investigation of wind behaviour around high-rise buildings

    NASA Astrophysics Data System (ADS)

    Mat Isa, Norasikin; Fitriah Nasir, Nurul; Sadikin, Azmahani; Ariff Hairul Bahara, Jamil

    2017-09-01

    A study of wind behaviour around high-rise buildings was conducted through wind tunnel experiments and computational fluid dynamics. High-rise buildings refer to buildings or structures that have more than 12 floors. Wind is invisible to the naked eye; thus, it is hard to see and analyse its flow around and over buildings without proper methods, such as a wind tunnel and computational fluid dynamics software. The study was conducted on buildings located in Presint 4, Putrajaya, Malaysia - the Ministry of Rural and Regional Development, the Ministry of Information Communications and Culture, the Ministry of Urban Wellbeing, Housing and Local Government, and the Ministry of Women, Family, and Community - by making scaled models of the buildings. The study parameters were four different wind velocities, based on the seasonal monsoons, and the wind direction. The ANSYS Fluent workbench software was used to compute the simulations in order to achieve the objectives of this study. The data from the computational fluid dynamics were validated against the wind tunnel experiment. From the results obtained through computational fluid dynamics, this study identifies the characteristics of wind around buildings, including the boundary layer of the buildings, flow separation, the wake region, etc. The occurrences resulting from the wind passing the buildings are then analysed based on the velocity difference before and after the wind passes the buildings.

  14. Systemic Vulnerabilities

    DTIC Science & Technology

    2014-10-01

    [Slide residue; recoverable fragments only: likely a "Critical Failures" slide from a Software Engineering Institute / Carnegie Mellon University presentation, including a basic attack tree ("Destroy Building" / "Generate Sufficient...")] ...by computer-security company marketing literature that touts "hacker proof software," "triple-DES security," and the like. In truth, unbreakable

  15. A review of building information modelling

    NASA Astrophysics Data System (ADS)

    Wang, Wen; Han, Rui

    2018-05-01

    Building Information Modelling (BIM) is widely seen as a catalyst for innovation and productivity. It is becoming standard for new construction and is the most significant technology changing how we design, build, use and manage buildings. It is a dominant technological trend in the software industry and, although the theoretical groundwork was laid in the previous century, it remains a popular topic in academic research. BIM is discussed in this study, whose results can provide better and more comprehensive choices for building owners, designers, and developers in the future.

  16. Addressing failures in exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snir, Marc; Wisniewski, Robert W.; Abraham, Jacob A.

    2014-05-01

    We present here a report produced by a workshop on "Addressing Failures in Exascale Computing" held in Park City, Utah, August 4-11, 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia; and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  17. Addressing Failures in Exascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snir, Marc; Wisniewski, Robert; Abraham, Jacob

    2014-01-01

    We present here a report produced by a workshop on 'Addressing failures in exascale computing' held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system, discuss existing knowledge on resilience across the various hardware and software layers of an exascale system, and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  18. Comparison of different approaches of modelling in a masonry building

    NASA Astrophysics Data System (ADS)

    Saba, M.; Meloni, D.

    2017-12-01

    The present work has the objective of modelling a simple masonry building through two different modelling methods, in order to assess their validity in terms of the evaluation of static stresses. Two of the most widely used commercial software packages for this kind of problem were chosen: 3Muri of S.T.A. Data S.r.l. and Sismicad12 of Concrete S.r.l. While the 3Muri software adopts the Frame by Macro Elements Method (FME), which should be more schematic and more efficient, the Sismicad12 software uses the Finite Element Method (FEM), which guarantees accurate results with a greater computational burden. Remarkable differences in the static stresses between the two approaches were found for such a simple structure, and an interesting comparison and analysis of the reasons is proposed.

  19. 3D modeling based on CityEngine

    NASA Astrophysics Data System (ADS)

    Jia, Guangyin; Liao, Kaiju

    2017-03-01

    Currently, there are many 3D modeling software packages, like 3ds Max, AutoCAD, and the more popular BIM software represented by Revit. The CityEngine modeling software introduced in this paper can fully utilize existing GIS data and combine it with other built models to perform 3D modeling of the interior and exterior of buildings in a rapid, batch manner, so as to improve 3D modeling efficiency.

  20. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  1. ControlShell - A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Ullman, Marc A.; Chen, Vincent W.

    1991-01-01

    ControlShell is designed to enable modular design and implementation of real-time software. It is an object-oriented tool-set for real-time software system programming. It provides a series of execution and data interchange mechanisms that form a framework for building real-time applications. These mechanisms allow a component-based approach to real-time software generation and management. By defining a set of interface specifications for intermodule interaction, ControlShell provides a common platform that is the basis for real-time code development and exchange.
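
    An illustrative sketch (in Python, not ControlShell's actual C++ API) of the component-based idea: modules expose a uniform execution interface and exchange data only through declared ports, so components can be composed or swapped without touching each other's internals.

    ```python
    class Component:
        # Uniform execution interface; data flows only through the port dicts.
        def __init__(self, name):
            self.name = name
            self.inputs = {}
            self.outputs = {}

        def execute(self, dt):
            raise NotImplementedError

    class PIDController(Component):
        def __init__(self):
            super().__init__("pid")
            self.kp, self.ki, self.integral = 2.0, 0.5, 0.0

        def execute(self, dt):
            err = self.inputs["setpoint"] - self.inputs["measurement"]
            self.integral += err * dt
            self.outputs["command"] = self.kp * err + self.ki * self.integral

    # A framework would wire ports between components and call execute()
    # on a fixed schedule; here we drive one component by hand.
    controller = PIDController()
    controller.inputs.update(setpoint=1.0, measurement=0.8)
    controller.execute(dt=0.01)
    print(controller.outputs["command"])
    ```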

  2. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  3. ACE: A distributed system to manage large data archives

    NASA Technical Reports Server (NTRS)

    Daily, Mike I.; Allen, Frank W.

    1993-01-01

    Competitive pressures in the oil and gas industry are requiring a much tighter integration of technical data into E and P business processes. The development of new systems to accommodate this business need must comprehend the significant numbers of large, complex data objects which the industry generates. The life cycle of the data objects is a four-phase progression from data acquisition, to data processing, through data interpretation, and ending finally with data archival. In order to implement a cost-effective system which provides an efficient conversion from data to information and allows effective use of this information, an organization must consider the technical data management requirements in all four phases. A set of technical issues which may differ in each phase must be addressed to ensure an overall successful development strategy. The technical issues include standardized data formats and media for data acquisition, data management during processing, plus networks, applications software, and GUIs for interpretation of the processed data. Mass storage hardware and software are required to provide cost-effective storage and retrieval during the latter three stages as well as long-term archival. Mobil Oil Corporation's Exploration and Producing Technical Center (MEPTEC) has addressed the technical and cost issues of designing, building, and implementing an Advanced Computing Environment (ACE) to support the petroleum E and P function, which is critical to the corporation's continued success. Mobil views ACE as a cost-effective solution which can give Mobil a competitive edge as well as a viable technical solution.

  4. Parametric Accuracy: Building Information Modeling Process Applied to the Cultural Heritage Preservation

    NASA Astrophysics Data System (ADS)

    Garagnani, S.; Manferdini, A. M.

    2013-02-01

    Since their introduction, modeling tools aimed at architectural design have evolved into today's "digital multi-purpose drawing boards" based on enhanced parametric elements able to originate whole buildings within virtual environments. Semantic splitting and element topology are features that allow objects to be "intelligent" (i.e., self-aware of what kind of element they are and with whom they can interact), representing in this way the basics of Building Information Modeling (BIM), a coordinated, consistent and always up-to-date workflow improved in order to reach higher quality, reliability and cost reductions all over the design process. Even if BIM was originally intended for new architectures, its ability to store semantically inter-related information can be successfully applied to existing buildings as well, especially if they deserve particular care such as Cultural Heritage sites. BIM engines can easily manage simple parametric geometries, collapsing them to standard primitives connected through hierarchical relationships; however, when components are generated from existing morphologies, for example by acquiring point clouds with digital photogrammetry or laser scanning equipment, complex abstractions have to be introduced while remodeling elements by hand, since automatic feature extraction in available software is still not effective. In order to introduce a methodology destined to process point cloud data in a BIM environment with high accuracy, this paper describes some experiences in monumental site documentation, generated through a plug-in written for Autodesk Revit and codenamed GreenSpider after its capability to lay out points in space as if they were nodes of an ideal cobweb.

  5. DeviceEditor visual biological CAD canvas

    PubMed Central

    2012-01-01

    Background Biological Computer Aided Design (bioCAD) assists the de novo design and selection of existing genetic components to achieve a desired biological activity, as part of an integrated design-build-test cycle. To meet the emerging needs of Synthetic Biology, bioCAD tools must address the increasing prevalence of combinatorial library design, design rule specification, and scar-less multi-part DNA assembly. Results We report the development and deployment of web-based bioCAD software, DeviceEditor, which provides a graphical design environment that mimics the intuitive visual whiteboard design process practiced in biological laboratories. The key innovations of DeviceEditor include visual combinatorial library design, direct integration with scar-less multi-part DNA assembly design automation, and a graphical user interface for the creation and modification of design specification rules. We demonstrate how biological designs are rendered on the DeviceEditor canvas, and we present effective visualizations of genetic component ordering and combinatorial variations within complex designs. Conclusions DeviceEditor liberates researchers from DNA base-pair manipulation, and enables users to create successful prototypes using standardized, functional, and visual abstractions. Open and documented software interfaces support further integration of DeviceEditor with other bioCAD tools and software platforms. DeviceEditor saves researcher time and institutional resources through correct-by-construction design, the automation of tedious tasks, design reuse, and the minimization of DNA assembly costs. PMID:22373390

  6. Systems Engineering Building Advances Power Grid Research

    ScienceCinema

    Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob

    2018-01-16

    Researchers and industry are now better equipped to tackle the nation’s most pressing energy challenges through PNNL’s new Systems Engineering Building – including challenges in grid modernization, buildings efficiency and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.

  7. Analysis of a school building damaged by the 2015 Ranau earthquake Malaysia

    NASA Astrophysics Data System (ADS)

    Takano, Shugo; Saito, Taiki

    2017-10-01

    On June 5th, 2015 a severe earthquake with a moment magnitude of 6.0 occurred in Ranau, Malaysia. The depth of the epicenter was 10 km. Due to the earthquake, many facilities were damaged and 18 people were killed by rockfalls [1]. Because the British Standard (BS) is adopted as the regulation for building construction in Malaysia, seismic forces are not considered in structural design. Therefore, the seismic resistance of Malaysian buildings is unclear. To secure human life and building safety, it is important to grasp the seismic resistance of buildings. The objective of this study is to evaluate the seismic resistance of existing buildings in Malaysia built to the British Standard. A school building that was damaged in the Ranau earthquake is selected as the target building. The building is a four-story building whose ground floor is designed as a parking space for the staff. The structural type is infill masonry, where the main frame is configured by reinforced concrete columns and beams and brick is installed inside the frame as walls. Analysis is performed using the STERA_3D software, developed by one of the authors to analyze the seismic performance of buildings. Firstly, the natural period of the building is calculated and compared with the result of micro-tremor measurement. Secondly, nonlinear push-over analysis is conducted to evaluate the horizontal load-bearing capacity of the building. Thirdly, earthquake response analysis is conducted using the time-history acceleration data measured during the Ranau earthquake by the seismograph installed at Kota Kinabalu. By comparing the results of the earthquake response analysis and the actual damage to the building, the cause of the damage is clarified.

  8. Managing a Real-Time Embedded Linux Platform with Buildroot

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diamond, J.; Martin, K.

    2015-01-01

    Developers of real-time embedded software often need to build the operating system, kernel, tools, and supporting applications from source to accommodate differences in their hardware configurations. The first attempts to introduce Linux-based real-time embedded systems into the Fermilab accelerator controls system used this approach, but it was found to be time-consuming, difficult to maintain, and difficult to adapt to different hardware configurations. Buildroot is an open source build system with a menu-driven configuration tool (similar to the Linux kernel build system) that automates this process. A customized Buildroot [1] system has been developed for use in the Fermilab accelerator controls system that includes several hardware configuration profiles (including Intel, ARM, and PowerPC) and packages for Fermilab support software. A bootable image file is produced containing the Linux kernel, shell, and supporting software suite, ranging from 3 to 20 megabytes in size, ideal for network booting. The result is a platform that is easier to maintain and deploy in diverse hardware configurations.
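
    As a rough sketch of the profile-per-configuration workflow described here, Buildroot supports out-of-tree builds in which a single source checkout produces a separate output tree per hardware profile. A small Python driver along these lines could batch-build several profiles; the defconfig names are examples from the upstream Buildroot tree, and the directory layout is assumed:

      import subprocess
      from pathlib import Path

      BUILDROOT = Path("buildroot")   # path to a Buildroot source checkout (assumed)
      PROFILES = ["qemu_x86_64_defconfig", "qemu_arm_vexpress_defconfig"]

      for profile in PROFILES:
          out = (Path("builds") / profile.replace("_defconfig", "")).resolve()
          out.mkdir(parents=True, exist_ok=True)
          # Out-of-tree build: O=<dir> keeps one output tree per profile.
          subprocess.run(["make", f"O={out}", profile], cwd=BUILDROOT, check=True)
          subprocess.run(["make", f"O={out}"], cwd=BUILDROOT, check=True)
          print(f"{profile}: images under {out}/images")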

  9. Development and utilization of USGS ShakeCast for rapid post-earthquake assessment of critical facilities and infrastructure

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel

    2017-01-01

    The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators, and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection priority, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical-user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. In order to make the software more accessible to novice users, while still utilizing advanced users’ technical and engineering background, we have developed a “ShakeCast Workbook”, a well-documented, Excel spreadsheet-based user interface that allows users to input notification and inventory data and export the XML files requisite for operating the ShakeCast system. Users will be able to select structure types based on a minimum set of user-specified facility attributes (building location, size, height, use, construction age, etc.). “Expert” users will be able to import user-modified structural response properties into the facility inventory associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential impact and inspection metrics (i.e., green, yellow, orange, and red priority ratings) to allow users to institute customized earthquake response protocols. Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and 0.3 and 1 s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap’s multi-period spectra in lieu of the assumed three-domain design spectrum (0.3 s for constant acceleration; 1 s or 3 s for constant velocity; and constant displacement at very long response periods). As part of ongoing ShakeCast research and development, we will also explore the use of ShakeMap IM uncertainty estimates and evaluate the assumption of employing multiple response spectral damping values rather than the single 5%-damped value currently employed. Developing and incorporating advanced fragility assignments into the ShakeCast Workbook requires related software modifications and database improvements; these enhancements are part of an extensive rewrite of the ShakeCast application.
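
    The green/yellow/orange/red rating idea can be sketched in a few lines of Python. The structure types and thresholds below are hypothetical, not ShakeCast's engineering fragility models (which, as noted above, are moving from single-IM approximations to capacity-spectrum calculations):

      # Illustrative only: structure types and thresholds are hypothetical,
      # not ShakeCast's engineering fragility models.
      FRAGILITY = {  # damage-state thresholds for 0.3 s spectral acceleration [g]
          "low_rise_concrete":  {"yellow": 0.10, "orange": 0.25, "red": 0.50},
          "steel_moment_frame": {"yellow": 0.15, "orange": 0.35, "red": 0.70},
      }

      def inspection_priority(structure_type: str, sa03_g: float) -> str:
          """Map a ShakeMap intensity measure to an inspection priority rating."""
          thresholds = FRAGILITY[structure_type]
          for state in ("red", "orange", "yellow"):   # most severe first
              if sa03_g >= thresholds[state]:
                  return state
          return "green"

      print(inspection_priority("low_rise_concrete", 0.30))   # -> orange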

  10. A molecular fragment cheminformatics roadmap for mesoscopic simulation.

    PubMed

    Truszkowski, Andreas; Daniel, Mirco; Kuhn, Hubert; Neumann, Stefan; Steinbeck, Christoph; Zielesny, Achim; Epple, Matthias

    2014-12-01

    Mesoscopic simulation studies the structure, dynamics and properties of large molecular ensembles with millions of atoms: its basic interacting units (beads) are no longer the nuclei and electrons of quantum chemical ab-initio calculations or the atom types of molecular mechanics, but molecular fragments, molecules or even larger molecular entities. For its simulation setup and output, a mesoscopic simulation kernel software uses abstract matrix (array) representations for bead topology and connectivity. Therefore a pure kernel-based mesoscopic simulation task is a tedious, time-consuming and error-prone venture that limits its practical use and application. A consequent cheminformatics approach tackles these problems and provides solutions for considerably enhanced accessibility. This study aims at outlining a complete cheminformatics roadmap that frames a mesoscopic Molecular Fragment Dynamics (MFD) simulation kernel to allow its efficient use and practical application. The molecular fragment cheminformatics roadmap consists of four consecutive building blocks: an adequate fragment structure representation (1), defined operations on these fragment structures (2), the description of compartments with defined compositions and structural alignments (3), and the graphical setup and analysis of a whole simulation box (4). The basis of the cheminformatics approach (i.e. building block 1) is a SMILES-like line notation (denoted fSMILES) with connected molecular fragments to represent a molecular structure. The fSMILES notation and the following concepts and methods for building blocks 2-4 are outlined with examples and practical usage scenarios. It is shown that the requirements of the roadmap may be partly covered by already existing open-source cheminformatics software. Mesoscopic simulation techniques like MFD may be considerably alleviated and broadened for practical use with a consequent cheminformatics layer that successfully tackles their setup subtleties and conceptual usage hurdles. Molecular fragment cheminformatics may be regarded as a crucial accelerator to propagate MFD and similar mesoscopic simulation techniques in the molecular sciences.
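
    To make the notation idea concrete, here is a deliberately simplified Python sketch of a fragment line notation in the spirit of, but far simpler than, the paper's fSMILES: a structure is reduced to named fragment beads plus the connectivity that a simulation kernel's topology matrices require.

      # Simplified sketch: the real fSMILES grammar is defined in the paper.
      # Here a structure is just named fragments joined by '-' in a linear chain.
      def parse_fragment_notation(fsmiles: str):
          """Return fragment beads and bonds between consecutive beads."""
          beads = fsmiles.split("-")
          bonds = [(i, i + 1) for i in range(len(beads) - 1)]
          return beads, bonds

      beads, bonds = parse_fragment_notation("H2O-H2O-Ethanol-H2O")
      print(beads)   # ['H2O', 'H2O', 'Ethanol', 'H2O']
      print(bonds)   # [(0, 1), (1, 2), (2, 3)] -> input for topology matrices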

  11. Considerations for the Evaluation of Software.

    ERIC Educational Resources Information Center

    Fields, Thomas A.

    1984-01-01

    The paper describes the decision process for determining software purchases for the professional's use of the microcomputer. Considerations for ensuring that the purchased software will be a success, determining which software will be needed, and options that will facilitate the location and evaluation of software are discussed. (Author/CL)

  12. SNPversity: A web-based tool for visualizing diversity

    USDA-ARS?s Scientific Manuscript database

    Background: Many stand-alone desktop software suites exist to visualize single nucleotide polymorphisms (SNP) diversity, but web-based software that can be easily implemented and used for biological databases is absent. SNPversity was created to answer this need by building an open-source visualizat...

  13. Teacher-Pedagogy Approach for Sustainable Proficiency

    ERIC Educational Resources Information Center

    Nath, Baiju K.; Balan, Meera

    2010-01-01

    Quality concerns of an institution can be explained in terms of hardware and software. The hardware comprises buildings and other infrastructural facilities, and the software involves teachers, students and administrative staff. Various agencies such as National Council for Educational Research & Training (NCERT), National Council for Teacher…

  14. RESEARCH AND DESIGN ABOUT VERSATILE 3D-CAD ENGINE FOR CONSTRUCTION

    NASA Astrophysics Data System (ADS)

    Tanaka, Shigenori; Kubota, Satoshi; Kitagawa, Etsuji; Monobe, Kantaro; Nakamura, Kenji

    In the construction field in Japan, building an environment in which 3D-CAD data can be used for CALS/EC, information-based construction, and productivity improvement is an important subject. At present, however, 3D-CAD software for the construction field does not exist. In order to support the development of domestic 3D-CAD software, it is therefore necessary to develop a 3D-CAD engine. In this research, to spread 3D-CAD software quickly and at low cost and to build an environment in which such software is usable, an investigation for designing a 3D-CAD engine is proposed. The targets of the investigation are the use scenes of 3D-CAD, the seeds accompanying 3D-CAD, standardization trends, existing products, and IT component engineering. Based on the results of the investigation, the functional requirements for a 3D-CAD engine for the construction field were determined.

  15. An easy-to-build, low-budget point-of-care ultrasound simulator: from Linux to a web-based solution.

    PubMed

    Damjanovic, Domagoj; Goebel, Ulrich; Fischer, Benedikt; Huth, Martin; Breger, Hartmut; Buerkle, Hartmut; Schmutz, Axel

    2017-12-01

    Hands-on training in point-of-care ultrasound (POC-US) should ideally comprise bedside teaching as well as simulated clinical scenarios. High-fidelity phantoms and portable ultrasound simulation systems are commercially available, however, at considerable cost. This limits their suitability for medical schools. A Linux-based software for Emergency Department Ultrasound Simulation (edus2™) was developed by Kulyk and Olszynski in 2011. Its feasibility for POC-US education has been well documented, and it shows good acceptance. An important limitation to an even more widespread use of edus2, however, may be the need for a virtual machine on Windows® systems. Our aim was to adapt the original software toward an HTML-based solution, thus making it affordable and applicable in any simulation setting. We created an HTML browser-based ultrasound simulation application, which reads the input of different sensors, triggering an ultrasound video to be displayed on a respective device. RFID tags, NFC tags, and QR Codes™ have been integrated into training phantoms or were attached to standardized patients. The RFID antenna was hidden in a mock ultrasound probe. Our application was used successfully with different trigger/scanner combinations and was integrated readily into simulated training scenarios. This low-cost, browser-based ultrasound simulator is easy to build, very adaptive, and independent of operating systems and electronic devices. It has the potential to facilitate POC-US training throughout the world, especially in resource-limited areas.
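
    The trigger-to-video mechanism at the heart of the application can be sketched in a few lines. The Python mapping below is a hypothetical stand-in for the browser application's lookup logic; tag IDs and clip names are invented:

      # Minimal sketch of the trigger->video idea, with hypothetical tag IDs.
      # In the real system an RFID/NFC reader or QR scan supplies the tag value.
      TAG_TO_CLIP = {
          "04:A2:3B:11": "videos/normal_lung_sliding.mp4",
          "04:A2:3B:12": "videos/pneumothorax.mp4",
          "QR:FAST-RUQ": "videos/positive_morison_pouch.mp4",
      }

      def on_tag_scanned(tag_id: str) -> str:
          """Return the ultrasound clip to display for a scanned tag."""
          return TAG_TO_CLIP.get(tag_id, "videos/probe_off_patient.mp4")

      print(on_tag_scanned("04:A2:3B:12"))  # -> videos/pneumothorax.mp4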

  16. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models at big scales: simulations over large areas at high resolution. The heterogeneity of available computing hardware, such as multi-core CPUs, GPUs or supercomputers, potentially provides significant computing power to fulfil this demand. However, exploiting it requires detailed knowledge of the underlying hardware, parallel algorithm design, and implementation in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on the exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs built-in capabilities to make full use of the available hardware. Developing such a framework that provides understandable code for domain scientists while being runtime efficient poses several challenges for framework developers. For example, optimisations can be performed on individual operations or on the whole model, and tasks need to be generated for well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks, and 2) the parallelisation of about 50 of these building blocks using the new Fern library (https://github.com/geoneric/fern/), an independent generic raster processing library. Fern is a highly generic software library and its algorithms can be configured according to the configuration of a modelling framework. With manageable programming effort (e.g. matching data types between the programming and domain languages) we created a binding between Fern and PCRaster. The resulting PCRaster Python multicore module can be used to execute existing PCRaster models without any changes to the model code. We show initial results on synthetic and geoscientific models indicating significant runtime improvements provided by parallel local and focal operations. We further outline challenges in improving the remaining algorithms, such as flow operations over digital elevation maps, and further potential improvements such as enhancing disk I/O.
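
    The two operation classes whose parallelisation is reported here, local and focal, can be illustrated generically. The sketch below uses NumPy and SciPy rather than PCRaster's own API, and the elevation grid is synthetic:

      import numpy as np
      from scipy import ndimage

      # Generic illustration (not PCRaster's API) of the two operation classes
      # the paper parallelises: local (cell-by-cell) and focal (window-based).
      elevation = np.random.default_rng(1).uniform(0.0, 100.0, size=(1000, 1000))

      # Local operation: each output cell depends only on the same input cell,
      # so cells can be processed fully independently.
      above_50 = elevation > 50.0

      # Focal operation: each output cell depends on a neighbourhood window
      # (here a 3x3 moving average), so parallel tiles need halo exchange.
      smoothed = ndimage.uniform_filter(elevation, size=3, mode="nearest")

      print(above_50.mean(), smoothed.shape)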

  17. ControlShell: A real-time software framework

    NASA Technical Reports Server (NTRS)

    Schneider, Stanley A.; Chen, Vincent W.; Pardo-Castellote, Gerardo

    1994-01-01

    The ControlShell system is a programming environment that enables the development and implementation of complex real-time software. It includes many building tools for complex systems, such as a graphical finite state machine (FSM) tool to provide strategic control. ControlShell has a component-based design, providing interface definitions and mechanisms for building real-time code modules along with providing basic data management. Some of the system-building tools incorporated in ControlShell are a graphical data flow editor, a component data requirement editor, and a state-machine editor. It also includes a distributed data flow package, an execution configuration manager, a matrix package, and an object database and dynamic binding facility. This paper presents an overview of ControlShell's architecture and examines the functions of several of its tools.
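
    As a rough illustration of the strategic-control idea behind such an FSM tool, a state machine reduces to a transition table keyed on (state, event) pairs. The Python below is illustrative only and unrelated to ControlShell's actual implementation:

      # Illustrative finite state machine for strategic control, in the spirit
      # of (but unrelated to) ControlShell's graphical FSM tool.
      TRANSITIONS = {
          ("idle",   "start"):    "moving",
          ("moving", "obstacle"): "paused",
          ("paused", "clear"):    "moving",
          ("moving", "arrived"):  "idle",
      }

      class StateMachine:
          def __init__(self, state: str = "idle"):
              self.state = state

          def fire(self, event: str) -> str:
              # Unknown (state, event) pairs leave the state unchanged.
              self.state = TRANSITIONS.get((self.state, event), self.state)
              return self.state

      fsm = StateMachine()
      for event in ("start", "obstacle", "clear", "arrived"):
          print(event, "->", fsm.fire(event))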

  18. Superintendents' Entry Periods: Strategies and Behaviors That Successful Superintendents Use to Build Strong Relationships and Trust with Their School Boards during Their Entry Period

    ERIC Educational Resources Information Center

    Howland, Sean J.

    2012-01-01

    The purpose of the study was to identify strategies/behaviors that successful superintendents used to build strong relationships and trust with their school boards during their entry periods. Three research questions guided the study: (1) What strategies/behaviors are successful superintendents using to build strong relationships and trust with…

  19. Finite element based simulation on friction stud welding of metal matrix composites to steel

    NASA Astrophysics Data System (ADS)

    Hynes, N. Rajesh Jesudoss; Tharmaraj, R.; Velu, P. Shenbaga; Kumar, R.

    2016-05-01

    Friction welding is a solid state joining technique used for joining similar and dissimilar materials with high integrity. The technique is being successfully applied in the aerospace, automobile, and shipbuilding industries, and is attracting more and more research interest. The quality of friction stud welded joints depends on the frictional heat generated at the interface. Hence, thermal analysis of friction stud welding of a stainless steel (AISI 304) and aluminium silicon carbide (AlSiC) combination is carried out in the present work. In this study, numerical simulation is carried out using ANSYS software and the temperature profiles are predicted at various increments of time. The developed numerical model is found to be adequate for predicting the temperature distribution of friction stud welded AlSiC/stainless steel joints.
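
    A drastically simplified stand-in for such a thermal model is one-dimensional transient conduction with a heat source at the weld interface, integrated by explicit finite differences. All material and heating values in this Python sketch are assumed for illustration and are not taken from the paper:

      import numpy as np

      # Deliberately simple 1-D stand-in for the paper's ANSYS model: explicit
      # finite differences for transient conduction along the stud, with a
      # crude frictional heat source at the weld interface (x = 0).
      alpha = 4e-6                      # thermal diffusivity [m^2/s] (assumed)
      L, nx = 0.02, 41                  # 20 mm stud discretised into 41 nodes
      dx = L / (nx - 1)
      dt = 0.4 * dx * dx / alpha        # satisfies the explicit stability limit
      heat_rate = 80.0                  # interface heating rate [K/s] (assumed)

      T = np.full(nx, 25.0)             # initial temperature [deg C]
      for step in range(200):
          T[0] = min(T[0] + heat_rate * dt, 450.0)   # capped interface heating
          T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
          T[-1] = 25.0                  # far end held at ambient
      print("interface %.1f C, mid-stud %.1f C" % (T[0], T[nx // 2]))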

  20. An evaluation of object-oriented image analysis techniques to identify motorized vehicle effects in semi-arid to arid ecosystems of the American West

    USGS Publications Warehouse

    Mladinich, C.

    2010-01-01

    Human disturbance is a leading ecosystem stressor. Human-induced modifications include transportation networks, areal disturbances due to resource extraction, and recreation activities. High-resolution imagery and object-oriented classification, rather than pixel-based techniques, have successfully identified roads, buildings, and other anthropogenic features. Three commercial, automated feature-extraction software packages (Visual Learning Systems' Feature Analyst, ENVI Feature Extraction, and Definiens Developer) were evaluated by comparing their ability to effectively detect the disturbed surface patterns from motorized vehicle traffic. Each package achieved overall accuracies in the 70% range, demonstrating the potential to map the surface patterns. The Definiens classification was more consistent and statistically valid. Copyright © 2010 by Bellwether Publishing, Ltd. All rights reserved.

  1. BIM and IoT: A Synopsis from GIS Perspective

    NASA Astrophysics Data System (ADS)

    Isikdag, U.

    2015-10-01

    Internet-of-Things (IoT) focuses on enabling communication between all devices and things, whether existent in real life or virtual. Building Information Models (BIMs) and Building Information Modelling have been buzzwords of the construction industry for the last 15 years. BIMs emerged as a result of a push by software companies to tackle the problems of inefficient information exchange between different software packages and to enable true interoperability. In the BIM approach, the most up-to-date and accurate models of a building are stored in shared central databases during the design and construction of a project and at post-construction stages. GIS-based city monitoring and city management applications require the fusion of information acquired from multiple resources: BIMs, city models, and sensors. This paper focuses on providing a method for facilitating the GIS-based fusion of information residing in digital building "Models" and information acquired from city objects, i.e., "Things". Once this information fusion is accomplished, many fields ranging from emergency response, urban surveillance, and urban monitoring to smart buildings will see potential benefits.

  2. Occupy Hard Drives: Making your work more valuable by giving it away

    NASA Astrophysics Data System (ADS)

    Weiner, Benjamin J.

    2014-01-01

    Astronomy is more than ever reliant on scientist-built software, but our systems of supporting research and giving credit for research work have failed to evolve with this reality. Both the perception of short term advantage, and an artificial distinction between "tools" and "science," lead to software and data remaining proprietary or unpublished. The lack of incentives to build and maintain software leads to both a decay of the software infrastructure, and a potential for growing class inequality, a pundit-technician divide. Top-down efforts to direct the field such as the recent US decadal survey have not adequately addressed this future. I argue that writing, freely releasing, and publishing your software is currently not adequately funded, rewarded, or credited, and that you should do it anyway. Writing your software as if you plan to release it is better for you and for the code. Releasing software can get credit from the rest of the community beyond your circle of collaborators or letter-writers, and it can benefit you and everyone else by making astronomy a better place to work. Building a culture of cooperation will be a more effective approach to reforming the system of credit than waiting for leadership from above or outside, but requires that each of us consciously encourage process, values, and behavior that support such a change.

  3. 76 FR 4864 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-27

    ...: National Institute of Standards and Technology (NIST). Title: BEES (Building for Environmental and Economic.... Needs and Uses: Building for Environmental and Economic Sustainability (BEES) is a voluntary... may be evaluated scientifically using the BEES software. These data include product-specific materials...

  4. Knowledge-intensive software design systems: Can too much knowledge be a burden?

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1992-01-01

    While acknowledging the considerable benefits of domain-specific, knowledge-intensive approaches to automated software engineering, it is prudent to carefully examine the costs of such approaches, as well. In adding domain knowledge to a system, a developer makes a commitment to understanding, representing, maintaining, and communicating that knowledge. This substantial overhead is not generally associated with domain-independent approaches. In this paper, I examine the downside of incorporating additional knowledge, and illustrate with examples based on our experience in building the SIGMA system. I also offer some guidelines for developers building domain-specific systems.

  6. Survivability as a Tool for Evaluating Open Source Software

    DTIC Science & Technology

    2015-06-01

    the thesis limited the program development, so it is only able to process project issues (bugs or feature requests), which is an important metric for... Ideally, these insights may provide an analytic framework to generate guidance for decision makers that may support the inclusion of OSS to more... refine their efforts to build quality software and to strengthen their software development communities. 1.4 Research Questions This thesis addresses

  7. User-Driven Quality Certification of Workplace Software, the UsersAward Experience

    DTIC Science & Technology

    2004-06-01

    the set of criteria and the chosen level of approval was sufficiently balanced. Furthermore, the fact that both software providers experienced... Worklife - Building Social Capacity - European Approaches, Edition sigma Berlin. Lind, T. (2002). IT-kartan, användare och IT-system i svenskt

  8. Communal Resources in Open Source Software Development

    ERIC Educational Resources Information Center

    Spaeth, Sebastian; Haefliger, Stefan; von Krogh, Georg; Renzl, Birgit

    2008-01-01

    Introduction: Virtual communities play an important role in innovation. The paper focuses on the particular form of collective action in virtual communities underlying Open Source software development projects. Method: Building on resource mobilization theory and private-collective innovation, we propose a theory of collective action in…

  9. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…

  10. Proceedings of the 14th Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Several software-related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis system.

  11. Software System Safety and the NASA Aeronautics Blueprint

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael; Hayhurst, Kelly J.

    2002-01-01

    NASA's Aeronautics Blueprint lays out a research agenda for the Agency's aeronautics program. The word software appears only four times in this Blueprint, but the critical importance of safe and correct software to the fulfillment of the proposed research is evident on almost every page. Most of the technology solutions proposed to address challenges in aviation are software-dependent technologies. Of the fifty-two specific technology solutions described in the Blueprint, forty-one depend, at least in part, on software for success. For thirty-five of these forty-one, software is critical not only to success, but also to human safety. That is, implementing the technology solutions will require using software in such a way that it may, if not specified, designed, and implemented properly, lead to fatal accidents. These results have at least two implications for the research based on the Blueprint: (1) knowledge about the current state-of-the-art and state-of-the-practice in software engineering and software system safety is essential, and (2) research into current unsolved problems in these software disciplines is also essential.

  12. Machine learning research 1989-90

    NASA Technical Reports Server (NTRS)

    Porter, Bruce W.; Souther, Arthur

    1990-01-01

    Multifunctional knowledge bases offer a significant advance in artificial intelligence because they can support numerous expert tasks within a domain. As a result they amortize the costs of building a knowledge base over multiple expert systems and they reduce the brittleness of each system. Due to the inevitable size and complexity of multifunctional knowledge bases, their construction and maintenance require knowledge engineering and acquisition tools that can automatically identify interactions between new and existing knowledge. Furthermore, their use requires software for accessing those portions of the knowledge base that coherently answer questions. Considerable progress was made in developing software for building and accessing multifunctional knowledge bases. A language was developed for representing knowledge, along with software tools for editing and displaying knowledge, a machine learning program for integrating new information into existing knowledge, and a question answering system for accessing the knowledge base.

  13. ATLAS software stack on ARM64

    NASA Astrophysics Data System (ADS)

    Smith, Joshua Wyatt; Stewart, Graeme A.; Seuster, Rolf; Quadt, Arnulf; ATLAS Collaboration

    2017-10-01

    This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build, as well as patches correcting platform-specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need little or no adjustment. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS 7. Selected results from the validation of the physics outputs on these ARM 64-bit servers are shown. CPU, memory and IO intensive benchmarks using the ATLAS-specific environment and infrastructure have been performed, with a particular emphasis on performance versus energy consumption.

  14. SSMNG Software Service Manager: A Scalable Building Blocks Architecture for PUS Services & FDIR Management

    NASA Astrophysics Data System (ADS)

    Lisio, Giovanni; Candia, Sante; Campolo, Giovanni; Pascucci, Dario

    2011-08-01

    Thales Alenia Space Italy has carried out the definition of a configurable (on a mission basis) PUS ECSS-E-70-41A (see [3]) Centralised Services Layer, characterised by: a mission-independent set of 'classes' implementing the services logic, and a mission-dependent set of configuration data and selection flags. The software components belonging to this layer implement the PUS standard services of ECSS-E-70-41A and a set of mission-specific services. The design of this layer has been performed by separating the services mechanisms (mission-independent execution logic) from the services configuration information (mission-dependent data). Once instantiated for a specific mission, the PUS Centralised Services Layer offers a large set of capabilities to the CSCI's Applications Layer. This paper describes the building-blocks PUS architectural solution developed by Thales Alenia Space Italy, emphasising the mechanisms that allow easy configuration of the scalable PUS library to fulfil the requirements of different missions. The paper also presents the Thales Alenia Space solution to automatically generate the mission-specific "PUS Services" flight software based on mission-specific requirements. Building the PUS services mechanisms configurable on a mission basis is part of the PRIMA (Multipurpose Spacecraft Bus) 'missionisation' process improvement. The PRIMA Platform Avionics Software (ASW) is continuously evolving to improve modularity and the standardisation of interfaces and of SW components (see references in [1]).

  15. GERICOS: A Generic Framework for the Development of On-Board Software

    NASA Astrophysics Data System (ADS)

    Plasson, P.; Cuomo, C.; Gabriel, G.; Gauthier, N.; Gueguen, L.; Malac-Allain, L.

    2016-08-01

    This paper presents an overview of the GERICOS framework (GEneRIC Onboard Software), its architecture, its various layers, and its future evolution. The GERICOS framework, developed and qualified by LESIA, offers a set of generic, reusable and customizable software components for the rapid development of payload flight software. The GERICOS framework has a layered structure. The first layer (GERICOS::CORE) implements the concept of active objects and forms an abstraction layer on top of real-time kernels. The second layer (GERICOS::BLOCKS) offers a set of reusable software components for building flight software based on generic solutions to recurrent functionalities. The third layer (GERICOS::DRIVERS) implements software drivers for several COTS IP cores of the LEON processor ecosystem.
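
    The active-object concept of GERICOS::CORE pairs each object with its own thread of control and a message queue. A minimal Python analogue, illustrative only since the real framework targets LEON processors and real-time kernels, looks like this:

      import queue
      import threading
      import time

      # Python analogue of the "active object" idea (illustrative only):
      # each object owns a thread and a mailbox and handles messages
      # asynchronously on its own thread.
      class ActiveObject:
          def __init__(self):
              self._mailbox = queue.Queue()
              threading.Thread(target=self._run, daemon=True).start()

          def send(self, message):
              self._mailbox.put(message)       # non-blocking for the caller

          def _run(self):
              while True:
                  self.handle(self._mailbox.get())

      class Telemetry(ActiveObject):
          def handle(self, message):
              print("packetizing", message)

      tm = Telemetry()
      tm.send({"apid": 100, "data": [1, 2, 3]})
      time.sleep(0.1)                          # let the worker thread drain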

  16. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  17. Busting out of crystallography's Sisyphean prison: from pencil and paper to structure solving at the press of a button: past, present and future of crystallographic software development, maintenance and distribution.

    PubMed

    Cranswick, Lachlan Michael David

    2008-01-01

    The history of crystallographic computing and use of crystallographic software is one which traces the escape from the drudgery of manual human calculations to a world where the user delegates most of the travail to electronic computers. In practice, this involves practising crystallographers communicating their thoughts to the crystallographic program authors, in the hope that new procedures will be implemented within their software. Against this background, the development of small-molecule single-crystal and powder diffraction software is traced. Starting with the analogue machines and the use of Hollerith tabulators of the late 1930s, it is shown that computing developments have been science-led, with new technologies being harnessed to solve pressing crystallographic problems. The development of software is also traced, with a final caution that few of the computations now performed daily are really understood by the program users. Unless a sufficient body of people continues to dismantle and re-build programs, the knowledge encoded in the old programs will become as inaccessible as the knowledge of how to build the Great Pyramid at Giza.

  18. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  19. Independent verification and validation for Space Shuttle flight software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Committee for Review of Oversight Mechanisms for Space Shuttle Software was asked by the National Aeronautics and Space Administration's (NASA) Office of Space Flight to determine the need to continue independent verification and validation (IV&V) for Space Shuttle flight software. The Committee found that the current IV&V process is necessary to maintain NASA's stringent safety and quality requirements for man-rated vehicles. Therefore, the Committee does not support NASA's plan to eliminate funding for the IV&V effort in fiscal year 1993. The Committee believes that the Space Shuttle software development process is not adequate without IV&V and that elimination of IV&V as currently practiced will adversely affect the overall quality and safety of the software, both now and in the future. Furthermore, the Committee was told that no organization within NASA has the expertise or the manpower to replace the current IV&V function in a timely fashion, nor will building this expertise elsewhere necessarily reduce cost. Thus, the Committee does not recommend moving IV&V functions to other organizations within NASA unless the current IV&V is maintained for as long as it takes to build comparable expertise in the replacing organization.

  20. BioSPICE: access to the most current computational tools for biologists.

    PubMed

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories, together with a methodology for continued software integration. These contributed software modules are unified by the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  1. AirNow Information Management System - Global Earth Observation System of Systems Data Processor for Real-Time Air Quality Data Products

    NASA Astrophysics Data System (ADS)

    Haderman, M.; Dye, T. S.; White, J. E.; Dickerson, P.; Pasch, A. N.; Miller, D. S.; Chan, A. C.

    2012-12-01

    Built upon the success of the U.S. Environmental Protection Agency's (EPA) AirNow program (www.AirNow.gov), the AirNow-International (AirNow-I) system contains an enhanced suite of software programs that process and quality control real-time air quality and environmental data and distribute customized maps, files, and data feeds. The goals of the AirNow-I program are similar to those of the successful U.S. program and include fostering the exchange of environmental data; making advances in air quality knowledge and applications; and building a community of people, organizations, and decision makers in environmental management. In 2010, Shanghai became the first city in China to run this state-of-the-art air quality data management and notification system. AirNow-I consists of a suite of modules (software programs and schedulers) centered on a database. One such module is the Information Management System (IMS), which can automatically produce maps and other data products through the use of GIS software to provide the most current air quality information to the public. Developed with Global Earth Observation System of Systems (GEOSS) interoperability in mind, IMS is based on non-proprietary standards, with preference to formal international standards. The system depends on data and information providers accepting and implementing a set of interoperability arrangements, including technical specifications for collecting, processing, storing, and disseminating shared data, metadata, and products. In particular, the specifications include standards for service-oriented architecture and web-based interfaces, such as a web mapping service (WMS), web coverage service (WCS), web feature service (WFS), sensor web services, and Really Simple Syndication (RSS) feeds. IMS is flexible, open, redundant, and modular. It also allows the merging of data grids to create complex grids that show comprehensive air quality conditions. For example, the AirNow Satellite Data Processor (ASDP) was recently developed to merge PM2.5 estimates from National Aeronautics and Space Administration (NASA) satellite data and AirNow observational data, creating more precise maps and gridded data products for under-monitored areas. The ASDP can easily incorporate other data feeds, including fire and smoke locations, to build enhanced real-time air quality data products. In this presentation, we provide an overview of the features and functions of IMS, an explanation of how data moves through IMS, the rationale of the system architecture, and highlights of the ASDP as an example of the modularity and scalability of IMS.
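
    The grid-merging step that the ASDP performs can be caricatured in a few lines of Python. Everything below, grid size, monitor values, and the inverse-distance weighting, is an invented toy rather than the AirNow fusion algorithm:

      import numpy as np

      # Toy version of the satellite/monitor merge: start from a gridded
      # satellite PM2.5 estimate, then nudge cells toward nearby monitors
      # with inverse-distance weights. All numbers are invented.
      sat = np.full((50, 50), 12.0)                   # satellite PM2.5 [ug/m^3]
      monitors = [((10, 12), 35.0), ((40, 8), 8.0)]   # (grid cell, observed value)

      yy, xx = np.mgrid[0:50, 0:50]
      merged = sat.copy()
      for (my, mx), value in monitors:
          dist = np.hypot(yy - my, xx - mx)
          weight = 1.0 / (1.0 + dist)        # 1 at the monitor, ~0 far away
          merged = (1.0 - weight) * merged + weight * value

      print(merged[10, 12], merged[25, 25])  # near-monitor vs background cell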

  2. The NASA Software Research Infusion Initiative: Successful Technology Transfer for Software Assurance

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.

    2006-01-01

    New processes, methods and tools are constantly appearing in the field of software engineering. Many of these promise great potential for improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will pay off in this particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques, and lack the feedback from practical applications necessary to help them evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team and the NASA Research Infusion Initiative, established by NASA's Software Engineering Initiative, to overcome these obstacles.

  3. What Are HyperCard? (Part 2).

    ERIC Educational Resources Information Center

    Marcus, Stephen

    1989-01-01

    Presents the second article in a two-part series on HyperCard materials (computer software used to build structures that create patterns and connections) designed for English and language arts classes. Suggests assignments for use with early HyperCard software that can be adapted to a variety of nonverbal "stackware." (MM)

  4. The Use of Geogebra Software as a Calculus Teaching and Learning Tool

    ERIC Educational Resources Information Center

    Nobre, Cristiane Neri; Meireles, Magali Rezende Gouvêa; Vieira, Niltom, Jr.; de Resende, Mônica Neli; da Costa, Lucivânia Ester; da Rocha, Rejane Corrêa

    2016-01-01

    Information and Communication Technologies (ICT) in education provide a new learning environment where students build their own knowledge through visualization and experimentation. This study evaluated the Geogebra software in the process of learning Calculus. It was observed that the proposed activities helped in the graphical…

  5. Engaging New Software.

    ERIC Educational Resources Information Center

    Allen, Denise

    1994-01-01

    Reviews three educational computer software products: (1) a compact disc-read only memory (CD-ROM) bundle of five mathematics programs from the Apple Education Series; (2) "Sammy's Science House," with science activities for preschool through second grade (Edmark); and (3) "The Cat Came Back," an interactive CD-ROM game designed to build language…

  6. Preventing Exploits Against Software of Uncertain Provenance (PEASOUP)

    DTIC Science & Technology

    2015-05-01

    Together with Strata’s fine-grained confinement, they build up a robust sandbox environment for SOUP that can resist most kinds of exploits... Guide to the World’s Most Popular Disassembler. 2008: No Starch Press. 60. Christey, S., 2011 CWE/SANS top 25 most dangerous software errors

  7. Organization of functional interaction of corporate information systems

    NASA Astrophysics Data System (ADS)

    Safronov, V. V.; Barabanov, V. F.; Podvalniy, S. L.; Nuzhnyy, A. M.

    2018-03-01

    In this article, methods for integrating specialized software systems are analyzed and a concept of seamless integration of production solutions is proposed. Structural and functional schemes of the specialized software, developed in view of this concept, are shown. The proposed schemes and models are refined for a machine-building enterprise.

  8. Building Databases for Education. ERIC Digest.

    ERIC Educational Resources Information Center

    Klausmeier, Jane A.

    This digest provides a brief explanation of what a database is; explains how a database can be used; identifies important factors that should be considered when choosing database management system software; and provides citations to sources for finding reviews and evaluations of database management software. The digest is concerned primarily with…

  9. Locking Down the Software Development Environment

    DTIC Science & Technology

    2014-12-01

    OpenSSL code [13]. The OpenSSL software is, as the name implies, open source, a result of many developers coding beginning in 1998 using the C programming language to build crypto services. OpenSSL is used widely both on the Internet and in firmware [13], further delaying the ability of many

  10. Constraint-Driven Software Design: An Escape from the Waterfall Model.

    ERIC Educational Resources Information Center

    de Hoog, Robert; And Others

    1994-01-01

    Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…

  11. Institutional Logics, Indie Software Developers and Platform Governance

    ERIC Educational Resources Information Center

    Qiu, Yixin

    2013-01-01

    This two-essay dissertation aims to study institutional logics in the context of Apple's independent third-party software developers. In essay 1, I investigate the embedded agency aspect of the institutional logics theory. It builds on the premise that logics constrain preferences, interests and behaviors of individuals and organizations, thereby…

  12. Improving Mathematics Learning of Kindergarten Students through Computer-Assisted Instruction

    ERIC Educational Resources Information Center

    Foster, Matthew E.; Anthony, Jason L.; Clements, Doug H.; Sarama, Julie; Williams, Jeffrey M.

    2016-01-01

    This study evaluated the effects of a mathematics software program, the Building Blocks software suite, on young children's mathematics performance. Participants included 247 Kindergartners from 37 classrooms in 9 schools located in low-income communities. Children within classrooms were randomly assigned to receive 21 weeks of computer-assisted…

  13. Four Pillars for Improving the Quality of Safety-Critical Software-Reliant Systems

    DTIC Science & Technology

    2013-04-01

    Studies of safety-critical software-reliant systems developed using the current build-then-test practices show that requirements and architecture design defects make up approximately 70% of all defects, many of them system-level defects related to operational quality attributes, and that 80% of these defects are

  14. [Construction of educational software about personality disorders].

    PubMed

    Botti, Nadja Cristiane Lappann; Carneiro, Ana Luíza Marques; Almeida, Camila Souza; Pereira, Cíntia Braga Silva

    2011-01-01

    The study describes the experience of building educational software in the area of mental health. The software was developed to enable nursing students to identify personality disorders. In this process, we applied the pedagogical framework of Vygotsky and the theoretical framework of the diagnostic criteria defined by the DSM-IV. From these references, characters with personality disorders were identified in stories and/or children's movies. The software's database was built with multimedia graphics, sound, and explanatory data. The software was developed as an educational game, with questions of increasing levels of difficulty, using Microsoft Office PowerPoint 2007. The authors believe in the validity of this teaching-learning strategy for the area of mental health nursing.

  15. Thermal dynamic simulation of wall for building energy efficiency under varied climate environment

    NASA Astrophysics Data System (ADS)

    Wang, Xuejin; Zhang, Yujin; Hong, Jing

    2017-08-01

    Aiming at different kinds of walls in five cities in different thermal design zones, and using the thermal instantaneous response factor method, the authors develop software to calculate air-conditioning cooling load temperatures, thermal response factors, and periodic response factors. On the basis of these data, the authors analyze how the dynamic thermal behavior of walls influences air-conditioning load and the indoor thermal environment in each thermal design zone, and put forward strategies for designing thermal insulation and heat preservation walls based on the dynamic thermal characteristics of walls in the different zones. This provides a theoretical basis and technical reference for further study of heat preservation and insulation in energy-saving wall design. Year-round dynamic thermal load simulation and energy consumption analysis for new energy-saving buildings are very important in building environment research. The software provides a referable scientific foundation for year-round dynamic thermal load simulation, energy consumption analysis, building environment system control, and further research on the thermal and general performance evaluation of new energy-saving walls. Based on this, building energy systems can be conveniently designed and building energy consumption analyzed to support scientific energy management.
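
    The periodic response factor method mentioned here computes hourly conductive heat gain as a weighted sum of past sol-air temperature differences, q(t) = sum_j Y_j (T_solair(t-j) - T_room). A toy Python version with an assumed, decaying factor series, not the authors' computed factors, shows the mechanics:

      import numpy as np

      # Illustrative periodic response factor calculation (not the authors'
      # software). Hourly conductive heat gain per unit wall area:
      #   q(t) = sum_j Y_j * (T_solair(t - j) - T_room)
      Y = 0.5 * 0.7 ** np.arange(24)     # assumed factors [W/(m^2 K)]
      T_room = 26.0                      # indoor design temperature [deg C]
      hours = np.arange(24)
      T_solair = 30.0 + 12.0 * np.sin(2.0 * np.pi * (hours - 9) / 24.0)

      q = np.array([
          sum(Y[j] * (T_solair[(t - j) % 24] - T_room) for j in range(24))
          for t in hours
      ])
      print("peak gain %.1f W/m^2 at hour %d" % (q.max(), q.argmax()))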

  16. Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Mah, S. B.; Cryderman, C. S.

    2015-08-01

    Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.

  17. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  18. Compositional Specification of Software Architecture

    NASA Technical Reports Server (NTRS)

    Penix, John; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes our experience using parameterized algebraic specifications to model properties of software architectures. The goal is to model the decomposition of requirements independent of the style used to implement the architecture. We begin by providing an overview of the role of architecture specification in software development. We then describe how architecture specifications are built up from component and connector specifications and give an overview of insights gained from a case study used to validate the method.

  19. Analysis of sulfates on low molecular weight heparin using mass spectrometry: structural characterization of enoxaparin.

    PubMed

    Gupta, Rohitesh; Ponnusamy, Moorthy P

    2018-05-31

    Structural characterization of low molecular weight heparin (LMWH) is critical to meet biosimilarity standards. In this context, the review focuses on structural analysis of labile sulfates attached to the side-groups of LMWH using mass spectrometry. A comprehensive review of this topic will help readers to identify key strategies for tackling the problem related to sulfate loss. At the same time, various mass spectrometry techniques are presented to facilitate compositional analysis of LMWH, mainly enoxaparin. Areas covered: This review summarizes findings on mass spectrometry application for LMWH, including modulation of sulfates, using enzymology and sample preparation approaches. Furthermore, popular open-source software packages for automated spectral data interpretation are also discussed. Successful use of LC/MS can decipher structural composition for LMWH and help evaluate their sameness or biosimilarity with the innovator molecule. Overall, the literature has been searched using PubMed by typing various search queries such as 'enoxaparin', 'mass spectrometry', 'low molecular weight heparin', 'structural characterization', etc. Expert commentary: This section highlights clinically relevant areas that need improvement to achieve satisfactory commercialization of LMWHs. It also primarily emphasizes the advancements in instrumentation related to mass spectrometry, and discusses building automated software for data interpretation and analysis.

  20. An Integrated Crustal Dynamics Simulator

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Mora, P.

    2007-12-01

    Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term and ongoing effort on finite element based computational model and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum strategy based finite-element computational model and software tool, PANDAS, for modelling 3-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors. It builds up a virtual laboratory to simulate interacting fault systems including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry and thermal coupling). It has been successfully applied to large scale computing of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties and complex geometries on supercomputers, such as the South Australia (SA) interacting fault system, the South California fault model, and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system, in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both are supported by the Australian Research Council.

  1. Avoid the four perils of CRM.

    PubMed

    Rigby, Darrell K; Reichheld, Frederick F; Schefter, Phil

    2002-02-01

    Customer relationship management is one of the hottest management tools today. But more than half of all CRM initiatives fail to produce the anticipated results. Why? And what can companies do to reverse that negative trend? The authors--three senior Bain consultants--have spent the past ten years analyzing customer-loyalty initiatives, both successful and unsuccessful, at more than 200 companies in a wide range of industries. They've found that CRM backfires in part because executives don't understand what they are implementing, let alone how much it will cost or how long it will take. The authors' research unveiled four common pitfalls that managers stumble into when trying to implement CRM. Each pitfall is a consequence of a single flawed assumption--that CRM is software that will automatically manage customer relationships. It isn't. Rather, CRM is the creation of customer strategies and processes to build customer loyalty, which are then supported by the technology. This article looks at best practices in CRM at several companies, including the New York Times Company, Square D, GE Capital, Grand Expeditions, and BMC Software. It provides an intellectual framework for any company that wants to start a CRM program or turn around a failing one.

  2. Instructional Software and Attention Disorders: A Tool for Teachers.

    ERIC Educational Resources Information Center

    Bice, Joe E.; And Others

    This handbook provides information on 31 software programs designed to instruct students with attention disorders in individual and group settings. The most successful applications of instructional software are identified, and six broad categories of instructional software are discussed. Twenty-one strategies for teaching students with attention…

  3. Application of the Software as a Service Model to the Control of Complex Building Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stadler, Michael; Donadee, Jonathan; Marnay, Chris

    2011-03-17

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analysed.
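
    The scheduling idea can be illustrated with a small peak-shaving linear program. The sketch below is a minimal stand-in, not WebOpt's actual formulation; the load profile, limits, and energy figures are made up.

      # Minimal peak-shaving LP sketch (illustrative, not WebOpt's model):
      # choose hourly dispatch x[t] of a shiftable load so that the peak
      # of (base load + x) over the day is minimized.
      import numpy as np
      from scipy.optimize import linprog

      T = 24
      rng = np.random.default_rng(0)
      base = 50 + 20 * rng.random(T)      # hypothetical base load profile (kW)
      energy_needed = 120.0               # total shiftable energy (kWh)
      x_max = 15.0                        # per-hour limit on the shiftable load (kW)

      # Decision vector: [x_0 .. x_23, p] where p is the peak demand.
      c = np.zeros(T + 1)
      c[-1] = 1.0                          # minimize p

      # base[t] + x[t] <= p  rewritten as  x[t] - p <= -base[t]
      A_ub = np.hstack([np.eye(T), -np.ones((T, 1))])
      b_ub = -base

      # Total energy delivered must equal the requirement.
      A_eq = np.hstack([np.ones((1, T)), np.zeros((1, 1))])
      b_eq = [energy_needed]

      bounds = [(0.0, x_max)] * T + [(0.0, None)]
      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print("peak demand (kW):", round(res.x[-1], 2))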

  4. Application of the Software as a Service Model to the Control of Complex Building Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stadler, Michael; Donadee, Jon; Marnay, Chris

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.

  5. The Web Resource Collaboration Center

    ERIC Educational Resources Information Center

    Dunlap, Joanna C.

    2004-01-01

    The Web Resource Collaboration Center (WRCC) is a web-based tool developed to help software engineers build their own web-based learning and performance support systems. Designed using various online communication and collaboration technologies, the WRCC enables people to: (1) build a learning and professional development resource that provides…

  6. BESTEST-EX | Buildings | NREL

    Science.gov Websites

    BESTEST-EX is a method for testing home energy audit software and associated calibration methods. When completed, the ANSI/RESNET SMOT will specify test procedures for evaluating the calibration methods used in conjunction with predicting building energy use.

  7. System engineering and management in a large and diverse multinational consortium

    NASA Astrophysics Data System (ADS)

    Wright, David; O'Sullivan, Brian; Thatcher, John; Renouf, Ian; Wright, Gillian; Wells, Martyn; Glasse, Alistair; Grozinger, Ulrich; Sykes, Jon; Smith, Dave; Eccleston, Paul; Shaughnessy, Bryan

    2008-07-01

    This paper elaborates the system engineering methods that are being successfully employed within the European Consortium (EC) to deliver the Optical System of the Mid-Infrared Instrument (MIRI) to the James Webb Space Telescope (JWST). The EC is a consortium of 21 institutes located in 10 European countries and, at instrument level, it works in a 50/50 partnership with JPL, which is providing the instrument cooler, software and detector systems. The paper will describe how the system engineering approach has been based upon proven principles used in the space industry but applied in a tailored way that best accommodates the differences in international practices and standards, with a primary aim of ensuring a cost-effective solution which supports all science requirements for the mission. The paper will recall how the system engineering has been managed from the definition of the system requirements in early phase B, through the successful Critical Design Review at the end of phase C, and up to the test and flight build activities that are presently in progress. Communication and coordination approaches will also be discussed.

  8. Distribution of a Generic Mission Planning and Scheduling Toolkit for Astronomical Spacecraft

    NASA Technical Reports Server (NTRS)

    Kleiner, Steven C.

    1996-01-01

    Work is progressing as outlined in the proposal for this contract. A working planning and scheduling system has been documented, packaged, and made available to the WIRE Small Explorer group at JPL, the FUSE group at JHU, the NASA/GSFC Laboratory for Astronomy and Solar Physics, and the Advanced Planning and Scheduling Branch at STScI. The package is running successfully on the WIRE computer system. It is expected that the WIRE project will reuse significant portions of the SWAS code in its system. The scheduling system itself was tested successfully against the spacecraft hardware in December 1995. A fully automatic scheduling module has been developed and is being added to the toolkit. In order to maximize reuse, the code is being reorganized during the current build into object-oriented class libraries. A paper describing the toolkit has been written and is included in the software distribution. We have experienced interference between the export and production versions of the toolkit. We will be requesting permission to reprogram funds in order to purchase a standalone PC onto which to offload the export version.

  9. The Principles for Successful Scientific Data Management Revisited

    NASA Astrophysics Data System (ADS)

    Walker, R. J.; King, T. A.; Joy, S. P.

    2005-12-01

    It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management that have provided the framework for modern space science data management. CODMAC outlined seven principles: 1. Scientific Involvement in all aspects of space science missions. 2. Scientific Oversight of all scientific data-management activities. 3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data, and complete documentation. 4. Facilities - A proper balance between cost and scientific productivity should be maintained. 5. Software - Transportable well documented software should be available to process and analyze the data. 6. Scientific Data Storage - The data should be preserved in retrievable form. 7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns. In this paper we will review the lessons learned in trying to apply these principles to space derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However even with early planning and agreement on standards the needs of the science community frequently far exceed the available resources. This is especially true for smaller principal investigator run missions. We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.

  10. Successful Strategies for Planning a Green Building.

    ERIC Educational Resources Information Center

    Browning, William D.

    2003-01-01

    Presents several strategies for successful green building on campus: develop a set of clear environmental performance goals (buildings as pedagogical tools, climate-neutral operations, maximized human performance), use Leadership in Energy and Environmental Design (LEED) as a gauge of performance, and use the project to reform the campus building…

  11. Software Development Offshoring Competitiveness: A Case Study of ASEAN Countries

    ERIC Educational Resources Information Center

    Bui, Minh Q.

    2011-01-01

    With the success of offshoring within the American software industry, corporate executives are moving their software developments overseas. The member countries of the Association of Southeast Asian Nations (ASEAN) have become a preferred destination. However, there is a lack of published studies on the region's software competitiveness in…

  12. Using component technology to facilitate external software reuse in ground-based planning systems

    NASA Technical Reports Server (NTRS)

    Chase, A.

    2003-01-01

    APGEN (Activity Plan GENerator - 314), a multi-mission planning tool, must interface with external software to best serve its users. APGEN's original method for incorporating external software, the User-Defined library mechanism, has been very successful in allowing APGEN users access to external software functionality.

  13. WIRM: An Open Source Toolkit for Building Biomedical Web Applications

    PubMed Central

    Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.

    2002-01-01

    This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108

  14. Design Automation in Synthetic Biology.

    PubMed

    Appleton, Evan; Madsen, Curtis; Roehner, Nicholas; Densmore, Douglas

    2017-04-03

    Design automation refers to a category of software tools that work together in a workflow for designing, building, testing, and analyzing systems with a target behavior. In synthetic biology, these tools are called bio-design automation (BDA) tools. In this review, we discuss the BDA tool areas (specify, design, build, test, and learn) and introduce the existing software tools designed to solve problems in these areas. We then detail the functionality of some of these tools and show how they can be used together to create the desired behavior of two types of modern synthetic genetic regulatory networks. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  15. Ontology for Life-Cycle Modeling of Water Distribution Systems: Model View Definition

    DTIC Science & Technology

    2013-06-01

    Research at the Engineer Research and Development Center, Construction Engineering Research Laboratory (ERDC-CERL) to develop a life-cycle building model has resulted in the definition of a "core" building information model. Using commercial off-the-shelf (COTS) software, the researchers developed experimental BIM models representing three types of typical low-rise Army buildings.

  16. Commercial Building Energy Saver, API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon

    2015-08-27

    The CBES API provides an Application Programming Interface to a suite of functions for improving the energy efficiency of buildings, including building energy benchmarking, preliminary retrofit analysis using the pre-simulation database DEEP, and detailed retrofit analysis using energy modeling with the EnergyPlus simulation engine. The CBES API is used to power the LBNL CBES Web App. It can be adopted by third-party developers and vendors into their software tools and platforms.
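
    A third-party tool would call such an API over the web. The sketch below is hypothetical: the endpoint URL, parameter names, and response fields are illustrative placeholders, not the documented CBES interface.

      # Hypothetical sketch of calling a benchmarking web API like CBES;
      # the URL, parameters, and response fields below are illustrative,
      # not the documented CBES interface.
      import requests

      API_ROOT = "https://example.lbl.gov/cbes/api"   # placeholder endpoint

      payload = {
          "building_type": "office",
          "floor_area_m2": 5000,
          "climate_zone": "3C",
          "annual_electricity_kwh": 650000,
      }
      resp = requests.post(f"{API_ROOT}/benchmark", json=payload, timeout=30)
      resp.raise_for_status()
      result = resp.json()
      print("benchmark percentile:", result.get("percentile"))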

  17. CosmoQuest Transient Tracker: Opensource Photometry & Astrometry software

    NASA Astrophysics Data System (ADS)

    Myers, Joseph L.; Lehan, Cory; Gay, Pamela; Richardson, Matthew; CosmoQuest Team

    2018-01-01

    CosmoQuest is moving from online citizen science to observational astronomy with the creation of Transient Tracker. This open source software is designed to identify asteroids and other transient/variable objects in image sets. Transient Tracker's features in final form will include astrometric and photometric solutions, identification of moving/transient objects, identification of variable objects, and lightcurve analysis. In this poster we present our initial v0.1 release and seek community input. This software builds on the existing NIH-funded ImageJ libraries. Creation of this suite of open source image manipulation routines is led by Wayne Rasband and is released primarily under the MIT license. In this release, we are building on these libraries to add source identification for point / point-like sources, and to do astrometry. Our materials are released under the Apache 2.0 license on GitHub (http://github.com/CosmoQuestTeam) and documentation can be found at http://cosmoquest.org/TransientTracker.

  18. Introduction to Financial Projection Models. Business Management Instructional Software.

    ERIC Educational Resources Information Center

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  19. The MDE Diploma: First International Postgraduate Specialization in Model-Driven Engineering

    ERIC Educational Resources Information Center

    Cabot, Jordi; Tisi, Massimo

    2011-01-01

    Model-Driven Engineering (MDE) is changing the way we build, operate, and maintain our software-intensive systems. Several projects using MDE practices are reporting significant improvements in quality and performance but, to be able to handle these projects, software engineers need a set of technical and interpersonal skills that are currently…

  20. A UNIMARC Bibliographic Format Database for ABCD

    ERIC Educational Resources Information Center

    Megnigbeto, Eustache

    2012-01-01

    Purpose: ABCD is a web-based open and free software suite for library management derived from the UNESCO CDS/ISIS software technology. The first version was launched officially in December 2009 with a MARC 21 bibliographic format database. This paper aims to detail the building of the UNIMARC bibliographic format database for ABCD.…

  2. Generic Software Architecture for Launchers

    NASA Astrophysics Data System (ADS)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

    The definition and reuse of a generic software architecture for launchers is not so usual, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega over the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying ad hoc development for each launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On Board Control Procedure, etc.). While some of these reasons remain valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of generic software architecture which could be envisaged for future launchers, based on the principles described above and supported by model-driven engineering and automatic code generation.
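
    The time-triggered idea can be sketched in a few lines: tasks run in fixed slots of a static schedule, which makes execution order and timing deterministic. The task names and periods below are illustrative, not taken from the paper.

      # Minimal time-triggered executive sketch (illustrative): a static
      # table of tasks is executed cyclically, one per minor cycle, giving
      # a deterministic order and period, as in time-triggered middleware.
      import time

      def read_sensors():      print("read sensors")
      def navigation():        print("update navigation")
      def mission_manager():   print("mission/vehicle management")

      MINOR_CYCLE_S = 0.010                       # 10 ms minor cycle
      SCHEDULE = [read_sensors, navigation,       # slot 0, slot 1, ...
                  read_sensors, mission_manager]  # repeats every 4 slots

      def run(cycles):
          next_tick = time.monotonic()
          for i in range(cycles):
              SCHEDULE[i % len(SCHEDULE)]()       # task for this slot
              next_tick += MINOR_CYCLE_S
              time.sleep(max(0.0, next_tick - time.monotonic()))

      run(8)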

  3. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.

    PubMed

    Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E

    2009-08-25

    Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.
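
    The adaptive semantic-zooming idea can be sketched simply: a glyph renders differently depending on the current zoom level. Python stands in here for the Java SDK, and the gene data and thresholds are made up.

      # Semantic-zoom sketch (illustrative): a glyph changes its
      # representation with the zoom level, in the spirit of Genoviz's
      # adaptive semantic zooming (the real SDK is Java).
      def render_gene(glyph, zoom):
          if zoom < 0.1:
              return f"[{glyph['name']}]"                       # compact box
          if zoom < 1.0:
              return f"{glyph['name']}: {len(glyph['exons'])} exons"
          return f"{glyph['name']}: exons at {glyph['exons']}"  # full detail

      gene = {"name": "BRCA2", "exons": [120, 480, 910]}
      for z in (0.05, 0.5, 2.0):
          print(f"zoom={z}:", render_gene(gene, z))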

  4. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    NASA Astrophysics Data System (ADS)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-08-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In this study, domain ontology about the field of lead-lag compensator design has been built and used for automatic exercise generation, graphical user interface population and interaction with the user at any level of detail, including explanations about why things occur. An application called Onto-CELE (ontology-based control engineering learning environment) uses the ontology for implementing a learning environment that can be used for self and lifelong learning purposes. The experience has shown that the use of knowledge models as the basis for educational software applications is capable of showing students the whole complexity of the analysis and design processes at any level of detail. A practical experience with postgraduate students has shown the mentioned benefits and possibilities of the approach.

  5. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  6. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.

  7. Applying object-oriented software engineering at the BaBar collaboration

    NASA Astrophysics Data System (ADS)

    Jacobsen, Bob; BaBar Collaboration Reconstruction Software Group

    1997-02-01

    The BaBar experiment at SLAC will start taking data in 1999. We are attempting to build its reconstruction software using good software engineering practices, including the use of object-oriented technology. We summarize our experience to date with analysis and design activities, training, CASE and documentation tools, C++ programming practice and similar topics. The emphasis is on the practical issues of simultaneously introducing new techniques to a large collaboration while under a deadline for system delivery.

  8. Building Energy Monitoring and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, Tianzhen; Feng, Wei; Lu, Alison

    This project aimed to develop a standard methodology for building energy data definition, collection, presentation, and analysis; apply the developed methods to a standardized energy monitoring platform, including hardware and software, to collect and analyze building energy use data; and compile offline statistical data and online real-time data in both countries for fully understanding the current status of building energy use. This helps decode the driving forces behind the discrepancy of building energy use between the two countries; identify gaps and deficiencies of current building energy monitoring, data collection, and analysis; and create knowledge and tools to collect and analyze good building energy data to provide valuable and actionable information for key stakeholders.

  9. Building a virtual network in a community health research training program.

    PubMed

    Lau, F; Hayward, R

    2000-01-01

    To describe the experiences, lessons, and implications of building a virtual network as part of a two-year community health research training program in a Canadian province. An action research field study in which 25 health professionals from 17 health regions participated in a seven-week training course on health policy, management, economics, research methods, data analysis, and computer technology. The participants then returned to their regions to apply the knowledge in different community health research projects. Ongoing faculty consultations and support were provided as needed. Each participant was given a notebook computer with the necessary software, Internet access, and technical support for two years, to access information resources, engage in group problem solving, share ideas and knowledge, and collaborate on projects. Data collected over two years consisted of program documents, records of interviews with participants and staff, meeting notes, computer usage statistics, automated online surveys, computer conference postings, program Web site, and course feedback. The analysis consisted of detailed review and comparison of the data from different sources. NUD*IST was then used to validate earlier study findings. The ten key lessons are that role clarity, technology vision, implementation staging, protected time, just-in-time training, ongoing facilitation, work integration, participatory design, relationship building, and the demonstration of results are essential ingredients for building a successful network. This study provides a descriptive model of the processes involved in developing, in the community health setting, virtual networks that can be used as the basis for future research and as a practical guide for managers.

  10. Crawling The Web for Libre: Selecting, Integrating, Extending and Releasing Open Source Software

    NASA Astrophysics Data System (ADS)

    Truslove, I.; Duerr, R. E.; Wilcox, H.; Savoie, M.; Lopez, L.; Brandt, M.

    2012-12-01

    Libre is a project developed by the National Snow and Ice Data Center (NSIDC). Libre is devoted to liberating science data from its traditional constraints of publication, location, and findability. Libre embraces and builds on the notion of making knowledge freely available, and both Creative Commons licensed content and Open Source Software are crucial building blocks for, as well as required deliverable outcomes of the project. One important aspect of the Libre project is to discover cryospheric data published on the internet without prior knowledge of the location or even existence of that data. Inspired by well-known search engines and their underlying web crawling technologies, Libre has explored tools and technologies required to build a search engine tailored to allow users to easily discover geospatial data related to the polar regions. After careful consideration, the Libre team decided to base its web crawling work on the Apache Nutch project (http://nutch.apache.org). Nutch is "an open source web-search software project" written in Java, with good documentation, a significant user base, and an active development community. Nutch was installed and configured to search for the types of data of interest, and the team created plugins to customize the default Nutch behavior to better find and categorize these data feeds. This presentation recounts the Libre team's experiences selecting, using, and extending Nutch, and working with the Nutch user and developer community. We will outline the technical and organizational challenges faced in order to release the project's software as Open Source, and detail the steps actually taken. We distill these experiences into a set of heuristics and recommendations for using, contributing to, and releasing Open Source Software.

  11. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes-based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information in domain-specific models that are utilized by ABD. The focus of ABD is on software factories: family-based development artefacts (domain-specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group at the Leeds Institute of Molecular Medicine, University of Leeds.
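
    As an illustration of what an archetype pattern can look like in code, here is a toy rendering of the well-known Party archetype pattern; the fields and example objects are illustrative, not the authors' LIMS model.

      # Toy rendering of the "Party" archetype pattern (illustrative):
      # Party occurs universally across business domains; Person and
      # Organization are concrete archetypes, and PartyRelationship is
      # a collaboration of archetypes.
      from dataclasses import dataclass, field

      @dataclass
      class Party:
          identifier: str
          addresses: list = field(default_factory=list)

      @dataclass
      class Person(Party):
          name: str = ""

      @dataclass
      class Organization(Party):
          legal_name: str = ""

      @dataclass
      class PartyRelationship:
          client: Party
          supplier: Party
          role: str

      lab = Organization("org-1", legal_name="Proteomics Group")
      analyst = Person("per-7", name="A. Scientist")
      rel = PartyRelationship(client=lab, supplier=analyst, role="employment")
      print(rel.role, "-", rel.supplier.name)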

  12. Common Object Library Description

    DTIC Science & Technology

    2012-08-01

    For Building Information Modeling (BIM) technology to be successful, it must be consistently applied across many projects, by many teams. This report describes a common object library intended to support national BIM standards and future research projects.

  13. Panoramic-image-based rendering solutions for visualizing remote locations via the web

    NASA Astrophysics Data System (ADS)

    Obeysekare, Upul R.; Egts, David; Bethmann, John

    2000-05-01

    With advances in panoramic image-based rendering techniques and the rapid expansion of web advertising, new techniques are emerging for visualizing remote locations on the WWW. Success of these techniques depends on how easy and inexpensive it is to develop a new type of web content that provides pseudo 3D visualization at home, 24-hours a day. Furthermore, the acceptance of this new visualization medium depends on the effectiveness of the familiarization tools by a segment of the population that was never exposed to this type of visualization. This paper addresses various hardware and software solutions available to collect, produce, and view panoramic content. While cost and effectiveness of building the content is being addressed using a few commercial hardware solutions, effectiveness of familiarization tools is evaluated using a few sample data sets.

  14. pySPACE—a signal processing and classification environment in Python

    PubMed Central

    Krell, Mario M.; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H.; Kirchner, Elsa A.; Kirchner, Frank

    2013-01-01

    In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically on time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE originally has been built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described. It is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows to define own algorithms, or to integrate and use already existing libraries. PMID:24399965
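
    Node chains in pySPACE are configured in YAML. The snippet below gives the flavor of such a specification parsed with PyYAML; the node names and parameters are examples in the spirit of pySPACE, not a verbatim configuration from its documentation.

      # Illustrative node-chain specification in YAML, parsed with PyYAML.
      # The node names and parameters are examples in the spirit of
      # pySPACE, not a verbatim configuration.
      import yaml

      spec = """
      - node: FFTBandPassFilter
        parameters: {pass_band: [0.4, 4.0]}
      - node: Subsampling
        parameters: {target_frequency: 25.0}
      - node: LibSVMClassifier
        parameters: {complexity: 1.0}
      """

      chain = yaml.safe_load(spec)
      for step in chain:
          print(step["node"], step.get("parameters", {}))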

  15. pySPACE-a signal processing and classification environment in Python.

    PubMed

    Krell, Mario M; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H; Kirchner, Elsa A; Kirchner, Frank

    2013-01-01

    In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically on time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE originally has been built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described. It is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows to define own algorithms, or to integrate and use already existing libraries.

  16. artdaq: DAQ software development made simple

    NASA Astrophysics Data System (ADS)

    Biery, Kurt; Flumerfelt, Eric; Freeman, John; Ketchum, Wesley; Lukhanin, Gennadiy; Rechenmacher, Ron

    2017-10-01

    For a few years now, the artdaq data acquisition software toolkit has provided numerous experiments with ready-to-use components which allow for rapid development and deployment of DAQ systems. Developed within the Fermilab Scientific Computing Division, artdaq provides data transfer, event building, run control, and event analysis functionality. This latter feature includes built-in support for the art event analysis framework, allowing experiments to run art modules for real-time filtering, compression, disk writing and online monitoring. As art, also developed at Fermilab, is used for offline analysis as well, a major advantage of artdaq is that it allows developers to easily switch between developing online and offline software. artdaq continues to be improved. Support for an alternate mode of running, whereby data from some subdetector components are only streamed if requested, has been added; this option will reduce unnecessary DAQ throughput. Real-time reporting of DAQ metrics has been implemented, along with the flexibility to choose the format through which experiments receive the reports; these formats include the Ganglia, Graphite and syslog software packages, along with flat ASCII files. Additionally, work has been performed investigating more flexible modes of online monitoring, including the capability to run multiple online monitoring processes on different hosts, each running its own set of art modules. Finally, a web-based GUI interface through which users can configure details of their DAQ system has been implemented, increasing the ease of use of the system. Already successfully deployed on the LArIAT, DarkSide-50, DUNE 35ton and Mu2e experiments, artdaq will be employed for SBND and is a strong candidate for use on ICARUS and protoDUNE. With each experiment come new ideas for how artdaq can be made more flexible and powerful. The above improvements will be described, along with potential ideas for the future.
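
    The event-building step can be sketched conceptually: fragments from several readout sources are matched by event number and emitted once every source has contributed. A toy stand-in follows (artdaq itself is C++ over MPI; the sources and payloads below are made up).

      # Conceptual event-builder sketch (not artdaq's implementation):
      # fragments arriving from several readout sources are grouped by
      # event number and emitted once every source has contributed.
      from collections import defaultdict

      N_SOURCES = 3
      pending = defaultdict(dict)          # event_number -> {source: payload}

      def build_event(event_number, fragments):
          print(f"event {event_number} built from {sorted(fragments)}")

      def on_fragment(event_number, source, payload):
          pending[event_number][source] = payload
          if len(pending[event_number]) == N_SOURCES:
              build_event(event_number, pending.pop(event_number))

      arrivals = [(1, "tpc"), (1, "pmt"), (2, "tpc"), (1, "crt"),
                  (2, "pmt"), (2, "crt")]                 # interleaved arrivals
      for evt, src in arrivals:
          on_fragment(evt, src, payload=b"...")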

  17. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) with which users can extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, with a description of how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
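
    A minimal sketch of driving the SDK programmatically, assuming the OpenStudio Python bindings distributed as the openstudio pip package; the method names follow the SDK's Model interface but should be verified against the installed version.

      # Minimal sketch using the OpenStudio SDK Python bindings (the
      # `openstudio` pip package is assumed); API names follow the SDK's
      # Model interface but should be checked against the version in use.
      import openstudio

      model = openstudio.model.Model()            # empty building model
      space = openstudio.model.Space(model)       # add one space
      space.setName("Office 101")
      zone = openstudio.model.ThermalZone(model)  # and a thermal zone
      space.setThermalZone(zone)

      print("spaces in model:", len(model.getSpaces()))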

  18. Scalable software architectures for decision support.

    PubMed

    Musen, M A

    1999-12-01

    Interest in decision-support programs for clinical medicine soared in the 1970s. Since that time, workers in medical informatics have been particularly attracted to rule-based systems as a means of providing clinical decision support. Although developers have built many successful applications using production rules, they also have discovered that creation and maintenance of large rule bases is quite problematic. In the 1980s, several groups of investigators began to explore alternative programming abstractions that can be used to build decision-support systems. As a result, the notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) problem-solving methods--domain-independent algorithms for automating stereotypical tasks--and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper highlights how developers can construct large, maintainable decision-support systems using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.
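
    The two-component architecture can be sketched as a domain-independent problem-solving method configured with a swappable domain ontology. A toy illustration follows; the mini-ontology and matching rule are made up.

      # Toy sketch of reusable components for decision support
      # (illustrative): a domain-independent problem-solving method is
      # parameterized by a swappable domain ontology, not hard-coded rules.
      DOMAIN_ONTOLOGY = {                      # hypothetical mini-ontology
          "flu":        {"fever", "cough"},
          "meningitis": {"fever", "stiff_neck"},
      }

      def candidate_elimination(findings, ontology):
          """Generic method: keep hypotheses consistent with the findings."""
          return [h for h, expected in ontology.items()
                  if findings <= expected]

      print(candidate_elimination({"fever", "cough"}, DOMAIN_ONTOLOGY))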

  19. Highlights of X-Stack ExM Deliverable Swift/T

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wozniak, Justin M.

    Swift/T is a key success from the "ExM: System support for extreme-scale, many-task applications" X-Stack project, which proposed to use concurrent dataflow as an innovative programming model to exploit extreme parallelism in exascale computers. The Swift/T component of the project reimplemented the Swift language from scratch to allow applications that compose scientific modules together to be built and run on available petascale computers (Blue Gene, Cray). Swift/T does this via a new compiler and runtime that generates and executes the application as an MPI program. We assume that mission-critical emerging exascale applications will be composed as scalable applications using existing software components, connected by data dependencies. Developers wrap native code fragments using a higher-level language, then build composite applications to form a computational experiment. This exemplifies hierarchical concurrency: lower-level messaging libraries are used for fine-grained parallelism; high-level control is used for inter-task coordination. These patterns are best expressed with dataflow, but static DAGs (i.e., other workflow languages) limit the applications that can be built; they do not provide the expressiveness of Swift, such as conditional execution, iteration, and recursive functions.
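
    The dataflow idea can be suggested in miniature: tasks fire as their inputs become ready, with fan-out and a join. Swift/T compiles real dataflow programs to MPI; in this illustrative stand-in, Python's concurrent.futures plays that role, and the task bodies are made up.

      # Dataflow-style composition sketch (illustrative; Swift/T compiles
      # real dataflow to MPI, here concurrent.futures stands in): tasks
      # run as soon as their inputs are available.
      from concurrent.futures import ThreadPoolExecutor

      def simulate(i):     return i * i          # stand-in science task
      def analyze(values): return sum(values)    # downstream reduction

      with ThreadPoolExecutor(max_workers=4) as pool:
          sims = [pool.submit(simulate, i) for i in range(8)]   # fan-out
          total = analyze([f.result() for f in sims])           # join
      print("reduced result:", total)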

  20. The relationships between software publications and software systems

    NASA Astrophysics Data System (ADS)

    Hogg, David W.

    2017-01-01

    When we build software systems or software tools for astronomy, we sometimes do and sometimes don't also write and publish standard scientific papers about those software systems. I will discuss the pros and cons of writing such publications. There are impacts of writing such papers immediately (they can affect the design and structure of the software project itself), in the short term (they can promote adoption and legitimize the software), in the medium term (they can provide a platform for all the literature's mechanisms for citation, criticism, and reuse), and in the long term (they can preserve ideas that are embodied in the software, possibly on timescales much longer than the lifetime of any software context). I will argue that as important as pure software contributions are to astronomy—and I am both a preacher and a practitioner—software contributions are even more valuable when they are associated with traditional scientific publications. There are exceptions and complexities of course, which I will discuss.

  1. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  2. Software Literacy and Student Learning in the Tertiary Environment: Powerpoint and Beyond

    ERIC Educational Resources Information Center

    Khoo, Elaine; Hight, Craig; Cowie, Bronwen; Torrens, Rob; Ferrarelli, Lisabeth

    2014-01-01

    In this paper, we explore the relationship between student success in acquiring software literacy and students' broader engagement and understanding of knowledge across different disciplines. We report on the first phase of a project that examines software literacies associated with Microsoft PowerPoint as a common software package encountered and…

  3. The first object oriented monitor for intravenous anesthesia.

    PubMed

    Cantraine, F R; Coussaert, E J

    2000-01-01

    To describe the design and implementation of "INFUSION TOOLBOX," a software tool to control and monitor multiple intravenous drug infusions simultaneously using pharmacokinetic and pharmacodynamic principles. INFUSION TOOLBOX has been designed to present a graphical interface. Object oriented design was used and the software was implemented using Smalltalk, to run on a PC. Basic tools are available to manage patients, drugs, pumps and reports. These tools are the PatientPanel, the DrugPanel, the PumpPanel and the HistoryPanel. The screen is built dynamically. The panels may be collapsed or closed to avoid a crowded display. We also built control panels such as the Target ControlPanel, which calculates the best infusion sequence to bring the drug concentration in the plasma compartment to a preset value. Before drug delivery, the user enters the patient's data, selects a drug, enters its dilution factor and chooses a pharmacokinetic model. The calculated plasma concentration is continually displayed and updated. The anesthetist may ask for the history of the delivery to obtain a graphic report or to add events to the logbook. A panel targeting the effect is used when a pharmacodynamic model is known. Data files for drugs, pumps and surgery are upgradable. By creating a resizeable ControlPanel we enable the anesthetist to display the information he wishes, when he wishes it. The available panels are diverse enough to meet the anesthetist's needs; they may be adapted to the drug used, the pumps used and the surgery. It is the anesthetist who dynamically builds the different control screens. By adopting an evolutionary solution model we have achieved considerable success in building our drug delivery monitor. In addition we have gained valuable insight into the anesthesia information domain that will allow us to further enhance and expand the system.
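
    The kind of calculation behind a target-controlled panel can be illustrated with a toy one-compartment pharmacokinetic model under constant-rate infusion; the parameters below are made up, not the system's drug library.

      # Toy one-compartment PK sketch (illustrative parameters, not the
      # monitor's drug library): plasma concentration under a constant
      # infusion follows dC/dt = rate/V - ke*C.
      import math

      V = 12.0      # volume of distribution (L), hypothetical
      ke = 0.12     # elimination rate constant (1/min), hypothetical
      rate = 6.0    # infusion rate (mg/min), hypothetical

      def concentration(t_min):
          """Closed-form solution for a constant-rate infusion from t=0."""
          return (rate / (V * ke)) * (1.0 - math.exp(-ke * t_min))

      for t in (1, 5, 15, 60):
          print(f"t={t:>3} min  C={concentration(t):.2f} mg/L")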

  4. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

    Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.

  5. Bim and Gis: when Parametric Modeling Meets Geospatial Data

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Banfi, F.

    2017-12-01

    Geospatial data have a crucial role in several projects related to infrastructures and land management. GIS software can perform advanced geospatial analyses, but it lacks several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS could appear to be complementary solutions, although research work is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales which are not dominated by "pure" GIS or BIM. The paper also demonstrates that some traditional operations carried out with GIS software are available in parametric modelling software for BIM as well, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
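
    One of the shared operations mentioned, transformation between reference systems, takes a few lines with the pyproj library; the coordinates below are illustrative.

      # Coordinate transformation between reference systems with pyproj
      # (example coordinates are illustrative): WGS84 lon/lat to UTM 32N.
      from pyproj import Transformer

      transformer = Transformer.from_crs("EPSG:4326", "EPSG:32632",
                                         always_xy=True)
      lon, lat = 9.19, 45.46            # roughly Milan, for illustration
      easting, northing = transformer.transform(lon, lat)
      print(f"E={easting:.1f} m  N={northing:.1f} m")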

  6. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires large amounts of computational resources to achieve successful results. Usually, software environments for this field are developed without taking into account the interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation of a software product line model for a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  7. Efficient Software Systems for Cardio Surgical Departments

    NASA Astrophysics Data System (ADS)

    Fountoukis, S. G.; Diomidous, M. J.

    2009-08-01

    Herein, the design, implementation and deployment of an object oriented software system, suitable for the monitoring of cardio surgical departments, is investigated. Distributed design architectures are applied and the implemented software system can be deployed on distributed infrastructures. The software is flexible and adaptable to any cardio surgical environment, regardless of the department resources used. The system exploits the relations and the interdependency of the successive bed positions that patients occupy at the different health care units during their stay in a cardio surgical department to determine bed availability and to perform patient scheduling and instant rescheduling whenever necessary. It also aims at the successful and efficient monitoring of the workings of cardio surgical departments.

  8. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  9. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    NASA Technical Reports Server (NTRS)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  10. Building on prior knowledge without building it in.

    PubMed

    Hansen, Steven S; Lampinen, Andrew K; Suri, Gaurav; McClelland, James L

    2017-01-01

    Lake et al. propose that people rely on "start-up software," "causal models," and "intuitive theories" built using compositional representations to learn new tasks more efficiently than some deep neural network models. We highlight the many drawbacks of a commitment to compositional representations and describe our continuing effort to explore how the ability to build on prior knowledge and to learn new tasks efficiently could arise through learning in deep neural networks.

  11. Study on Earthquake Emergency Evacuation Drill Trainer Development

    NASA Astrophysics Data System (ADS)

    ChangJiang, L.

    2016-12-01

    With the progress of China's urbanization, ensuring that people survive an earthquake requires scientific, routine emergency evacuation drills. Drawing on cellular automata, shortest-path algorithms and collision avoidance, we designed a model of earthquake emergency evacuation drills for school scenes. Based on this model, we built simulation software for earthquake emergency evacuation drills. The software performs the simulation by building a spatial structural model and setting the locations of people according to the actual conditions of the building. Based on the simulation data, a drill can then be conducted in the same building. RFID technology can be used for drill data collection: it reads personal information and sends it to the evacuation simulation software via WiFi. The simulation software then contrasts the simulated data with the information from the actual evacuation process, such as evacuation times, evacuation paths, congestion nodes and so on. In the end, it produces a comparative analysis report containing the assessment results and optimization proposals. We hope the earthquake emergency evacuation drill software and trainer can provide an end-to-end process concept for earthquake emergency evacuation drills in assembly occupancies. The trainer can make earthquake emergency evacuation more orderly, efficient, reasonable and scientific, increasing the capacity of cities to cope with earthquake hazards.
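
    A minimal sketch of the shortest-path component of such a simulator, assuming a cellular grid in which 1 marks an obstacle and 0 a walkable cell. This is an illustrative breadth-first search, not the authors' implementation (which also models collision avoidance between evacuees).

      from collections import deque

      def evacuation_path(grid, start, exit_cell):
          """Shortest path on a cellular grid via breadth-first search.
          grid[r][c] == 1 marks an obstacle; 0 is walkable."""
          rows, cols = len(grid), len(grid[0])
          parents = {start: None}
          queue = deque([start])
          while queue:
              cell = queue.popleft()
              if cell == exit_cell:
                  path = []
                  while cell is not None:      # walk parents back to the start
                      path.append(cell)
                      cell = parents[cell]
                  return path[::-1]
              r, c = cell
              for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  if (0 <= nr < rows and 0 <= nc < cols
                          and grid[nr][nc] == 0 and (nr, nc) not in parents):
                      parents[(nr, nc)] = cell
                      queue.append((nr, nc))
          return None  # no escape route found

      # Example: a 3x4 room with one internal wall and the exit at (2, 3).
      room = [[0, 0, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 0, 0]]
      print(evacuation_path(room, (0, 0), (2, 3)))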

  12. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    PubMed Central

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority zones within the building, and sampling designs and strategies could be developed based on those zones. PMID:27134782
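
    The coupling described above, in which dispersion results drive a sampling design, can be reduced to a toy illustration: zone-level concentrations (hard-coded here, standing in for CONTAM output) are ranked and sampling effort is allocated proportionally. This is only loosely analogous to what VSP does with real designs; the zone names and numbers are hypothetical.

      # Hypothetical CONTAM-style output: predicted contaminant
      # concentration per building zone (arbitrary units).
      zone_concentration = {"lobby": 0.8, "office_2": 3.5, "hvac_room": 7.2, "lab_1": 1.1}

      total_samples = 40
      total = sum(zone_concentration.values())

      # Allocate surface samples proportionally to predicted contamination,
      # so the most contaminated zones are sampled most densely.
      plan = {zone: round(total_samples * conc / total)
              for zone, conc in sorted(zone_concentration.items(),
                                       key=lambda kv: kv[1], reverse=True)}
      print(plan)  # {'hvac_room': 23, 'office_2': 11, 'lab_1': 3, 'lobby': 3}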

  14. Swarming Robot Design, Construction and Software Implementation

    NASA Technical Reports Server (NTRS)

    Stolleis, Karl A.

    2014-01-01

    This paper presents an overview of the hardware design, construction, software design, and software implementation of a small, low-cost robot to be used for swarming robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using the Robot Operating System (ROS) and its associated simulation environment. The robots will eventually be used to explore evolving behaviors via genetic algorithms, building on the work done at the University of New Mexico Biological Computation Lab.

  15. Porting the Starlink Software Collection to GNU Autotools

    NASA Astrophysics Data System (ADS)

    Gray, N.; Jenness, T.; Allan, A.; Berry, D. S.; Currie, M. J.; Draper, P. W.; Taylor, M. B.; Cavanagh, B.

    2005-12-01

    The Starlink software collection currently runs on three different Unix platforms and contains around 100 separate software items, totaling 2.5 million lines of code in a mixture of languages. We have changed the build system from a hand-maintained collection of makefiles with hard-wired OS variants to a scheme involving feature discovery via GNU Autoconf. As a result of this work, we have already ported the collection to Mac OS X and Cygwin. This work had some unexpected benefits and costs, and yielded valuable lessons.

  16. A Decision Support System for Planning, Control and Auditing of DoD Software Cost Estimation.

    DTIC Science & Technology

    1986-03-01

    is frequently used in U.S. Air Force software cost estimates. Barry Boehm's Constructive Cost Model (COCOMO) was recently selected for use... are considered basic to the proper development of software. Pressman [Ref. 11] addresses these basic elements in a manner which attempts to integrate... Sprague, Ralph H., Jr., and Carlson, Eric D., Building Effective Decision Support Systems, Prentice-Hall, Englewood Cliffs, NJ, 1982. 11. Pressman, Roger S., Software Engineering: A Practitioner's Approach...

  17. Static and Dynamic Analysis in Design of Exoskeleton Structure

    NASA Astrophysics Data System (ADS)

    Ivánkova, Ol'ga; Méri, Dávid; Vojteková, Eva

    2017-10-01

    This paper introduces a numerical experiment in designing the load-bearing system of a high-rise building. When designing a high-rise building, it is always an important task to find the right proportion between the height of the building and its perceived width from various angles of street view. The high-rise building investigated in this article was designed according to these criteria. The load-bearing structure of the analysed object consists of a reinforced core, plates, and the steel tubes of an exoskeleton. Eight models of the building were created using the spatial variant of FEM in the Scia Engineer software. The individual models varied in the number and dimensions of diagrids in the exoskeleton. In the models, loads due to self-weight, the weight of the external glass cladding, and wind according to the Standard were considered. The building was loaded by wind from all four main directions with respect to its shape. The wind load was calculated using the 3D wind generator, which is a part of the Scia Engineer software. A static analysis was performed for each model. Its most important criterion was the maximum or minimum horizontal displacement (rotation) of the highest point of the building. This displacement was compared with the limit values for the analysed high-rise building. By adding diagrids step by step and optimizing their dimensions, a building model was obtained that complied with the Serviceability Limit State criteria. The final model was also assessed for the Ultimate Limit State and was additionally loaded with seismic loads for comparison with the wind load.

  18. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in these tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying causes of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.

  19. Building Real World Domain-Specific Social Network Websites as a Capstone Project

    ERIC Educational Resources Information Center

    Yue, Kwok-Bun; De Silva, Dilhar; Kim, Dan; Aktepe, Mirac; Nagle, Stewart; Boerger, Chris; Jain, Anubha; Verma, Sunny

    2009-01-01

    This paper describes our experience of using Content Management Software (CMS), specifically Joomla, to build a real world domain-specific social network site (SNS) as a capstone project for graduate information systems and computer science students. As Web 2.0 technologies become increasingly important in driving business application development,…

  20. ecoSmart landscapes: a versatile SaaS platform for green infrastructure applications in urban environments

    Treesearch

    Greg McPherson; Qingfu Xiao; Joe Purohit; Mark Dietenberger; Charles (C.R.) Boardman; Jim Simpson; Paula Peper

    2014-01-01

    The urban environment offers significant opportunities to improve sustainability and optimize water resources. Historically, research and software applications have been focused on the built environment (buildings). Cost-effective, practical tools that can assess the impact of different landscape configurations and their interactions with buildings have not been widely...

  1. Building Phylogenetic Trees from DNA Sequence Data: Investigating Polar Bear and Giant Panda Ancestry.

    ERIC Educational Resources Information Center

    Maier, Caroline Alexandra

    2001-01-01

    Presents an activity in which students seek answers to questions about evolutionary relationships by using genetic databases and bioinformatics software. Students build genetic distance matrices and phylogenetic trees based on molecular sequence data using web-based resources. Provides a flowchart of steps involved in accessing, retrieving, and…

  2. A high-resolution radiation hybrid map of the bovine genome

    USDA-ARS?s Scientific Manuscript database

    We are building high-resolution radiation hybrid maps of all 29 bovine autosomes and chromosome X, using a 58,000-marker genotyping assay, and a 12,000-rad whole-genome radiation hybrid (RH) panel. To accommodate the large number of markers, and to automate the map building procedure, a software pip...

  3. Techniques and Tools for Trustworthy Composition of Pre-Designed Embedded Software Components

    DTIC Science & Technology

    2012-07-01

    following option choices. 1. A plain vanilla pi-trie algorithm set to build the entire pi-trie. 2. A pi-trie algorithm filtered for positive prime implicates only. 3. A plain vanilla pi-trie algorithm to build the entire pi-trie, but recognizing variable-disjoint subformulas. 4. A pi-trie...

  4. USING THE ECLPSS SOFTWARE ENVIRONMENT TO BUILD A SPATIALLY EXPLICIT COMPONENT-BASED MODEL OF OZONE EFFECTS ON FOREST ECOSYSTEMS. (R827958)

    EPA Science Inventory

    We have developed a modeling framework to support grid-based simulation of ecosystems at multiple spatial scales, the Ecological Component Library for Parallel Spatial Simulation (ECLPSS). ECLPSS helps ecologists to build robust spatially explicit simulations of ...

  5. Effects of a Preschool Mathematics Curriculum: Summative Research on the "Building Blocks" Project

    ERIC Educational Resources Information Center

    Clements, Douglas H.; Sarama, Julie

    2007-01-01

    This study evaluated the efficacy of a preschool mathematics program based on a comprehensive model of developing research-based software and print curricula. Building Blocks, funded by the National Science Foundation, is a curriculum development project focused on creating research-based, technology-enhanced mathematics materials for pre-K…

  6. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1988-01-01

    The Clouds project is well under way toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept to structure software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.

  7. GACT: a Genome build and Allele definition Conversion Tool for SNP imputation and meta-analysis in genetic association studies.

    PubMed

    Sulovari, Arvis; Li, Dawei

    2014-07-19

    Genome-wide association studies (GWAS) have successfully identified genes associated with complex human diseases. Although much of the heritability remains unexplained, combining single nucleotide polymorphism (SNP) genotypes from multiple studies for meta-analysis will increase the statistical power to identify new disease-associated variants. Meta-analysis requires the same allele definition (nomenclature) and genome build among the individual studies. Similarly, imputation, commonly used prior to meta-analysis, requires the same consistency. However, the genotypes from various GWAS are generated using different genotyping platforms, arrays, or SNP-calling approaches, resulting in the use of different genome builds and allele definitions. Incorrect assumptions of identical allele definitions among combined GWAS lead to a large portion of discarded genotypes or incorrect association findings. There is no previously published tool that predicts and converts among all major allele definitions. In this study, we have developed GACT, the Genome build and Allele definition Conversion Tool, which predicts and inter-converts between any of the common SNP allele definitions and between the major genome builds. In addition, we assessed several factors that may affect imputation quality, and our results indicated that inclusion of singletons in the reference had detrimental effects, while ambiguous SNPs had no measurable effect. Unexpectedly, exclusion of genotypes with a missing rate > 0.001 (40% of study SNPs) showed no significant decrease of imputation quality (it was even significantly higher when compared to imputation with singletons in the reference), especially for rare SNPs. GACT is a new, powerful, and user-friendly tool with both command-line and interactive online versions that can accurately predict and convert between any of the common allele definitions and between genome builds for genome-wide meta-analysis and imputation of genotypes from SNP arrays or deep sequencing, particularly for data from dbGaP and other public databases. http://www.uvm.edu/genomics/software/gact
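
    The core strand-conversion step that any such tool must perform, re-expressing a SNP's alleles on the complementary strand, can be sketched in a few lines. This is a generic illustration, not GACT's actual algorithm; note that it flags ambiguous (A/T and C/G) SNPs, which cannot be resolved by complementing alone.

      COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

      def flip_strand(allele1, allele2):
          """Return the two alleles re-expressed on the complementary strand."""
          return COMPLEMENT[allele1], COMPLEMENT[allele2]

      def is_ambiguous(allele1, allele2):
          """A/T and C/G SNPs look identical on both strands, so a strand
          flip cannot be detected from the alleles themselves."""
          return COMPLEMENT[allele1] == allele2

      print(flip_strand("A", "G"))    # ('T', 'C')
      print(is_ambiguous("A", "T"))   # True  -> needs frequency or LD checks
      print(is_ambiguous("A", "G"))   # False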

  8. Software Piracy among Technology Education Students: Investigating Property Rights in a Culture of Innovation

    ERIC Educational Resources Information Center

    Teston, George

    2008-01-01

    When asked about individual perceptions of "technology," 68% of Americans primarily equate the term with the computer. Although this perception underrepresents the true breadth of the field, the statistic does speak to the ubiquitous role the computer plays across many technology disciplines. Software has become the building block of all major…

  9. A Comparison of the Effects of Lego TC Logo and Problem Solving Software on Elementary Students' Problem Solving Skills.

    ERIC Educational Resources Information Center

    Palumbo, Debra L; Palumbo, David B.

    1993-01-01

    Computer-based problem-solving software exposure was compared to Lego TC LOGO instruction. Thirty fifth-graders received either Lego LOGO instruction, which couples Lego building-block activities with LOGO computer programming, or instruction with various problem-solving computer programs. Although both groups showed significant progress, the Lego…

  10. The Application of Software Safety to the Constellation Program Launch Control System

    NASA Technical Reports Server (NTRS)

    Kania, James; Hill, Janice

    2011-01-01

    The application of software safety practices on the LCS project resulted in the successful implementation of the NASA Software Safety Standard NASA-STD-8719.13B and the CxP software safety requirements. The GOP-GEN-GSW-011 Hazard Report was the first report developed at KSC to identify software hazard causes and their controls. This approach can be applied to similar large software-intensive systems where loss of control can lead to a hazard.

  11. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    ERIC Educational Resources Information Center

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  12. Installing a Local Copy of the Reactome Web Site and Knowledgebase

    PubMed Central

    McKay, Sheldon J; Weiser, Joel

    2015-01-01

    The Reactome project builds, maintains, and publishes a knowledgebase of biological pathways. The information in the knowledgebase is gathered from experts in the field, peer reviewed, and edited by Reactome editorial staff, and then published to the Reactome Web site, http://www.reactome.org (see UNIT 8.7; Croft et al., 2013). The Reactome software is open source and builds on top of other open-source or freely available software. Reactome data and code can be freely downloaded in their entirety and the Web site installed locally. This allows for more flexible interrogation of the data and also makes it possible to add one's own information to the knowledgebase. PMID:26087747

  13. Photogrammetric 3d Building Reconstruction from Thermal Images

    NASA Astrophysics Data System (ADS)

    Maset, E.; Fusiello, A.; Crosilla, F.; Toldo, R.; Zorzetto, D.

    2017-08-01

    This paper addresses the problem of 3D building reconstruction from thermal infrared (TIR) images. We show that commercial Computer Vision software can be used to automatically orient sequences of TIR images taken from an Unmanned Aerial Vehicle (UAV) and to generate 3D point clouds, without requiring any GNSS/INS data about the position and attitude of the images or any camera calibration parameters. Moreover, we propose a procedure based on the Iterative Closest Point (ICP) algorithm to create a model that combines the high resolution and geometric accuracy of RGB images with the thermal information deriving from TIR images. The process can be carried out entirely by the aforesaid software in a simple and efficient way.
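
    The registration step named above can be sketched with a textbook nearest-neighbour/SVD formulation of ICP; this sketch assumes NumPy and SciPy and roughly pre-aligned point clouds, and is not the authors' code.

      import numpy as np
      from scipy.spatial import cKDTree

      def icp(source, target, iterations=20):
          """Rigidly align `source` (N,3) to `target` (M,3) by iterating
          nearest-neighbour matching and the SVD-based Procrustes solution."""
          src = source.copy()
          tree = cKDTree(target)
          for _ in range(iterations):
              _, idx = tree.query(src)          # closest target point per source point
              matched = target[idx]
              mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
              H = (src - mu_s).T @ (matched - mu_t)
              U, _, Vt = np.linalg.svd(H)
              R = Vt.T @ U.T
              if np.linalg.det(R) < 0:          # guard against reflections
                  Vt[-1] *= -1
                  R = Vt.T @ U.T
              t = mu_t - R @ mu_s
              src = src @ R.T + t               # apply the rigid transform
          return src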

  14. Property Specification Patterns for intelligence building software

    NASA Astrophysics Data System (ADS)

    Chun, Seungsu

    2018-03-01

    In this paper, through research on property specification patterns for the modal mu (μ)-calculus, we present a single framework based on patterns for intelligent building software. In this study, Dwyer's property specification pattern classification is broken down by state (S) and action (A), and further subdivided into strong (A) and weak (E) variants. Based on this hierarchical pattern classification, the mu (μ)-calculus analysis of the patterns was applied to the classification of examples used in an actual model checker. As a result, the classification is not only more accurate than existing classification systems, but the specified properties are also easier to create and understand.
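
    As a concrete instance of the kind of pattern being classified, Dwyer's global "response" pattern (every occurrence of P is eventually followed by S) can be written in CTL and, under one common textbook encoding (an assumption, not necessarily the paper's formulation), in the modal mu-calculus:

      % Dwyer's "response" pattern, global scope
      \mathrm{AG}\,(P \rightarrow \mathrm{AF}\,S)   % CTL form
      % one standard mu-calculus encoding of the same property
      \nu Z.\,\bigl( (\neg P \,\vee\, \mu Y.\,(S \,\vee\, ([\cdot]Y \wedge \langle\cdot\rangle\top))) \,\wedge\, [\cdot]Z \bigr)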

  15. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    PubMed

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.

  16. Building Energy Management Open Source Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This is the repository for Building Energy Management Open Source Software (BEMOSS), an open source operating system engineered to improve sensing and control of equipment in small- and medium-sized commercial buildings. BEMOSS offers the following key features: (1) Open source, open architecture – BEMOSS is an open source operating system built upon VOLTTRON – a distributed agent platform developed by Pacific Northwest National Laboratory (PNNL). BEMOSS was designed to make it easy for hardware manufacturers to seamlessly interface their devices with BEMOSS. Software developers can also contribute additional BEMOSS functionalities and applications. (2) Plug & play – BEMOSS was designed to automatically discover supported load controllers (including smart thermostats, VAV/RTUs, lighting load controllers and plug load controllers) in commercial buildings. (3) Interoperability – BEMOSS was designed to work with load control devices from different manufacturers that operate on different communication technologies and data exchange protocols. (4) Cost effectiveness – Implementation of BEMOSS is deemed cost-effective, as it is built upon a robust open source platform that can operate on a low-cost single-board computer, such as an Odroid. This feature could contribute to its rapid deployment in small- or medium-sized commercial buildings. (5) Scalability and ease of deployment – With its multi-node architecture, BEMOSS provides a distributed architecture in which load controllers in a multi-floor, high-occupancy building can be monitored and controlled by multiple single-board computers hosting BEMOSS. This makes it possible for a building engineer to deploy BEMOSS in one zone of a building, become comfortable with its operation, and later expand the deployment to the entire building to make it more energy efficient. (6) Local and remote monitoring – BEMOSS provides both local and remote monitoring with role-based access control. (7) Security – In addition to the built-in security features provided by VOLTTRON, BEMOSS provides enhanced security features, including a BEMOSS discovery approval process, encrypted core-to-node communication, a thermostat anti-tampering feature and many more. (8) Support from the Advisory Committee – BEMOSS was developed in consultation with an advisory committee from the beginning of the project. The BEMOSS advisory committee comprises representatives from 22 organizations from government and industry.

  17. Urban Earthquake Shaking and Loss Assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR and ETH-Zurich, is capable of incorporating regional variability and the sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic data base and basic source parameters and, if and when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimating the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and shear wave velocity distributions (Shake Mapping). 4. Incorporating strong ground motion and other empirical macroseismic data to improve the Shake Map. 5. Estimating the losses (damage, casualty and economic) at different levels of sophistication (0, 1 and 2) commensurate with the available inventory of the human-built environment (Loss Mapping). Level 2 analysis of the ELER software (similar to HAZUS and SELENA) is essentially intended for earthquake risk assessment (building damage, consequential human casualties and macroeconomic loss quantifiers) in urban areas. The basic Shake Mapping is similar to the Level 0 and Level 1 analysis; however, options are available for more sophisticated treatment of site response through externally entered data and for improvement of the shake map through incorporation of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships to be used can be selected by the user from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, the Equivalent Linearization Method and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be applied if necessary. The ELER Level 2 analysis also includes calculation of direct monetary losses as a result of building damage, allowing for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented with comparisons against different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to the relevant agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte-Carlo-type simulations and earthquake insurance applications.
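
    The Level 2 casualty computation described above reduces, in essence, to a sum over building types and damage states. A miniature of it in Python, with invented building counts and casualty rates purely for illustration:

      # casualty_rates[building_type][damage_state] = expected casualties per
      # building in that state (illustrative numbers only, not ELER's tables).
      casualty_rates = {
          "masonry":  {"moderate": 0.02, "extensive": 0.15, "complete": 1.2},
          "concrete": {"moderate": 0.01, "extensive": 0.10, "complete": 0.9},
      }
      # damaged_buildings[building_type][damage_state] = count from the shake map.
      damaged_buildings = {
          "masonry":  {"moderate": 420, "extensive": 130, "complete": 25},
          "concrete": {"moderate": 610, "extensive": 90,  "complete": 12},
      }

      casualties = sum(
          damaged_buildings[bt][ds] * casualty_rates[bt][ds]
          for bt in damaged_buildings
          for ds in damaged_buildings[bt]
      )
      print(f"Estimated casualties: {casualties:.0f}")   # 84 for these inputs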

  18. The Characteristics of Successful MOOCs in the Fields of Software, Science, and Management, According to Students' Perception

    ERIC Educational Resources Information Center

    Holstein, Simona; Cohen, Anat

    2016-01-01

    The characteristics of successful MOOCs were explored in this study. Thousands of student reviews regarding five xMOOCs (Massive Open Online Course) in the fields of software, science, and management were extracted from the Coursetalk website and analyzed by quantitative and qualitative methods using the Garrison, Anderson, and Archer (2000)…

  19. LANDIS 4.0 users guide. LANDIS: a spatially explicit model of forest landscape disturbance, management, and succession

    Treesearch

    Hong S. He; Wei Li; Brian R. Sturtevant; Jian Yang; Bo Z. Shang; Eric J. Gustafson; David J. Mladenoff

    2005-01-01

    LANDIS 4.0 is new-generation software that simulates forest landscape change over large spatial and temporal scales. It is used to explore how disturbances, succession, and management interact to determine forest composition and pattern. The guide also describes the software architecture and model assumptions, and provides detailed instructions on the use of the model.

  20. Building a Snow Data Management System using Open Source Software (and IDL)

    NASA Astrophysics Data System (ADS)

    Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.

    2012-12-01

    At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main Points: - The design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline) - The challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers (lessons learned) - Code changes - Software-license-related challenges - Storage requirements - System evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps) - Road map for the next 6 months (including how easily we re-used the snowDS code base to support the Airborne Snow Observatory Mission). Software in use and their software licenses: IDL - used for pre- and post-processing of data; licensed under a proprietary software license held by Exelis. Apache OODT - used for data management and workflow processing; licensed under the Apache License Version 2. GDAL - geospatial data processing library, currently used for data re-projection; licensed under the X/MIT license. GeoServer - WMS server; licensed under the General Public License Version 2.0. Leaflet.js - JavaScript web mapping library; licensed under the Berkeley Software Distribution License. Python - glue code and miscellaneous data processing support; licensed under the Python Software Foundation License. Perl - script wrapper for running the SCAG algorithm; licensed under the General Public License Version 3. PHP - front-end web application programming; licensed under the PHP License Version 3.01.

  1. An open-source software platform for data management, visualisation, model building and model sharing in water, energy and other resource modelling domains.

    NASA Astrophysics Data System (ADS)

    Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.

    2015-12-01

    Capacity expansion on resource networks is essential to adapting to economic and population growth and to pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling, we have created a system in which infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, Free Open Source Software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer through which remote applications, called Apps, connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain. It is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based Apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can also be built. Hydra Platform is available on GitHub, while Apps will be shared on a central repository.
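
    The abstract node-and-link representation at the heart of such a platform can be illustrated minimally; this sketch is a hypothetical data model in Python, not Hydra Platform's actual schema or API.

      from dataclasses import dataclass, field
      import json

      @dataclass
      class Node:
          name: str
          x: float
          y: float
          data: dict = field(default_factory=dict)   # domain-specific attributes

      @dataclass
      class Link:
          source: str
          target: str
          data: dict = field(default_factory=dict)

      @dataclass
      class Network:
          nodes: list
          links: list

          def to_json(self):
              """Serialise for exchange with remote Apps over a web service."""
              return json.dumps({
                  "nodes": [vars(n) for n in self.nodes],
                  "links": [vars(l) for l in self.links],
              }, indent=2)

      # A two-node water network: a reservoir feeding a demand node.
      net = Network(
          nodes=[Node("reservoir", 0.0, 0.0, {"capacity_Mm3": 50}),
                 Node("city", 1.0, 0.5, {"demand_Mm3_per_yr": 12})],
          links=[Link("reservoir", "city", {"max_flow_m3s": 4.0})],
      )
      print(net.to_json())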

  2. Implementing Production Grids

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Ziobarth, John (Technical Monitor)

    2002-01-01

    We have presented the essence of experience gained in building two production Grids and provided some of the global context for this work. As the reader might imagine, there were a lot of false starts, refinements to the approaches and to the software, and several substantial integration projects (SRB and Condor integrated with Globus) to get where we are today. However, the point of this paper is to try to make it substantially easier for others to get to the point where Information Power Grid (IPG) and the DOE Science Grids are today. This is what is needed in order to move us toward the vision of a common cyber infrastructure for science. The author would also like to remind readers that this paper primarily represents the actual experiences that resulted from specific architectural and software choices during the design and implementation of these two Grids. The choices made were dictated by the criteria laid out in section 1. There is a lot more Grid software available today than there was four years ago, and several of these packages are being integrated into IPG and the DOE Grids. However, the foundation choices of Globus, SRB, and Condor would not be significantly different today than they were four years ago. Nonetheless, if the GGF is successful in its work - and we have every reason to believe that it will be - then in a few years we will see the 28 functions provided by these packages defined in terms of protocols and APIs, and there will be several robust implementations available for each of the basic components, especially the Grid Common Services. The impact of the emerging Web Grid Services work is not yet clear. It will likely have a substantial impact on building higher-level services; however, it is the opinion of the author that this will in no way obviate the need for the Grid Common Services. These are the foundation of Grids, and the focus of almost all of the operational and persistent infrastructure aspects of Grids.

  3. Implementation of Epic Beaker Anatomic Pathology at an Academic Medical Center.

    PubMed

    Blau, John Larry; Wilford, Joseph D; Dane, Susan K; Karandikar, Nitin J; Fuller, Emily S; Jacobsmeier, Debbie J; Jans, Melissa A; Horning, Elisabeth A; Krasowski, Matthew D; Ford, Bradley A; Becker, Kent R; Beranek, Jeanine M; Robinson, Robert A

    2017-01-01

    Beaker is a relatively new laboratory information system (LIS) offered by Epic Systems Corporation as part of its suite of health-care software and bundled with its electronic medical record, EpicCare. It is divided into two modules, Beaker anatomic pathology (Beaker AP) and Beaker Clinical Pathology. In this report, we describe our experience implementing Beaker AP version 2014 at an academic medical center with a go-live date of October 2015. This report covers preimplementation preparations and challenges beginning in September 2014, issues discovered soon after go-live in October 2015, and some post-go-live optimizations, using data from meetings, debriefings, and the project closure document. We share specific issues that we encountered during implementation, including difficulties with the proposed frozen section workflow, developing a shared specimen source dictionary, and implementation of the standard Beaker workflow in a large institution with trainees. We share specific strategies that we used to overcome these issues for a successful Beaker AP implementation. Several areas of the laboratory required adaptation of the default Beaker build parameters to meet the needs of the workflow in a busy academic medical center. In a few areas, our laboratory was unable to use the Beaker functionality to support our workflow, and we have continued to use paper or have altered our workflow. In spite of several difficulties that required creative solutions before go-live, the implementation has been successful based on satisfaction surveys completed by pathologists and others who use the software. However, optimization of Beaker workflows has continued to be an ongoing process from go-live to the present time. The Beaker AP LIS can be successfully implemented at an academic medical center, but it requires significant forethought, creative adaptation, and continued shared management of the ongoing product by institutional and departmental information technology staff as well as laboratory managers to meet the needs of the laboratory.

  4. Remote software upload techniques in future vehicles and their performance analysis

    NASA Astrophysics Data System (ADS)

    Hossain, Irina

    Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons: for example, to update or fix the functionality of an existing system, to add new functionality, to remove software bugs, and to keep up with ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid any disasters due to malfunctions of the vehicle and to protect proprietary algorithms from hackers, competitors, or people with malicious intent. In this thesis, a mechanism for secure software upload in advanced vehicles is presented which employs mutual authentication of the software provider and the vehicle, using a pre-shared authentication key, before sending the software. The software packets are sent encrypted with a secret key, along with a Message Digest (MD). In order to increase the security level, it is proposed that the vehicle receive more than one copy of the software, with the MD in each copy. The vehicle installs the new software only when it receives more than one identical copy of the software. In order to validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are determined. Different cases are investigated depending on the vehicle's buffer size and verification methods. The analytical and simulation results show that it is sufficient to send two copies of the software to the vehicle to thwart any security attack while uploading the software. The above-mentioned unicast method for RSU is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in the ECUs of a large number of vehicles could benefit from it. However, as with unicast RSU, the security requirements of multicast communication, i.e., the authenticity, confidentiality and integrity of the software transmitted and access control of the group members, are challenging. In this thesis, infrastructure-based mobile multicasting for RSU in vehicle ECUs is proposed, in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust placed in the BSs, named the Fully-trusted (FT) and the Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and the handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle and to send the multicast session key during multicast session initialization, as well as the handoff latency during a multicast session, is calculated. Analytical and simulation results show that the link establishment latency per vehicle of our proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system. Thus, it is possible to build an efficient GKM protocol without putting too much trust in the BSs.
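
    The per-packet integrity check described in the first part of the thesis, a message digest keyed with the pre-shared secret, can be sketched with Python's standard library. This is a generic HMAC illustration of the idea only; the thesis's protocol additionally encrypts the payload and compares multiple received copies.

      import hmac, hashlib

      PRE_SHARED_KEY = b"vehicle-42-secret"   # provisioned in advance (example value)

      def tag_packet(payload: bytes) -> bytes:
          """Software supplier side: append a keyed digest to each packet."""
          return payload + hmac.new(PRE_SHARED_KEY, payload, hashlib.sha256).digest()

      def verify_packet(packet: bytes):
          """ECU side: accept the payload only if the digest matches."""
          payload, tag = packet[:-32], packet[-32:]   # SHA-256 digest is 32 bytes
          expected = hmac.new(PRE_SHARED_KEY, payload, hashlib.sha256).digest()
          return payload if hmac.compare_digest(tag, expected) else None

      pkt = tag_packet(b"firmware-block-0001")
      assert verify_packet(pkt) == b"firmware-block-0001"
      assert verify_packet(pkt[:-1] + b"X") is None   # tampered packet rejected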

  5. Means of escape provisions and evacuation simulation of public building in Malaysia and Singapore

    NASA Astrophysics Data System (ADS)

    Samad, Muna Hanim Abdul; Taib, Nooriati; Ying, Choo Siew

    2017-10-01

    The Uniform Building By-law 1984 of Malaysia is the legal document governing fire safety requirements in buildings. Its prescriptive nature has made the requirements outdated from the viewpoint of the current performance-based approach in most developed countries. The means of escape provisions are a critical requirement to safeguard occupants' safety in fire, especially in public buildings. As stipulated in the UBBL 1984, the means of escape provisions include sufficient escape routes, travel distances, protection of escape routes, etc., designated as means to allow occupants to escape within a safe period of time. This research aims at investigating the effectiveness of those provisions in public buildings during evacuation processes involving massive crowds during emergencies. The research includes a scenario-based study of evacuation processes using two software packages, PyroSim, a fire and smoke modelling package used to conduct the smoke study, and Pathfinder, used to simulate the evacuation model, applied to buildings in Malaysia and Singapore as a comparative study. The results show that the buildings used as case studies, designed according to the Malaysian UBBL 1984 and the Singapore Fire Code 2013 respectively, provide relatively safe means of escape. The simulations of fire and smoke, coupled with simulations of evacuation, demonstrate that although adequate exits are designated according to the fire requirements, the geometry of atriums has a significant impact on the behavior of fire and smoke, and hence a significant effect on escape times, especially for users unfamiliar with the premises.

  6. Study of fault tolerant software technology for dynamic systems

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Zacharias, G. L.

    1985-01-01

    The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation, and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance for developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated. In particular, possible system and version instabilities and the functional performance degradation that may occur in N-Version programming applications to flight software are illustrated. Lastly, a comparative analysis of the N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
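
    The N-Version idea discussed in the study can be illustrated with a majority voter over independently developed versions of the same computation; a minimal sketch, with a deliberately faulty third version standing in for a design error:

      from collections import Counter

      def majority_vote(results):
          """Return the value produced by a majority of versions, or None
          if no majority exists (an unrecoverable disagreement)."""
          value, count = Counter(results).most_common(1)[0]
          return value if count > len(results) // 2 else None

      # Three independently written versions of the same computation;
      # version_c contains a seeded fault for demonstration.
      def version_a(x): return x * x
      def version_b(x): return x ** 2
      def version_c(x): return x * x + 1   # faulty version

      outputs = [v(9) for v in (version_a, version_b, version_c)]
      print(majority_vote(outputs))        # 81: the faulty version is outvoted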

  7. Analysis of Traffic Signals on a Software-Defined Network for Detection and Classification of a Man-in-the-Middle Attack

    DTIC Science & Technology

    2017-09-01

    unique characteristics of reported anomalies in the collected traffic signals to build a classification framework. Other cyber events, such as a... [2]. The applications build flow rules using network topology information provided by the control plane [1]. Since the control plane is able to...

  8. Impact of external conditions on energy consumption in industrial halls

    NASA Astrophysics Data System (ADS)

    Żabnieńśka-Góra, Alina

    2017-11-01

    The energy demand for heating hall buildings is high. Production technology, building construction, and technological requirements (HVAC systems) may all influence this demand. The insulation of the external partitions, the location of the object in relation to the surrounding buildings, and the degree of interior insolation (through windows and skylights) are important in the context of energy consumption. The article discusses the impact of external conditions, wind and sunlight, on energy demand in an industrial hall. The building model was prepared in the IDA ICE 4.0 simulation software. Model validation was done based on measurements taken in the analyzed building.

  9. Aircraft Design Software

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Successful commercialization of the AirCraft SYNThesis (ACSYNT) tool has resulted in the creation of Phoenix Integration, Inc. ACSYNT has been exclusively licensed to the company, an outcome of a seven-year, $3 million effort to provide unique software technology to a focused design engineering market. Ames Research Center formulated ACSYNT and, working with the Virginia Polytechnic Institute CAD Laboratory, began to design and code a computer-aided design capability for ACSYNT. Using a Joint Sponsored Research Agreement, Ames formed an industry-government-university alliance to improve and foster research and development for the software. As a result of the ACSYNT Institute, the software is becoming a predominant tool for aircraft conceptual design. ACSYNT has been successfully applied to high-speed civil transport configurations, subsonic transports, and supersonic fighters.

  10. Beyond Wiki to Judgewiki for Transparent Climate Change Decisions

    NASA Astrophysics Data System (ADS)

    Capron, M. E.

    2008-12-01

    Climate Change is like the prisoner's dilemma, a zero-sum game, or cheating in sports. Everyone and every country is tempted to selfishly maintain or advance their standard of living. The tremendous difference between standards of living amplifies the desire to opt out of Climate Change solutions adverse to economic competitiveness. Climate Change is also exceedingly complex. No one person, one organization, one country, or partial collection of countries has the capacity and the global support needed to make decisions on Climate Change solutions. There are thousands of potential actions, tens of thousands of known and unknown environmental and economic impacts. Some actions are belatedly found to be unsustainable beyond token volumes, corn ethanol or soy-biodiesel for example. Mankind can address human nature and complexity with a globally transparent information and decision process available to all 7 billion of us. We need a process that builds trust and simplifies complexity. Fortunately, we have the Internet for trust building communication and computers to simplify complexity. Mankind can produce new software tailored to the challenge. We would combine group information collection software (a wiki) with a decision-matrix (a judge), market forecasting, and video games to produce the tool mankind needs for trust building transparent decisions on Climate Change actions. The resulting software would be a judgewiki.

  11. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software-intensive system is always a challenge. Software safety practitioners need to ensure that software-related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  12. DefenseLink Special: Travels with Gates, January 2007

    Science.gov Websites

    that has been achieved there, said Defense Secretary Robert M. Gates. Story: Gates to Build on Success... Defense Secretary Robert M. Gates wants to build on those successes. Story: Afghan Army Makes...

  13. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background: Innovation through an open source model has proven to be successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. Methods: A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results: Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions: We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model, as well as the effect of patents. PMID:21955914

  14. A Comparison of Two Commercial Volumetry Software Programs in the Analysis of Pulmonary Ground-Glass Nodules: Segmentation Capability and Measurement Accuracy

    PubMed Central

    Kim, Hyungjin; Lee, Sang Min; Lee, Hyun-Ju; Goo, Jin Mo

    2013-01-01

    Objective: To compare the segmentation capability of the two currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. Materials and Methods: In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using two volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually, and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. Results: The successful nodule segmentation rate was significantly higher with LungCARE (90.9%) than with LungVCAR (72.7%) (p = 0.012). Vascular attachment was a morphologic feature that negatively influenced nodule segmentation for both software programs. As for measurement accuracy, the mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. Conclusion: LungCARE shows a significantly higher segmentation success rate than LungVCAR. The measurement accuracy of volume and attenuation of GGNs is acceptable for GGNs ≥ 10 mm with both software programs. PMID:23901328

  15. Collaboration in Global Software Engineering Based on Process Description Integration

    NASA Astrophysics Data System (ADS)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  16. Support for comprehensive reuse

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Rombach, H. D.

    1991-01-01

    Reuse of products, processes, and other knowledge will be the key to enabling the software industry to achieve the dramatic improvements in productivity and quality required to satisfy anticipated growing demands. Although experience shows that certain kinds of reuse can be successful, general success has been elusive. A software life-cycle technology which allows comprehensive reuse of all kinds of software-related experience could provide the means of achieving the desired order-of-magnitude improvements. A comprehensive framework of models, model-based characterization schemes, and support mechanisms for better understanding, evaluating, planning, and supporting all aspects of reuse is introduced.

  17. Evaluating Different Green School Building Designs for Albania: Indoor Thermal Comfort, Energy Use Analysis with Solar Systems

    NASA Astrophysics Data System (ADS)

    Dalvi, Ambalika Rajendra

    Improving the conditions of schools in many parts of the world is gradually acquiring importance. The Green School movement is an integral part of this effort, since it aims at improving indoor environmental conditions; this in turn enhances student learning while minimizing adverse environmental impact through the energy efficiency of comfort-related HVAC and lighting systems. This research, which is part of a larger research project, aims at evaluating different school building designs in Albania in terms of energy use and indoor thermal comfort, and at identifying energy-efficient options for existing schools. We start by identifying three different climate zones in Albania: coastal (Durres), hill/pre-mountainous (Tirana), and mountainous (Korca). Next, two prototypical school building designs are identified from the existing stock. Numerous scenarios are then identified for analysis, consisting of combinations of climate zone, building type, building orientation, building upgrade level, and presence of renewable energy systems (solar photovoltaic and solar water heater). The existing building layouts were initially outlined in CAD software and then imported into a detailed building energy software program (eQuest) to perform annual simulations for all scenarios. The research also predicted indoor thermal comfort conditions of the various scenarios on the premise that windows could be opened to provide natural ventilation cooling when appropriate. This study also estimated the energy generated by solar photovoltaic and solar water heater systems placed on the available roof area, to determine the extent to which they can meet the required electric loads (plugs and lights) and building heating loads, respectively. The results showed that there is adequate indoor comfort without the need for mechanical cooling in all three climate zones, and that only heating is needed during the winter months.
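
    A first-order estimate of roof-mounted PV generation of the kind this study performs can be sketched with the standard area-efficiency-insolation formula; the efficiency, insolation, and performance-ratio values below are illustrative assumptions, not the study's eQuest inputs:

        # E = A * eta * H * PR: standard first-order PV yield estimate.
        # All parameter values here are assumptions for illustration.

        def annual_pv_yield_kwh(roof_area_m2: float,
                                panel_efficiency: float,
                                annual_insolation_kwh_m2: float,
                                performance_ratio: float = 0.75) -> float:
            return (roof_area_m2 * panel_efficiency
                    * annual_insolation_kwh_m2 * performance_ratio)

        # Example: 400 m^2 of usable roof in a sunny coastal zone such as Durres
        print(f"{annual_pv_yield_kwh(400.0, 0.18, 1600.0):,.0f} kWh/year")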

  18. Defect measurement and analysis of JPL ground software: a case study

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Spagnuolo, John N., Jr.

    2004-01-01

    Ground software systems at JPL must meet high assurance standards while remaining on schedule, due to the relatively immovable launch dates of the spacecraft that such systems will control. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data from JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. The predictive models will quantitatively define typical trends for JPL ground systems, as well as Critical Discriminators (CDs) that explain atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the degree to which a project's software quality activities are likely to deviate from the normal JPL ground system, based on past experience across the lab.
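
    The abstract does not give the models' functional form; one toy way to picture a typical-trend baseline adjusted by critical discriminators known at planning time is sketched below (the baseline rate, discriminator names, and coefficients are all hypothetical):

        # Toy trend-plus-discriminators defect prediction sketch.
        # Numbers and CD names are invented; they are not JPL's values.

        BASELINE_DEFECTS_PER_KSLOC = 5.0  # hypothetical typical trend

        CD_ADJUSTMENTS = {                # multiplicative, knowable at planning
            "new_team": 1.4,
            "reused_architecture": 0.7,
            "compressed_schedule": 1.3,
        }

        def predicted_defects(ksloc: float, discriminators: list[str]) -> float:
            rate = BASELINE_DEFECTS_PER_KSLOC
            for cd in discriminators:
                rate *= CD_ADJUSTMENTS.get(cd, 1.0)
            return rate * ksloc

        # 120 KSLOC project reusing an architecture on a compressed schedule
        print(predicted_defects(120.0, ["reused_architecture", "compressed_schedule"]))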

  19. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of the security software process are discussed, as are the necessity and present significance of using such a process. The process for security software and its testing, coordinated with the functional software, is discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to the power information platform.

  20. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper hence proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  1. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validated the efforts.

  2. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  3. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    This paper describes one such reference process that can be deployed to provide continuous, automated, condition-based maintenance management for buildings that have BIM, a building automation system (BAS), and a computerized maintenance management system (CMMS). The process can be deployed using VOLTTRON™, an open source transactional network platform designed for distributed sensing and controls that supports both energy efficiency and grid services.
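
    The core loop of such a process can be pictured as comparing BAS sensor streams against BIM design limits and raising a CMMS work order on sustained deviation. The sketch below is a generic illustration under that assumption; the point names, limits, and work-order format are hypothetical, and this is not the VOLTTRON agent API:

        # Condition-based maintenance check, sketched generically.
        # All names and thresholds are hypothetical.

        DESIGN_LIMITS = {"AHU1/supply_air_temp": (12.0, 16.0)}  # degC, from BIM

        def check_point(point: str, readings: list[float],
                        cmms_queue: list[dict]) -> None:
            low, high = DESIGN_LIMITS[point]
            out_of_range = [r for r in readings if not (low <= r <= high)]
            # Require sustained deviation before opening a work order
            if len(out_of_range) > len(readings) // 2:
                cmms_queue.append({"point": point,
                                   "issue": "sustained out-of-range",
                                   "samples": out_of_range[:5]})

        work_orders: list[dict] = []
        check_point("AHU1/supply_air_temp", [18.2, 17.9, 18.5, 13.0], work_orders)
        print(work_orders)  # one work order for the out-of-range AHU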

  5. Socio-Cognitive Dynamics of Knowledge Building in the Work of 9- and 10-Year-Olds

    ERIC Educational Resources Information Center

    Zhang, Jianwei; Scardamalia, Marlene; Lamon, Mary; Messina, Richard; Reeve, Richard

    2007-01-01

    This study examines four months of online discourse of 22 Grade 4 students engaged in efforts to advance their understanding of optics. Their work is part of a school-wide knowledge building initiative, the essence of which is giving students collective responsibility for idea improvement. This goal is supported by software--Knowledge…

  6. USE OF ROUGH SETS AND SPECTRAL DATA FOR BUILDING PREDICTIVE MODELS OF REACTION RATE CONSTANTS

    EPA Science Inventory

    A model for predicting the log of the rate constants for alkaline hydrolysis of organic esters has been developed with the use of gas-phase mid-infrared library spectra and a rule-building software system based on the mathematical theory of rough sets. A diverse set of 41 esters ...
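
    Rough-set rule builders typically emit if-then rules over discretized features; applying such rules might look like the sketch below, where the spectral features, cut-points, and predicted log k intervals are all invented for illustration (the abstract does not give the induced rules):

        # Applying hypothetical rough-set-style rules: discretized spectral
        # features on the left, a predicted log k interval on the right.

        rules = [
            ({"C=O_stretch": "strong", "ring": "absent"},  "log k in [-2, -1)"),
            ({"C=O_stretch": "strong", "ring": "present"}, "log k in [-3, -2)"),
        ]

        def classify(spectrum_features: dict) -> str:
            for conditions, prediction in rules:
                if all(spectrum_features.get(k) == v
                       for k, v in conditions.items()):
                    return prediction
            return "no matching rule (boundary region)"

        print(classify({"C=O_stretch": "strong", "ring": "absent"}))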

  7. How to Break through Techno-Shock and Build Multi-Media Units.

    ERIC Educational Resources Information Center

    Sutz, Rachel; Warren, Maria W.; Williams, Holly

    1998-01-01

    Describes how three teachers learned about using Hyperstudio (presentation software), constructed a Web page, and created an original film as part of a unit on Florida writers. Recommends three major strategies for learning a new technology: choose the literary works, then the technology; build on your strengths; and learn to talk to the…

  8. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, a standard Computer-Aided Software Environment (CASE) tool notation, and a standard CASE tool repository has limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations as well as assemble new tools on demand from existing tools and architecture design repositories.

  9. Space Shuttle Usage of z/OS

    NASA Technical Reports Server (NTRS)

    Green, Jan

    2009-01-01

    This viewgraph presentation gives a detailed description of the avionics associated with the Space Shuttle's data processing system and its usage of z/OS. The contents include: 1) Mission, Products, and Customers; 2) Facility Overview; 3) Shuttle Data Processing System; 4) Languages and Compilers; 5) Application Tools; 6) Shuttle Flight Software Simulator; 7) Software Development and Build Tools; and 8) Fun Facts and Acronyms.

  10. Model-It: A Case Study of Learner-Centered Software Design for Supporting Model Building.

    ERIC Educational Resources Information Center

    Jackson, Shari L.; Stratford, Steven J.; Krajcik, Joseph S.; Soloway, Elliot

    Learner-centered software design (LCSD) guides the design of tasks, tools, and interfaces in order to support the unique needs of learners: growth, diversity and motivation. This paper presents a framework for LCSD and describes a case study of its application to the ScienceWare Model-It, a learner-centered tool to support scientific modeling and…

  11. NoSQL Data Store Technologies

    DTIC Science & Technology

    2014-09-01

    Klein, John; Donohoe, Patrick; Ernst, Neil (Software Engineering Institute)

    ... distribute data; 4. Data Replication: determines how a NoSQL database facilitates reliable, high-performance data replication to build ...

  12. Intent Specifications: An Approach to Building Human-Centered Specifications

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    1999-01-01

    This paper examines and proposes an approach to writing software specifications, based on research in systems theory, cognitive psychology, and human-machine interaction. The goal is to provide specifications that support human problem solving and the tasks that humans must perform in software development and evolution. A type of specification, called intent specifications, is constructed upon this underlying foundation.

  13. Statistical Theory for the "RCT-YES" Software: Design-Based Causal Inference for RCTs. NCEE 2015-4011

    ERIC Educational Resources Information Center

    Schochet, Peter Z.

    2015-01-01

    This report presents the statistical theory underlying the "RCT-YES" software that estimates and reports impacts for RCTs for a wide range of designs used in social policy research. The report discusses a unified, non-parametric design-based approach for impact estimation using the building blocks of the Neyman-Rubin-Holland causal…
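
    The design-based building block referenced here is, in its simplest two-arm form, the Neyman difference-in-means estimator with its conservative variance estimator; a generic textbook statement (RCT-YES itself covers many more designs) is:

        % Two-arm Neyman design-based estimator (generic form, not the
        % report's full notation)
        \hat{\tau} = \bar{Y}_T - \bar{Y}_C,
        \qquad
        \widehat{\operatorname{Var}}(\hat{\tau})
            = \frac{s_T^2}{n_T} + \frac{s_C^2}{n_C}

    The variance estimator is conservative because the treatment-control covariance term is not identifiable under the design-based model.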

  14. The Development and Evaluation of Software to Foster Professional Development in Educational Assessment

    ERIC Educational Resources Information Center

    Benton, Morgan C.

    2008-01-01

    This dissertation sought to answer the question: is it possible to build a software tool that will allow teachers to write better multiple-choice questions? The thesis proceeded from the finding that the quality of teaching strongly influences how much students learn. A basic premise of this research, then, is that improving teachers…

  15. The Design and Evaluation of Class Exercises as Active Learning Tools in Software Verification and Validation

    ERIC Educational Resources Information Center

    Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil

    2016-01-01

    It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…

  16. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  17. Open Source Software and Design-Based Research Symbiosis in Developing 3D Virtual Learning Environments: Examples from the iSocial Project

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla

    2014-01-01

    Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…

  18. Seismic performance for vertical geometric irregularity frame structures

    NASA Astrophysics Data System (ADS)

    Ismail, R.; Mahmud, N. A.; Ishak, I. S.

    2018-04-01

    This research presents results for vertical geometric irregularity frame structures. The finite element analysis software LUSAS was used to analyse seismic performance, focusing on irregular frames that differ in floor height and are set back at the middle of the building. Malaysia's building structures were affected when earthquakes took place in neighbouring countries such as Indonesia (Sumatera Island). In Malaysia, concrete is widely used in building construction, yet it has limited tension resistance. Structural behaviour under horizontal and vertical static loads is commonly analysed using plane frame analysis. The aim of this case study is to determine the stress and displacement in the seismic response of this type of irregular frame structure. The study is based on the seven-storey building of the Clinical Training Centre located in Sungai Buloh, Selayang, Selangor. Data recorded from the large earthquake that occurred in Acheh, Indonesia on December 26, 2004 were used in conducting this research. The stress and displacement results from IMPlus seismic analysis in LUSAS Modeller software, for the seismic response of a formwork frame system, indicate that the building is safe to withstand the ground motion and remains in good condition under the variation of seismic performance considered.

  19. Design, Development and Pre-Flight Testing of the Communications, Navigation, and Networking Reconfigurable Testbed (Connect) to Investigate Software Defined Radio Architecture on the International Space Station

    NASA Technical Reports Server (NTRS)

    Over, Ann P.; Barrett, Michael J.; Reinhart, Richard C.; Free, James M.; Cikanek, Harry A., III

    2011-01-01

    The Communication Navigation and Networking Reconfigurable Testbed (CoNNeCT) is a NASA-sponsored mission which will investigate the usage of Software Defined Radios (SDRs) as a multi-function communication system for space missions. A software-defined radio system is a communication system in which typical components of the system (e.g., modulators) are implemented in software. The software-defined capability allows flexibility and experimentation with different modulation, coding, and other parameters to understand their effects on performance. This flexibility builds inherent redundancy into the system, supporting improved operational efficiency, real-time changes to space missions, and enhanced reliability. The CoNNeCT Project is a collaboration between industrial radio providers and NASA: the industrial radio providers are providing the SDRs, and NASA is designing, building, and testing the entire flight system. The flight system will be integrated on the Express Logistics Carrier (ELC) on the International Space Station (ISS) after launch on the H-IIB Transfer Vehicle in 2012. This paper provides an overview of the technology research objectives, payload description, design challenges, and pre-flight testing results.
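
    The idea that typical components such as modulators are implemented in software can be illustrated with a minimal generic modulator; the sketch below is plain BPSK with assumed carrier and sampling parameters, not the CoNNeCT waveform code:

        import numpy as np

        # Generic software modulator: the "hardware" is just arithmetic on
        # samples, so changing modulation or coding is a software change.

        def bpsk_modulate(bits, carrier_hz=1000.0, sample_rate=48000.0,
                          samples_per_bit=480):
            t = np.arange(samples_per_bit) / sample_rate
            symbols = np.where(np.asarray(bits) == 1, 1.0, -1.0)
            return np.concatenate(
                [s * np.cos(2 * np.pi * carrier_hz * t) for s in symbols])

        waveform = bpsk_modulate([1, 0, 1, 1])
        print(waveform.shape)  # (1920,): four bit intervals of sampled carrier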

  20. Analysis of Factors Influencing Building Refurbishment Project Performance

    NASA Astrophysics Data System (ADS)

    Ishak, Nurfadzillah; Aswad Ibrahim, Fazdliel; Azizi Azizan, Muhammad

    2018-03-01

    Presently, the refurbishment approach has become favourable as it creates opportunities to incorporate sustainable value with other building improvements. This approach needs to be implemented due to the overwhelming ratio of existing buildings to new construction, which can also contribute to environmental problems. Refurbishment principles aim to minimize the environmental impact and to upgrade the performance of an existing building to meet new requirements. In theory, a building project's performance has a direct bearing on its potential for success. However, in refurbishment projects the measurement criteria become wider, because such projects are complex and multi-dimensional, encompassing many factors that reflect the nature of the works. This can therefore be achieved by examining the direct empirical relationship between critical success factors (CSFs) and complexity factors (CFs) in managing the project and delivering successful project performance. The research findings are expected to serve as the basis of future research to establish an appropriate framework that provides information on managing refurbishment building projects and enhances project management competency for a better built environment.
