Sample records for existing legacy software

  1. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally performing a software safety risk assessment that provides software safety measurements for legacy systems, which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  2. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  3. Web-Based Environment for Maintaining Legacy Software

    NASA Technical Reports Server (NTRS)

    Tigges, Michael; Thompson, Nelson; Orr, Mark; Fox, Richard

    2007-01-01

    Advanced Tool Integration Environment (ATIE) is the name of both a software system and a Web-based environment created by the system for maintaining an archive of legacy software and the expertise involved in developing the legacy software. ATIE can also be used in modifying legacy software and developing new software. The information that can be encapsulated in ATIE includes experts' documentation, input and output data of test cases, source code, and compilation scripts. All of this information is available within a common environment and retained in a database for ease of access and recovery by use of powerful search engines. ATIE also accommodates the embedding of supporting software that users require for their work, and even enables access to supporting commercial-off-the-shelf (COTS) software within the flow of the expert's work. The flow of work can be captured by saving the sequence of computer programs that the expert uses. A user gains access to ATIE via a Web browser. A modern Web-based graphical user interface promotes efficiency in the retrieval, execution, and modification of legacy code. Thus, ATIE saves time and money in the support of new and pre-existing programs.

  4. Enabling Air Force Satellite Ground System Automation Through Software Engineering

    DTIC Science & Technology

    US Air Force satellite ground stations require significant manpower to operate due to their fragmented legacy architectures. To improve operating...daily operations, but also the development, maintainability, and the extensibility of such systems. This thesis researches challenges to Air Force...satellite automation: 1) existing architecture of legacy systems, 2) space segment diversity, and 3) unclear definition and scoping of the term 'automation'

  5. A Heuristic for Improving Legacy Software Quality during Maintenance: An Empirical Case Study

    ERIC Educational Resources Information Center

    Sale, Michael John

    2017-01-01

    Many organizations depend on the functionality of mission-critical legacy software and the continued maintenance of this software is vital. Legacy software is defined here as software that contains no testing suite, is often foreign to the developer performing the maintenance, lacks meaningful documentation, and over time, has become difficult to…

  6. DataRocket: Interactive Visualisation of Data Structures

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; Ramsay, Craig

    2010-08-01

    CodeRocket is a software engineering tool that provides cognitive support to the software engineer for reasoning about a method or procedure and for documenting the resulting code [1]. DataRocket is a software engineering tool designed to support visualisation and reasoning about program data structures. DataRocket is part of the CodeRocket family of software tools developed by Rapid Quality Systems [2], a spin-out company from the Space Technology Centre at the University of Dundee. CodeRocket and DataRocket integrate seamlessly with existing architectural design and coding tools and provide extensive documentation with little or no effort on the part of the software engineer. Comprehensive, abstract, detailed design documentation is available early in a project so that it can be used for design reviews with project managers and non-expert stakeholders. Code and documentation remain fully synchronised even when changes are implemented in the code without reference to the existing documentation. At the end of a project, the press of a button suffices to produce the detailed design document. Existing legacy code can be easily imported into CodeRocket and DataRocket to reverse engineer detailed design documentation, making legacy code more manageable and adding substantially to its value. This paper introduces CodeRocket. It then explains the rationale for DataRocket and describes the key features of this new tool. Finally, the major benefits of DataRocket for different stakeholders are considered.

  7. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.

  8. Report of experiments and evidence for ASC L2 milestone 4467 : demonstration of a legacy application's path to exascale.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Matthew L.; Ferreira, Kurt Brian; Pedretti, Kevin Thomas Tauke

    2012-03-01

    This report documents thirteen of Sandia's contributions to the Computational Systems and Software Environment (CSSE) within the Advanced Simulation and Computing (ASC) program between fiscal years 2009 and 2012. It describes their impact on ASC applications. Most contributions are implemented in lower software levels, allowing for application improvement without source code changes. Improvements are identified in such areas as reduced run time, characterizing power usage, and Input/Output (I/O). Other experiments are more forward looking, demonstrating potential bottlenecks using mini-application versions of the legacy codes and simulating their network activity on Exascale-class hardware. The purpose of this report is to prove that the team has completed milestone 4467, Demonstration of a Legacy Application's Path to Exascale. Cielo is expected to be the last capability system on which existing ASC codes can run without significant modifications. This assertion will be tested to determine where the breaking point is for an existing highly scalable application. The goal is to stretch the performance boundaries of the application by applying recent CSSE R&D in areas such as resilience, power, I/O, visualization services, SMARTMAP, lightweight kernels (LWKs), virtualization, simulation, and feedback loops. Dedicated system time reservations and/or CCC allocations will be used to quantify the impact of system-level changes to extend the life and performance of the ASC code base. Finally, a simulation of anticipated exascale-class hardware will be performed using SST to supplement the calculations. Determine where the breaking point is for an existing highly scalable application: Chapter 15 presented the CSSE work that sought to identify the breaking point in two ASC legacy applications, Charon and CTH. Their mini-app versions were also employed to complete the task. There is no single breaking point, as more than one issue was found with the two codes. The results were that applications can expect to encounter performance issues related to the computing environment, system software, and algorithms. Careful profiling of runtime performance will be needed to identify the source of an issue, in strong combination with knowledge of system software and application source code.

  9. Multicore Considerations for Legacy Flight Software Migration

    NASA Technical Reports Server (NTRS)

    Vines, Kenneth; Day, Len

    2013-01-01

    In this paper we will discuss potential benefits and pitfalls when considering a migration from an existing single core code base to a multicore processor implementation. The results of this study present options that should be considered before migrating fault managers, device handlers and tasks with time-constrained requirements to a multicore flight software environment. Possible future multicore test bed demonstrations are also discussed.

  10. Multigeneration data migration from legacy systems

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Liu, Brent J.; Kho, Hwa T.; Tao, Wenchao; Wang, Cun; McCoy, J. Michael

    2003-05-01

    The migration of image data from different generations of legacy archive systems represents a technical challenge and an incremental cost in transitions to newer generations of PACS. UCLA medical center has elected to completely replace the existing PACS infrastructure, encompassing several generations of legacy systems, with a new commercial system providing enterprise-wide image management and communication. One of the most challenging parts of the project was the migration of large volumes of legacy images into the new system. Planning of the migration required the development of specialized software and hardware, and included different phases of data mediation from existing databases to the new PACS database prior to the migration of the image data. The project plan included a detailed analysis of the resources and cost of data migration to optimize the process and minimize the delay of a hybrid operation where the legacy systems need to remain operational. Our analysis and project planning showed that data migration represents the most critical path in the process of PACS renewal. Careful planning and optimization of the project timeline and the resources allocated are critical to minimize the financial impact and the time delays that such migrations can impose on the implementation plan.
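
    A minimal sketch of the kind of data mediation step described above, assuming hypothetical legacy and target schemas (the field names, normalization rules, and default values are illustrative, not those of the UCLA project):

    ```python
    # Hypothetical mediation step: map legacy archive records onto the new
    # PACS database schema before any image data is moved.  Field names and
    # normalization rules are illustrative assumptions, not the actual schemas.

    def mediate_record(legacy: dict) -> dict:
        """Translate one legacy study record into the new-schema form."""
        return {
            "patient_id": legacy["pat_id"].strip().upper(),     # normalize identifiers
            "study_uid": legacy["study_uid"],                    # carried over unchanged
            "modality": legacy.get("modality", "OT").upper(),    # default 'other' if missing
            "study_date": legacy["date"].replace("/", "-"),      # unify date formatting
            "legacy_archive": legacy["archive_generation"],      # provenance for the hybrid phase
        }

    def mediate_batch(legacy_records):
        """Mediate a batch, separating records that need manual review."""
        mediated, failures = [], []
        for rec in legacy_records:
            try:
                mediated.append(mediate_record(rec))
            except KeyError as missing:
                failures.append((rec, f"missing field {missing}"))
        return mediated, failures
    ```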

  11. Software Intensive Systems Data Quality and Estimation Research in Support of Future Defense Cost Analysis

    DTIC Science & Technology

    2012-03-13

    Legacy Maintenance and Brownfield Development 6.6.6 Agile and Kanban Development 6.6.7 Putting It All Together at the Large-Project or Enterprise Level...NDI)-intensive systems Ultrahigh software system assurance; Legacy maintenance and brownfield development; and Agile and kanban development. This...be furnished by NDI components or may need to be developed for special systems. Legacy Maintenance and Brownfield Development Fewer and fewer software

  12. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs. Finally, the software is managed in accordance with the CMMI (Capability Maturity Model Integration), where it has been appraised at maturity level 3.

  13. A Software Safety Risk Taxonomy for Use in Retrospective Safety Cases

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. The best time to include these requirements is early in the development lifecycle of the system. When software safety requirements are levied on a legacy system after the fact, a retrospective safety case will need to be constructed for the software in the system. This can be a difficult task because there may be few to no artifacts available to show compliance with the software safety requirements. The risks associated with not meeting safety requirements in a legacy safety-critical computer system must be addressed to give confidence for reuse. This paper introduces a proposal for a software safety risk taxonomy for legacy safety-critical computer systems, by specializing the Software Engineering Institute's 'Software Development Risk Taxonomy' with safety elements and attributes.

  14. Evolution of the Space Shuttle Primary Avionics Software and Avionics for Shuttle Derived Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.

    2011-01-01

    As a result of recommendations from the Augustine Panel, the direction for Human Space Flight has been altered from the original plan referred to as Constellation. NASA's Human Exploration Framework Team (HEFT) proposes the use of a Shuttle Derived Heavy Lift Launch Vehicle (SDLV) and an Orion-derived spacecraft (salvaged from Constellation) to support a new flexible direction for space exploration. The SDLV must be developed within an environment of a constrained budget and a preferred fast development schedule. Thus, it has been proposed to utilize existing assets from the Shuttle Program to speed development at a lower cost. These existing assets should not only include structures such as external tanks or solid rockets, but also the Flight Software, which has traditionally been a "long pole" in new development efforts. The avionics and software for the Space Shuttle were primarily developed in the 1970s and considered state of the art for that time. One may argue that the existing avionics and flight software are too outdated to support the new SDLV effort, but this is a fallacy if they can be evolved over time into a "modern avionics" platform. The technology may be outdated, but the avionics concepts and flight software algorithms are not. The reuse of existing avionics and software also allows for the reuse of development, verification, and operations facilities. The keyword is evolve, in that these assets can support the fast development of such a vehicle but then be gradually evolved over time towards more modern platforms as budget and schedule permit. The "gold" of the flight software is the "control loop" algorithms of the vehicle: the Guidance, Navigation, and Control (GNC) software algorithms. This software is typically the most expensive to develop, test, and verify. Thus, the approach is to preserve the GNC flight software, while first evolving the supporting software (such as Command and Data Handling, Caution and Warning, Telemetry, etc.). This can be accomplished by gradually removing the "support software" from the legacy flight software, leaving only the GNC algorithms. The "support software" could be re-developed for modern platforms, while leaving the GNC algorithms to execute on technology compatible with the legacy system. It is also possible to package the GNC algorithms into an emulated version of the original computer (via Field Programmable Gate Arrays, or FPGAs), thus becoming a "GNC on a Chip" solution that could live forever, embedded in modern avionics platforms.

  15. Traceability of Software Safety Requirements in Legacy Safety Critical Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    How can traceability of software safety requirements be created for legacy safety-critical systems? Requirements in safety standards are most often imposed during contract negotiations. On the other hand, there are instances where safety standards are levied on legacy safety-critical systems, some of which may be considered for reuse in new applications. Safety standards often specify that software development documentation include process-oriented and technical safety requirements, and also require that system and software safety analyses be performed to support implementation of the technical safety requirements. So what can be done if the requisite documents for establishing and maintaining safety requirements traceability are not available?

  16. Evolving software reengineering technology for the emerging innovative-competitive era

    NASA Technical Reports Server (NTRS)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state of the art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Robert; Rivers, Wilmer

    Any single computer program for seismic data analysis will not have all the capabilities needed to study reference events, since these detailed studies will be highly specialized. It may be necessary to develop and test new algorithms, and then these special codes must be integrated with existing software to use their conventional data-processing routines. We have investigated two means of establishing communications between the legacy and new codes: CORBA and XML/SOAP Web services. We have investigated making new Java code communicate with a legacy C-language program, geotool, running under Linux. Both methods were successful, but both were difficult to implement. C programs on UNIX/Linux are poorly supported for Web services, compared with the Java and .NET languages and platforms. Easier-to-use middleware will be required for scientists to construct distributed applications as easily as stand-alone ones. Considerable difficulty was encountered in modifying geotool, and this problem shows the need to use component-based user interfaces instead of large C-language codes where changes to one part of the program may introduce side effects into other parts. We have nevertheless made bug fixes and enhancements to that legacy program, but it remains difficult to expand it through communications with external software.
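
    As a rough illustration of the web-services approach described above, the sketch below posts a SOAP request from new analysis code to a hypothetical service wrapping a legacy routine; the endpoint URL, operation name, and parameters are assumptions for illustration, not the actual geotool interface:

    ```python
    # Minimal SOAP 1.1 call from new code to a legacy routine exposed as a
    # web service.  The endpoint URL, "FilterWaveform" operation, and its
    # parameters are hypothetical; only the HTTP/SOAP mechanics are generic.
    import urllib.request

    ENDPOINT = "http://localhost:8080/legacy/geotool-service"  # assumed wrapper URL

    def call_legacy_filter(station: str, low_hz: float, high_hz: float) -> str:
        envelope = f"""<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <FilterWaveform xmlns="urn:legacy-geotool">
          <station>{station}</station>
          <lowHz>{low_hz}</lowHz>
          <highHz>{high_hz}</highHz>
        </FilterWaveform>
      </soap:Body>
    </soap:Envelope>"""
        request = urllib.request.Request(
            ENDPOINT,
            data=envelope.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8",
                     "SOAPAction": "urn:legacy-geotool#FilterWaveform"},
        )
        with urllib.request.urlopen(request) as response:
            return response.read().decode("utf-8")  # raw SOAP reply for the caller to parse
    ```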

  18. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.

  19. Web Time-Management Tool

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Oak Grove Reactor, developed by Oak Grove Systems, is a new software program that allows users to integrate workflow processes. It can be used with portable communication devices. The software can join e-mail, calendar/scheduling, and legacy applications into one interactive system via the web. Priority tasks and due dates are organized and highlighted to keep the user up to date with developments. Reactor works with existing software, and few new skills are needed to use it. Using a web browser, a user can work on something while other users work on the same procedure or view its status while it is being worked on at another site. The software was developed by the Jet Propulsion Lab and originally put to use at Johnson Space Center.

  20. Solving the Software Legacy Problem with RISA

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Gabriel, C.

    2012-09-01

    Nowadays, hardware and system infrastructure evolve on time scales much shorter than the typical duration of space astronomy missions. Data processing software capabilities have to evolve to preserve the scientific return during the entire experiment lifetime. Software preservation is a key issue that has to be tackled before the end of the project to keep the data usable over many years. We present RISA (Remote Interface to Science Analysis) as a solution to decouple the data processing software and infrastructure life-cycles, using Java applications and web-service wrappers around existing software. This architecture employs embedded SAS in virtual machines, assuring a homogeneous job execution environment. We will also present the first studies to reactivate the data processing software of the EXOSAT mission, the first ESA X-ray astronomy mission launched in 1983, using the generic RISA approach.

  1. The Planck Legacy Archive

    NASA Astrophysics Data System (ADS)

    Dupac, X.; Arviset, C.; Fernandez Barreiro, M.; Lopez-Caniego, M.; Tauber, J.

    2015-12-01

    In 2015 the Planck Collaboration released its second major dataset through the Planck Legacy Archive (PLA). It includes cosmological, extragalactic, and Galactic science data in temperature (intensity) and polarization. Full-sky maps are provided with unprecedented angular resolution and sensitivity, together with a large number of ancillary maps, catalogues (generic, SZ clusters, and Galactic cold clumps), time-ordered data, and other information. The extensive cosmological likelihood package allows cosmologists to fully explore the plausible parameters of the Universe. A new web-based PLA user interface has been public since Dec. 2014, allowing easier and faster access to all Planck data and replacing the previous Java-based software. Numerous additional improvements to the PLA are also being developed through the so-called PLA Added-Value Interface, making use of an external contract with the Planetek Hellas and Expert Analytics software companies. This will allow users to process time-ordered data into sky maps, separate astrophysical components in existing maps, simulate the microwave and infrared sky through the Planck Sky Model, and use a number of other functionalities.

  2. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.

  3. RT 6 - Software Intensive Systems Data Quality and Estimation Research in Support of Future Defense Cost Analysis

    DTIC Science & Technology

    2012-03-13

    Brownfield Development 6.6.6 Agile and Kanban Development 6.6.7 Putting It All Together at the Large-Project or Enterprise Level 6.7 References 7...Ultrahigh software system assurance; Legacy maintenance and brownfield development; and Agile and kanban development. This chapter summarizes each...components or may need to be developed for special systems. Legacy Maintenance and Brownfield Development Fewer and fewer software-intensive systems have

  4. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques of project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    This software is a plug-in that interfaces between Phoenix Integration's Model Center and the Base SAS 9.2 applications. The end use of the plug-in is to link input and output data residing in SAS tables or MS SQL to and from "legacy" software programs without recoding. The potential end users are those who need to run legacy code and want their data stored in a SQL database.
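
    A generic sketch of the data-bridging idea (not the plug-in itself): pull inputs from a SQL table, hand them to an unmodified legacy executable through its native input file, and store the outputs back in SQL. The schema, file formats, and executable name are illustrative assumptions:

    ```python
    # Illustrative bridge between a SQL table and an unmodified legacy program.
    # The schema, the fixed-format input file, and "legacy_model.exe" are
    # hypothetical stand-ins, not the actual Model Center/SAS plug-in interfaces.
    import sqlite3
    import subprocess

    def run_legacy_case(db_path: str, case_id: int) -> None:
        conn = sqlite3.connect(db_path)
        # 1. Read the case inputs from SQL (no change to the legacy code needed).
        row = conn.execute(
            "SELECT pressure, temperature FROM case_inputs WHERE case_id = ?",
            (case_id,),
        ).fetchone()
        # 2. Write them in the legacy program's native input format.
        with open("case.inp", "w") as f:
            f.write(f"{row[0]:10.3f}{row[1]:10.3f}\n")
        # 3. Run the legacy executable as-is.
        subprocess.run(["legacy_model.exe", "case.inp", "case.out"], check=True)
        # 4. Push the result back into SQL for downstream tools.
        with open("case.out") as f:
            result = float(f.readline().split()[0])
        conn.execute(
            "INSERT INTO case_outputs (case_id, result) VALUES (?, ?)",
            (case_id, result),
        )
        conn.commit()
        conn.close()
    ```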

  6. Incremental Upgrade of Legacy Systems (IULS)

    DTIC Science & Technology

    2001-04-01

    analysis task employed SEI's Feature-Oriented Domain Analysis methodology (see FODA reference) and included several phases: • Context Analysis • Establish...Legacy, new Host and upgrade system and software. The Feature Oriented Domain Analysis approach (FODA, see SUM References) was used for this step...Feature-Oriented Domain Analysis (FODA) Feasibility Study (CMU/SEI-90-TR-21, ESD-90-TR-222); Software Engineering Institute, Carnegie Mellon University

  7. Data-Proximate Analysis and Visualization in the Cloud using Cloudstream, an Open-Source Application Streaming Technology Stack

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.

    2017-12-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream, and outline how to use Cloudstream to run and access an existing desktop application in the cloud.
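
    As a rough sketch of the pattern described here (assuming a hypothetical container image, configuration file, and port; this is not Cloudstream's actual configuration format), a small launcher can read a text configuration and start a containerized legacy tool for streaming:

    ```python
    # Sketch of launching an unmodified legacy desktop tool in a container so it
    # can be streamed to a browser.  The config keys, image name, and port are
    # hypothetical assumptions, not Cloudstream's actual configuration format.
    import configparser
    import subprocess

    def launch_streamed_app(config_path: str = "app.cfg") -> None:
        cfg = configparser.ConfigParser()
        cfg.read(config_path)
        image = cfg.get("app", "image", fallback="legacy-viz:latest")  # container with the legacy tool
        port = cfg.get("app", "port", fallback="6080")                 # port the streaming front end serves
        subprocess.run(
            ["docker", "run", "--rm", "-d",
             "-p", f"{port}:{port}",          # expose the streaming endpoint
             "--name", "legacy-viz-session",
             image],
            check=True,
        )
        print(f"Session available at http://localhost:{port}")
    ```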

  8. ORAC: 21st Century Observing at UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, A.; Wright, G. S.; Tan, M.; Pickup, D. A.; Economou, F.; Currie, M. J.; Adamson, A. J.; Rees, N. P.; Purves, M. H.

    The Observatory Reduction and Acquisition Control system replaces all of the existing software which interacts with the observers at UKIRT. The aim is to improve observing efficiency with a set of integrated tools that take the user from pre-observing preparation, through the acquisition of observations to the reduction using a data-driven pipeline. ORAC is designed to be flexible and extensible, and is intended for use with all future UKIRT instruments, as well as existing telescope hardware and ``legacy'' instruments. It is also designed to allow integration with phase-1 and queue-scheduled observing tools in anticipation of possible future requirements. A brief overview of the project and its relationship to other systems is given. ORAC also re-uses much code from other systems and we discuss issues relating to the trade-off between reuse and the generation of new software specific to our requirements.

  9. Experiences integrating autonomous components and legacy systems into tsunami early warning systems

    NASA Astrophysics Data System (ADS)

    Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.

    2012-04-01

    Fostered by and embedded in the general development of Information and Communication Technology (ICT), the evolution of Tsunami Early Warning Systems (TEWS) shows a significant development from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data not only includes sensors but also other components and systems offering services such as the delivery of feasible simulations used for forecasting in an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS), a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk will describe experiences gained in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS, using different software adapters and communication strategies to make the systems work together in a corporate infrastructure. The talk will also cover task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g. providing seismic and sea level data, and utilisation of special-purpose components, such as simulation systems, in TEWS will be presented.
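
    The adapter idea mentioned above can be sketched as follows; the sensor type, legacy record format, and common message fields are illustrative assumptions, not the actual GITEWS/DEWS/TRIDEC interfaces:

    ```python
    # Illustrative adapter wrapping a legacy stand-alone sensor feed so it can be
    # consumed through a common warning-system interface.  Field names and the
    # legacy record layout are assumptions made for this sketch.
    from abc import ABC, abstractmethod

    class SensorAdapter(ABC):
        """Common interface the warning system expects from every sensor source."""
        @abstractmethod
        def latest_observation(self) -> dict:
            ...

    class LegacySeaLevelAdapter(SensorAdapter):
        """Translates a legacy fixed-format record into the common message form."""
        def __init__(self, legacy_reader):
            self._reader = legacy_reader  # callable returning e.g. "STN01 2012-04-01T12:00Z 1.82"

        def latest_observation(self) -> dict:
            station, timestamp, level = self._reader().split()
            return {
                "source": "sea_level",
                "station": station,
                "time": timestamp,
                "value_m": float(level),
            }

    # Usage: the warning system only sees SensorAdapter, never the legacy format.
    adapter = LegacySeaLevelAdapter(lambda: "STN01 2012-04-01T12:00Z 1.82")
    print(adapter.latest_observation())
    ```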

  10. Transforming Legacy Systems to Obtain Information Superiority

    DTIC Science & Technology

    2001-01-01

    is imperative that innovative technologies be developed to enable legacy weapon systems to exploit the information revolution, achieve information ... dominance, and meet the required operational tempo. This paper presents an embedded-system architecture, open system middleware services, and a software

  11. SMART: Analyzing the Reuse Potential of Legacy Systems in Service- Oriented Architecture (SOA) Environments

    DTIC Science & Technology

    2009-04-09

    technical faculty for the Master in Software Engineering program at CMU. Grace holds a B.Sc. in Systems Engineering and an Executive MBA from Icesi...University in Cali, Colombia; and a Master in Software Engineering from Carnegie Mellon University. 3 Version 1.7.3—SEI Webinar—April 2009 © 2009 Carnegie...Resources and Training SMART Report • http://www.sei.cmu.edu/publications/documents/08.reports/08tn008.html Public Courses • Migration of Legacy

  12. SQL-RAMS

    NASA Technical Reports Server (NTRS)

    Alfaro, Victor O.; Casey, Nancy J.

    2005-01-01

    SQL-RAMS (where "SQL" signifies Structured Query Language and "RAMS" signifies Rocketdyne Automated Management System) is a successor to the legacy version of RAMS -- a computer program used to manage all work, nonconformance, corrective action, and configuration management on rocket engines and ground support equipment at Stennis Space Center. The legacy version resided in the FileMaker Pro software system and was constructed in modules that could act as standalone programs. There was little or no integration among modules. Because of limitations on file-management capabilities in FileMaker Pro, and because of the difficulty of integrating FileMaker Pro with other software systems for exchange of data using such industry standards as SQL, the legacy version of RAMS proved to be limited, and working to circumvent its limitations too time-consuming. In contrast, SQL-RAMS is an integrated SQL-server-based program that supports all data-exchange software industry standards. Whereas in the legacy version it was necessary to access individual modules to gain insight into a particular work-status document, SQL-RAMS provides access through a single-screen presentation of core modules. In addition, SQL-RAMS enables rapid and efficient filtering of displayed statuses by predefined categories and test numbers. SQL-RAMS is rich in functionality and encompasses significant improvements over the legacy system. It provides users the ability to perform many tasks, which in the past required administrator intervention. Additionally, many of the design limitations have been corrected, allowing for a robust application that is user centric.

  13. SQL-RAMS

    NASA Technical Reports Server (NTRS)

    Alfaro, Victor O.; Casey, Nancy J.

    2005-01-01

    SQL-RAMS (where "SQL" signifies Structured Query Language and "RAMS" signifies Rocketdyne Automated Management System) is a successor to the legacy version of RAMS, a computer program used to manage all work, nonconformance, corrective action, and configuration management on rocket engines and ground support equipment at Stennis Space Center. The legacy version resided in the FileMaker Pro software system and was constructed in modules that could act as stand-alone programs. There was little or no integration among modules. Because of limitations on file-management capabilities in FileMaker Pro, and because of the difficulty of integrating FileMaker Pro with other software systems for exchange of data using such industry standards as SQL, the legacy version of RAMS proved to be limited, and working to circumvent its limitations too time-consuming. In contrast, SQL-RAMS is an integrated SQL-server-based program that supports all data-exchange software industry standards. Whereas in the legacy version it was necessary to access individual modules to gain insight into a particular work-status document, SQL-RAMS provides access through a single-screen presentation of core modules. In addition, SQL-RAMS enables rapid and efficient filtering of displayed statuses by predefined categories and test numbers. SQL-RAMS is rich in functionality and encompasses significant improvements over the legacy system. It provides users the ability to perform many tasks, which in the past required administrator intervention. Additionally, many of the design limitations have been corrected, allowing for a robust application that is user centric.

  14. Analyzing Language in Suicide Notes and Legacy Tokens.

    PubMed

    Egnoto, Michael J; Griffin, Darrin J

    2016-03-01

    Identifying precursors that will aid in the discovery of individuals who may harm themselves or others has long been a focus of scholarly research. This work set out to determine if it is possible to use the legacy tokens of active shooters and notes left by individuals who completed suicide to uncover signals that foreshadow their behavior. A total of 25 suicide notes and 21 legacy tokens were compared with a sample of over 20,000 student writings for a preliminary computer-assisted text analysis to determine what differences can be coded with existing computer software to better identify students who may commit self-harm or harm to others. The results support that text analysis techniques with the Linguistic Inquiry and Word Count (LIWC) tool are effective for identifying suicidal or homicidal writings as distinct from each other and from a variety of student writings in an automated fashion. Findings indicate support for automated identification of writings that were associated with harm to self, harm to others, and various other student writing products. This work begins to uncover the viability of larger-scale, low-cost methods of automatic detection for individuals suffering from harmful ideation.
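
    A minimal sketch of the kind of category-based text analysis described above (the category word lists below are tiny illustrative stand-ins, not the actual LIWC dictionaries):

    ```python
    # Toy LIWC-style analysis: score documents by the relative frequency of words
    # in a few categories, then compare labelled groups.  The word lists are
    # illustrative stand-ins, far smaller than the real LIWC dictionaries.
    import re
    from statistics import mean

    CATEGORIES = {
        "first_person": {"i", "me", "my", "mine", "myself"},
        "negative_emotion": {"hate", "hurt", "worthless", "angry", "sad"},
        "social": {"friend", "family", "mother", "father", "we"},
    }

    def category_rates(text: str) -> dict:
        words = re.findall(r"[a-z']+", text.lower())
        total = max(len(words), 1)
        return {name: sum(w in vocab for w in words) / total
                for name, vocab in CATEGORIES.items()}

    def compare_groups(groups: dict) -> None:
        """Print mean category rates per labelled group of documents."""
        for label, docs in groups.items():
            rates = [category_rates(d) for d in docs]
            summary = {c: round(mean(r[c] for r in rates), 3) for c in CATEGORIES}
            print(label, summary)

    compare_groups({
        "notes": ["I hate myself and I hurt"],
        "student_writing": ["my friend and family visited, we were happy"],
    })
    ```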

  15. Controlling Infrastructure Costs: Right-Sizing the Mission Control Facility

    NASA Technical Reports Server (NTRS)

    Martin, Keith; Sen-Roy, Michael; Heiman, Jennifer

    2009-01-01

    Johnson Space Center's Mission Control Center is a space vehicle, space program agnostic facility. The current operational design is essentially identical to the original facility architecture that was developed and deployed in the mid-90's. In an effort to streamline the support costs of the mission critical facility, the Mission Operations Division (MOD) of Johnson Space Center (JSC) has sponsored an exploratory project to evaluate and inject current state-of-the-practice Information Technology (IT) tools, processes and technology into legacy operations. The general push in the IT industry has been trending towards a data-centric computer infrastructure for the past several years. Organizations facing challenges with facility operations costs are turning to creative solutions combining hardware consolidation, virtualization and remote access to meet and exceed performance, security, and availability requirements. The Operations Technology Facility (OTF) organization at the Johnson Space Center has been chartered to build and evaluate a parallel Mission Control infrastructure, replacing the existing, thick-client distributed computing model and network architecture with a data center model utilizing virtualization to provide the MCC Infrastructure as a Service. The OTF will design a replacement architecture for the Mission Control Facility, leveraging hardware consolidation through the use of blade servers, increasing utilization rates for compute platforms through virtualization while expanding connectivity options through the deployment of secure remote access. The architecture demonstrates the maturity of the technologies generally available in industry today and the ability to successfully abstract the tightly coupled relationship between thick-client software and legacy hardware into a hardware agnostic "Infrastructure as a Service" capability that can scale to meet future requirements of new space programs and spacecraft. This paper discusses the benefits and difficulties that a migration to cloud-based computing philosophies has uncovered when compared to the legacy Mission Control Center architecture. The team consists of system and software engineers with extensive experience with the MCC infrastructure and software currently used to support the International Space Station (ISS) and Space Shuttle program (SSP).

  16. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    PubMed

    Lemaitre, D; Sauquet, D; Fofol, I; Tanguy, L; Jean, F C; Degoulet, P

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy that has been developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists of separating the interface from the data storage and application control, and using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described.

  17. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system; the NASA-STD-8719.13B Software Safety Standard is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety cases: a) a documented demonstration that a system complies with the specified safety requirements; b) evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]; c) problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  18. Reengineering legacy software to object-oriented systems

    NASA Technical Reports Server (NTRS)

    Pitman, C.; Braley, D.; Fridge, E.; Plumb, A.; Izygon, M.; Mears, B.

    1994-01-01

    NASA has a legacy of complex software systems that are becoming increasingly expensive to maintain. Reengineering is one approach to modernizing these systems. Object-oriented technology, other modern software engineering principles, and automated tools can be used to reengineer the systems and will help to keep maintenance costs of the modernized systems down. The Software Technology Branch at the NASA/Johnson Space Center has been developing and testing reengineering methods and tools for several years. The Software Technology Branch is currently providing training and consulting support to several large reengineering projects at JSC, including the Reusable Objects Software Environment (ROSE) project, which is reengineering the flight analysis and design system (over 2 million lines of FORTRAN code) into object-oriented C++. Many important lessons have been learned during the past years; one of these is that the design must never be allowed to diverge from the code during maintenance and enhancement. Future work on open, integrated environments to support reengineering is being actively planned.

  19. Legacy systems: managing evolution through integration in a distributed and object-oriented computing environment.

    PubMed Central

    Lemaitre, D.; Sauquet, D.; Fofol, I.; Tanguy, L.; Jean, F. C.; Degoulet, P.

    1995-01-01

    Legacy systems are crucial for organizations since they support key functionalities. But they become obsolete with aging and the appearance of new techniques. Managing their evolution is a key issue in software engineering. This paper presents a strategy that has been developed at Broussais University Hospital in Paris to make a legacy system devoted to the management of health care units evolve towards new, up-to-date software. A two-phase evolution pathway is described. The first phase consists of separating the interface from the data storage and application control, and using a communication channel between the individualized components. The second phase proposes to use an object-oriented DBMS in place of the homegrown system. An application example for the management of hypertensive patients is described. PMID:8563252

  20. Challenges and Demands on Automated Software Revision

    NASA Technical Reports Server (NTRS)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system react to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  1. Software Assurance: Five Essential Considerations for Acquisition Officials

    DTIC Science & Technology

    2007-05-01

    2 • address security concerns in the software development life cycle (SDLC)? • Are there formal software quality...What threat modeling process, if any, is used when designing the software? What analysis, design, and construction tools are used by your software design...the-shelf (COTS), government off-the-shelf (GOTS), open-source, embedded, and legacy software. Attackers exploit unintentional vulnerabilities or

  2. A TTC upgrade proposal using bidirectional 10G-PON FTTH technology

    NASA Astrophysics Data System (ADS)

    Kolotouros, D. M.; Baron, S.; Soos, C.; Vasey, F.

    2015-04-01

    A new generation FPGA-based Timing-Trigger and Control (TTC) system based on emerging Passive Optical Network (PON) technology is being proposed to replace the existing off-detector TTC system used by the LHC experiments. High split ratio, dynamic software partitioning, low and deterministic latency, as well as low jitter are required. Exploiting the latest available technologies allows delivering higher capacity together with bidirectionality, a feature absent from the legacy TTC system. This article focuses on the features and capabilities of the latest TTC-PON prototype based on 10G-PON FTTH components along with some metrics characterizing its performance.

  3. CCSDS Spacecraft Monitor and Control Mission Operations Interoperability Prototype

    NASA Technical Reports Server (NTRS)

    Lucord, Steve; Martinez, Lindolfo

    2009-01-01

    We are entering a new era in space exploration. Reduced operating budgets require innovative solutions to leverage existing systems to implement the capabilities of future missions. Custom solutions to fulfill mission objectives are no longer viable. Can NASA adopt international standards to reduce costs and increase interoperability with other space agencies? Can legacy systems be leveraged in a service oriented architecture (SOA) to further reduce operations costs? The Operations Technology Facility (OTF) at the Johnson Space Center (JSC) is collaborating with Deutsches Zentrum fur Luft- und Raumfahrt (DLR) to answer these very questions. The Mission Operations and Information Management Services Area (MOIMS) Spacecraft Monitor and Control (SM&C) Working Group within the Consultative Committee for Space Data Systems (CCSDS) is developing the Mission Operations standards to address this problem space. The set of proposed standards presents a service oriented architecture to increase the level of interoperability among space agencies. The OTF and DLR are developing independent implementations of the standards as part of an interoperability prototype. This prototype will address three key components: validation of the SM&C Mission Operations protocol, exploration of the Object Management Group (OMG) Data Distribution Service (DDS), and the incorporation of legacy systems in a SOA. The OTF will implement the service providers described in the SM&C Mission Operation standards to create a portal for interaction with a spacecraft simulator. DLR will implement the service consumers to perform the monitor and control of the spacecraft. The specifications insulate the applications from the underlying transport layer. We will gain experience with a DDS transport layer as we delegate responsibility to the middleware and explore transport bridges to connect disparate middleware products. A SOA facilitates the reuse of software components. The prototype will leverage the capabilities of existing legacy systems. Various custom applications and middleware solutions will be combined into one system providing the illusion of a set of homogenous services. This paper will document our journey as we implement the interoperability prototype. The team consists of software engineers with experience on the current command, telemetry and messaging systems that support the International Space Station (ISS) and Space Shuttle programs. Emphasis will be on the objectives, results and potential cost saving benefits.

  4. Product Engineering Class in the Software Safety Risk Taxonomy for Building Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Victor, Daniel

    2008-01-01

    When software safety requirements are imposed on legacy safety-critical systems, retrospective safety cases need to be formulated as part of recertifying the systems for further use, and risks must be documented and managed to give confidence for reusing the systems. The SEI Software Development Risk Taxonomy [4] focuses on general software development issues. It does not, however, cover all the safety risks. The Software Safety Risk Taxonomy [8] was developed which provides a construct for eliciting and categorizing software safety risks in a straightforward manner. In this paper, we present extended work on the taxonomy for safety that incorporates the additional issues inherent in the development and maintenance of safety-critical systems with software. An instrument called a Software Safety Risk Taxonomy Based Questionnaire (TBQ) is generated containing questions addressing each safety attribute in the Software Safety Risk Taxonomy. Software safety risks are surfaced using the new TBQ and then analyzed. In this paper we give the definitions for the specialized Product Engineering Class within the Software Safety Risk Taxonomy. At the end of the paper, we present the tool known as the 'Legacy Systems Risk Database Tool' that is used to collect and analyze the data required to show traceability to a particular safety standard.

  5. PyMidas: Interface from Python to Midas

    NASA Astrophysics Data System (ADS)

    Maisala, Sami; Oittinen, Tero

    2014-01-01

    PyMidas is an interface between Python and MIDAS, the major ESO legacy general purpose data processing system. PyMidas allows a user to exploit both the rich legacy of MIDAS software and the power of Python scripting in a unified interactive environment. PyMidas also allows the usage of other Python-based astronomical analysis systems such as PyRAF.

  6. Architecture for autonomy

    NASA Astrophysics Data System (ADS)

    Broten, Gregory S.; Monckton, Simon P.; Collier, Jack; Giesbrecht, Jared

    2006-05-01

    In 2002 Defence R&D Canada changed research direction from pure tele-operated land vehicles to general autonomy for land, air, and sea craft. The unique constraints of the military environment coupled with the complexity of autonomous systems drove DRDC to carefully plan a research and development infrastructure that would provide state-of-the-art tools without restricting research scope. DRDC's long-term objectives for its autonomy program address disparate unmanned ground vehicle (UGV), unattended ground sensor (UGS), air (UAV), and subsea and surface (UUV and USV) vehicles operating together with minimal human oversight. Individually, these systems will range in complexity from simple reconnaissance mini-UAVs streaming video to sophisticated autonomous combat UGVs exploiting embedded and remote sensing. Together, these systems can provide low-risk, long-endurance battlefield services, assuming they can communicate and cooperate with manned and unmanned systems. A key enabling technology for this new research is a software architecture capable of meeting both DRDC's current and future requirements. DRDC built upon recent advances in the computing science field while developing its software architecture, known as the Architecture for Autonomy (AFA). Although a well-established practice in computing science, frameworks have only recently entered common use by unmanned vehicles. For industry and government, the complexity, cost, and time to re-implement stable systems often exceed the perceived benefits of adopting a modern software infrastructure. Thus, most persevere with legacy software, adapting and modifying software when and wherever possible or necessary -- adopting strategic software frameworks only when no justifiable legacy exists. Conversely, academic programs with short one- or two-year projects frequently exploit strategic software frameworks but with little enduring impact. The open-source movement radically changes this picture. Academic frameworks, open to public scrutiny and modification, now rival commercial frameworks in both quality and economic impact. Further, industry now realizes that open-source frameworks can reduce the cost and risk of systems engineering. This paper describes the Architecture for Autonomy implemented by DRDC and how this architecture meets DRDC's current needs. It also presents an argument for why this architecture should satisfy DRDC's future requirements as well.

  7. Common Misconceptions About Service-Oriented Architecture

    DTIC Science & Technology

    2007-11-01

    addition, the architect(s) must make decisions on how services are implemented. Service implementations may involve developing new software, wrapping a... legacy software system, incorporating services provided by third parties, or a combination of these options. Information about the quality attrib... temperature. However, there...

  8. NEXUS - Resilient Intelligent Middleware

    NASA Astrophysics Data System (ADS)

    Kaveh, N.; Hercock, R. Ghanea

    Service-oriented computing, a composition of distributed-object computing, component-based, and Web-based concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components makes them a suitable computing model in the pervasive domain.

  9. A Research Agenda for Service-Oriented Architecture (SOA): Maintenance and Evolution of Service-Oriented Systems

    DTIC Science & Technology

    2010-03-01

    service consumers, and infrastructure. Techniques from any iterative and incremental software development methodology followed by the organization... Service-Oriented Architecture Environment (CMU/SEI-2008-TN-008). Software Engineering Institute, Carnegie Mellon University, 2008. http://www.sei.cmu.edu... "Integrating Legacy Software into a Service Oriented Architecture." Proceedings of the 10th European Conference on Software Maintenance (CSMR 2006). Bari

  10. Interfacing with Legacy using Remote Method Invocation

    NASA Technical Reports Server (NTRS)

    Howard, Scott M.

    1998-01-01

    The assignment described was enough to make a neophyte Java developer bolt for the door: provide a remote method for use by an applet which invokes a native method that wraps a function in an existing legacy library. The purpose of the remote method is to return an instance of a class object whose contents reflect the data structure returned by the legacy function. While embroiled in implementation, I would have spent the time wading through their JNI use group archive as well, but I couldn't seem to locate one. Subsequently, I made the decision to try to document my findings in order to assist others. Before we start on the class design, let's look at what the existing legacy code does. The C function to be called, Get-Legacy-Data, consists of two steps: an ASCII file is read from the local disk and its contents are parsed into a Legacy_Type structure whose address is passed as an argument by the caller. The legacy code was compiled into a shared object library, legacy.so, using the IRIX 6.2 compiler and then loaded onto the Web server, a Silicon Graphics Indy station loaded with the IRIX 6.4 operating system. As far as the class design is concerned, the first thing required is a class to act as a template for the data structure returned by the legacy function. This class, JLegacy, declares a series of public instance variables which correspond to the members of Legacy_Type and provides a parameterless constructor. This constructor is never called, not even by the native method which allocates the object for return to the remote method. Next, the remote interface declaration for the remote object must be defined. In order for JLegacyRO to implement getJLegacy, JLegacyRO must interface with the existing legacy code through a native method, getn. getn is declared in the JLegacyRO class but implemented in C, just like the legacy code. getn returns a JLegacy instance and is declared static since its implementation is the same for all instances of the JLegacyRO class.
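
    The skeleton below is a minimal sketch of the pattern described above. Only the names JLegacy, JLegacyRO, getJLegacy, and getn come from the text; the JLegacy field names, the remote interface name (JLegacyRemote), and the JNI library name are invented for illustration, and running it would of course require the native wrapper library and an RMI registry.

    // Sketch of the described RMI + JNI pattern; several names are assumptions (see above).
    import java.io.Serializable;
    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.server.UnicastRemoteObject;

    /** Mirrors the members of the C Legacy_Type structure (fields invented here). */
    class JLegacy implements Serializable {
        public int status;          // example member
        public double measurement;  // example member
        public String label;        // example member
        public JLegacy() { }        // parameterless constructor, never called from Java
    }

    /** Remote interface (name assumed): the applet calls getJLegacy() across RMI. */
    interface JLegacyRemote extends Remote {
        JLegacy getJLegacy() throws RemoteException;
    }

    /** Remote object: delegates to a native method that wraps Get-Legacy-Data in legacy.so. */
    class JLegacyRO extends UnicastRemoteObject implements JLegacyRemote {
        static { System.loadLibrary("legacywrap"); }   // hypothetical JNI wrapper library name
        JLegacyRO() throws RemoteException { super(); }
        /** Implemented in C; allocates and fills a JLegacy instance from Legacy_Type. */
        private static native JLegacy getn();
        @Override public JLegacy getJLegacy() { return getn(); }
    }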

  11. One-Click Data Analysis Software for Science Operations

    NASA Astrophysics Data System (ADS)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a Mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), Cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - Hipe for Herschel (2009) - EXIA for EXOSAT (1983) The following goals have guided the architecture: - Support for all operations, post-operations and archive/legacy phases. - Support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon - AWS). - Support for expert users, requiring full capabilities. - Provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  12. A Software Upgrade of the NASA Aeroheating Code "MINIVER"

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce Mathew

    2013-01-01

    Computational Fluid Dynamics (CFD) is a powerful and versatile tool for simulating the fluid and thermal environments of launch and re-entry vehicles alike. Where it excels in power and accuracy, however, it lacks in speed. An alternative tool for this purpose is known as MINIVER, an aeroheating code widely used by NASA and within the aerospace industry. Capable of providing swift, reasonably accurate approximations of the fluid and thermal environment of launch vehicles, MINIVER is used where time is of the essence and accuracy need not be exact. However, MINIVER is an old, aging tool: running on a user-unfriendly, legacy command-line interface, it is difficult for it to keep pace with more modern software tools. Florida Institute of Technology was tasked with the construction of a new Graphical User Interface (GUI) that implemented the legacy version's capabilities and enhanced them with new tools and utilities. This thesis provides background to the legacy version of the program, the progression and final version of a modern user interface, and benchmarks to demonstrate its usefulness.

  13. Future Software Sizing Metrics and Estimation Challenges

    DTIC Science & Technology

    2011-07-01

    systems 4. Ultrahigh software system assurance 5. Legacy maintenance and Brownfield development 6. Agile and Lean/Kanban development. This paper...refined as the design of the maintenance modifications or Brownfield re-engineering is determined. VII. 6. AGILE AND LEAN/KANBAN DEVELOPMENT The...difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban [25]. In Kanban

  14. Legacy Systems Upgrade for Software Risk Assessment

    DTIC Science & Technology

    2001-12-01

    Nieto, T. R., Visual Basic 6: How to Program, Prentice Hall, Inc., 1999; Edson, David, Professional Development with Visio 2000: The...and the necessity for the program. Second, follow with a review of the enhancement algorithms and software extensions intended to add additional...taken prior to the programming. First, one must gain thorough understanding of the operation of VitéProject software and its interface with Visio

  15. Legacies in material flux: Structural changes before long-term studies

    Treesearch

    D.J. Bain; M.B. Green; J. Campbell; J. Chamblee; S. Chaoka; J. Fraterrigo; S. Kaushal; S. Martin; T. Jordan; T. Parolari; B. Sobczak; D. Weller; W. M. Wollheim; E. Boose; J. Duncan; G. Gettel; B. Hall; P. Kumar; J. Thompson; J. Vose; E. Elliott; D. Leigh

    2012-01-01

    Legacy effects of past land use and disturbance are increasingly recognized, yet consistent definitions of and criteria for defining them do not exist. To address this gap in biological- and ecosystem-assessment frameworks, we propose a general metric for evaluating potential legacy effects, which are computed by normalizing altered system function persistence with...

  16. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    NASA Astrophysics Data System (ADS)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.

  17. Legacy effects in material flux: structural catchment changes predate long-term studies

    Treesearch

    Daniel Bain; Mark B. Green; John L. Campbell; John F. Chamblee; Sayo Chaoka; Jennifer M. Fraterrigo; Sujay S. Kaushal; Sujay S. Kaushal; Sherry L. Martin; Thomas E. Jordan; Anthony J. Parolari; William V. Sobczak; Donald E. Weller; Wilfred M. Wolheim; Emery R. Boose; Jonathan M. Duncan; Gretchen M. Gettel; Brian R. Hall; Praveen Kumar; Jonathan R. Thompson; James M. Vose; Emily M. Elliott; David S. Leigh

    2012-01-01

    Legacy effects of past land use and disturbance are increasingly recognized, yet consistent definitions of and criteria for defining them do not exist. To address this gap in biological- and ecosystem-assessment frameworks, we propose a general metric for evaluating potential legacy effects, which are computed by normalizing altered system function persistence with...

  18. Individual legacy trees influence vertebrate wildlife diversity in commercial forests

    Treesearch

    M.J. Mazurek; William J. Zielinski

    2007-01-01

    Old-growth forests provide important structural habitat elements for many species of wildlife. These forests, however, are rare where lands are managed for timber. In commercial forests, large and old trees sometimes exist only as widely-dispersed residual or legacy trees. Legacy trees are old trees that have been spared during harvest or have survived stand-replacing...

  19. Software for Collaborative Engineering of Launch Rockets

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas Troy

    2003-01-01

    The Rocket Evaluation and Cost Integration for Propulsion and Engineering (RECIPE) software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.

  20. Cybersecurity of Critical Control Networks

    DTIC Science & Technology

    2015-07-14

    project are included below. The tasks include work in link encryption for existing legacy SCADA equipment, where we continue to develop lightweight...language for authoring and monitoring compliance of SCADA systems, including technologies for a "policy monitor" which reports out on any observance issues...Acquisition (SCADA).

  1. Green Turning Brown - Domain Engineering for Social and Health Services in Finland.

    PubMed

    Suomi, Reima; Nykänen, Pirkko; Vepsäläinen, Tapio; Hiltunen, Riina

    2017-01-01

    Being able to design information systems for an untouched domain, without the burden of existing information systems, especially legacy systems, is often seen as a dream of most information system professionals. Uncharted domains are, however, scarce, and often such greenfield projects turn into brownfield projects, that is, projects where existing structures severely constrain the development of new systems. In this article we discuss the concepts of greenfield and brownfield domain engineering and software development, and reflect on their possible messages for the re-engineering of the Finnish health and social care ecosystem currently under way. In our fieldwork we could identify a lot of need and wish for greenfield domain engineering in the Finnish health and social services delivery. We also found a lot of brownfield elements inhibiting change. Our proposal for the future is an ecosystem approach, where new and established elements could live together in a self-governed balance.

  2. CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 9

    DTIC Science & Technology

    2008-09-01

    CrossTalk: The Journal of Defense Software Engineering, September 2008 issue; among its articles is "Securing Legacy C Applications Using Dynamic Data Flow Analysis".

  3. Training survey -- educational profile for Hanford HANDI 2000 project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D.

    Fluor Daniel Hanford, Inc. (FDH) is currently adopting streamlined business processes through integrated software solutions. Replacing the legacy software (current/replacement systems, attached) also avoids significant maintenance required to resolve Year 2000 issues. This initiative is being referred to as 'HANDI 2000'. The software being implemented in the first phase of this project includes Indus International's PASSPORT Software, Peoplesoft and Primavera P3 Software. The project, which encompasses all the system replacements that will occur, has been named 'HANDI 2000.' The PASSPORT applications being implemented are Inventory Management, Purchasing, Contract Management, Accounts Payable, and MSDS (Material Safety Data Sheets).

  4. Building the Pipeline for Hubble Legacy Archive Grism data

    NASA Astrophysics Data System (ADS)

    Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Lombardi, M.; Micol, A.; Rosa, M.; Stoehr, F.; Walsh, J. R.

    2008-10-01

    The Pipeline for Hubble Legacy Archive Grism data (PHLAG) is currently being developed as an end-to-end pipeline for the Hubble Legacy Archive (HLA). The inputs to PHLAG are slitless spectroscopic HST data with only the basic calibrations from standard HST pipelines applied; the outputs are fully calibrated, Virtual Observatory-compatible spectra, which will be made available through a static HLA archive. We give an overview of the various aspects of PHLAG. The pipeline consists of several subcomponents -- data preparation, data retrieval, image combination, object detection, spectral extraction using the aXe software, quality control -- each of which is discussed in detail. As a pilot project, PHLAG is currently being applied to NICMOS G141 grism data. Examples of G141 spectra reduced with PHLAG are shown.
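
    For readers unfamiliar with staged pipelines, the sketch below shows one generic way such a chain of subcomponents can be wired together; it is purely illustrative and is not the PHLAG implementation (which is built around the aXe extraction software). The stage names and the shared-context idiom are assumptions.

    // Generic staged-pipeline skeleton; illustrative only, not the actual PHLAG code.
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    interface PipelineStage {
        /** Each stage reads and augments a shared context of intermediate products. */
        void run(Map<String, Object> context) throws Exception;
    }

    public class GrismPipelineDemo {
        public static void main(String[] args) throws Exception {
            Map<String, Object> ctx = new HashMap<>();
            ctx.put("dataset", "NICMOS-G141-example");                     // placeholder input id
            PipelineStage prepare  = c -> c.put("prepared", true);         // data preparation
            PipelineStage retrieve = c -> c.put("retrieved", true);        // data retrieval
            PipelineStage combine  = c -> c.put("combined", true);         // image combination
            PipelineStage detect   = c -> c.put("sources", 42);            // object detection (dummy)
            PipelineStage extract  = c -> c.put("spectra", "extracted");   // spectral extraction step
            PipelineStage qc       = c -> System.out.println("QC: " + c);  // quality control
            for (PipelineStage stage : List.of(prepare, retrieve, combine, detect, extract, qc)) {
                stage.run(ctx);
            }
        }
    }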

  5. BenMAP 1.0

    EPA Pesticide Factsheets

    BenMAP 1.0 is the legacy version of the BenMAP software that the EPA is making available for archival purposes. It is designed for regional and national-scale analyses within the Continental United States.

  6. BenMAP Downloads

    EPA Pesticide Factsheets

    Download the current and legacy versions of the BenMAP program. Download configuration and aggregation/pooling/valuation files to estimate benefits. BenMAP-CE is free and open source software, and the source code is available upon request.

  7. U35: Legacy Engine Final Report

    DOT National Transportation Integrated Search

    2012-07-01

    The Legacy engine is a new core technology that can be used with existing infrastructure providing for near term benefits while minimizing costs. Also, as a new technology, it will be optimized for many years to come providing the opportunity for con...

  8. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Quality and Reliability Data

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Peltier, Daryl

    2010-01-01

    This slide presentation reviews the avionics software system on board the space shuttle, with particular emphasis on its quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical shuttle systems and executes in redundant computers. Charts show the number of space shuttle flights versus time and PASS's development history, and point to the reliability of the system's development. The reliability of the system is also compared to predicted reliability.

  9. High-Surety Telemedicine in a Distributed, 'Plug-andPlan' Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Gallagher, Linda K.

    1999-05-17

    Commercial telemedicine systems are increasingly functional, incorporating video-conferencing capabilities, diagnostic peripherals, medication reminders, and patient education services. However, these systems (1) rarely utilize information architectures which allow them to be easily integrated with existing health information networks and (2) do not always protect patient confidentiality with adequate security mechanisms. Using object-oriented methods and software wrappers, we illustrate the transformation of an existing stand-alone telemedicine system into 'plug-and-play' components that function in a distributed medical information environment. We show, through the use of open standards and published component interfaces, that commercial telemedicine offerings which were once incompatible with electronic patient record systems can now share relevant data with clinical information repositories while at the same time hiding the proprietary implementations of the respective systems. Additionally, we illustrate how leading-edge technology can secure this distributed telemedicine environment, maintaining patient confidentiality and the integrity of the associated electronic medical data. Information surety technology also encourages the development of telemedicine systems that have both read and write access to electronic medical records containing patient-identifiable information. The win-win approach to telemedicine information system development preserves investments in legacy software and hardware while promoting security and interoperability in a distributed environment.

  10. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  11. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable -- such as legacy and COTS software -- and programs that use features -- such as pointers, structures, and object-orientation -- that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  12. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating-system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer-based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: in effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
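
    As a concrete, though invented, illustration of what a small semantic adapter does, the sketch below decodes a fixed-layout, big-endian legacy record into named values according to a platform-independent field specification. The field names, offsets, and message layout are assumptions for the example and are not part of AGSP.

    // Tiny metadata-driven "semantic adapter" sketch; layout and names are hypothetical.
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    /** One entry of a platform-independent field specification (the "metadata"). */
    record FieldSpec(String name, int offset, String type) { }

    public class SemanticAdapterDemo {
        /** Decode a legacy big-endian record into named values according to the metadata. */
        static Map<String, Object> decode(byte[] raw, List<FieldSpec> spec) {
            ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.BIG_ENDIAN);
            Map<String, Object> out = new LinkedHashMap<>();
            for (FieldSpec f : spec) {
                switch (f.type()) {
                    case "u16" -> out.put(f.name(), Short.toUnsignedInt(buf.getShort(f.offset())));
                    case "f32" -> out.put(f.name(), buf.getFloat(f.offset()));
                    default    -> throw new IllegalArgumentException("unknown type " + f.type());
                }
            }
            return out;
        }

        public static void main(String[] args) {
            List<FieldSpec> spec = List.of(
                new FieldSpec("frameCount", 0, "u16"),
                new FieldSpec("busVoltage", 2, "f32"));
            byte[] raw = ByteBuffer.allocate(6).order(ByteOrder.BIG_ENDIAN)
                                   .putShort((short) 1201).putFloat(28.4f).array();
            System.out.println(decode(raw, spec));  // {frameCount=1201, busVoltage=28.4}
        }
    }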

  13. Regulatory Oversight of the Legacy Gunnar Uranium Mine and Mill Site in Northern Saskatchewan, Canada - 13434

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stenson, Ron; Howard, Don

    2013-07-01

    As Canada's nuclear regulator, the Canadian Nuclear Safety Commission (CNSC) is responsible for licensing all aspects of uranium mining, including remediation activities at legacy sites. Since these sites already existed when the current legislation came into force in 2000, and the previous legislation did not apply, they present a special case. The Nuclear Safety and Control Act (NSCA) was written with cradle-to-grave oversight in mind. Applying the NSCA at the end of a facility's life-cycle poses some challenges to both the regulator and the proponent. When the proponent is the public sector, even more challenges can present themselves. Although the licensing process for legacy sites is no different than for any other CNSC license, assuring regulatory compliance can be more complicated. To demonstrate how the CNSC has approached the oversight of legacy sites, the history of the Commission's involvement with the Gunnar uranium mine and mill site provides a good case study. The lessons learned from the CNSC's experience regulating the Gunnar site will benefit those in the future who will need to regulate legacy sites under existing or new legislation. (authors)

  14. Legacy of the Space Shuttle Program

    NASA Technical Reports Server (NTRS)

    Sullivan, Steven J.

    2010-01-01

    This slide presentation reviews many of the innovations from Kennedy Space Center engineering for ground operations that were made during the shuttle program. The innovations are in the areas of detection, image analysis, protective equipment, software development and communications.

  15. Talking Back: Weapons, Warfare, and Feedback

    DTIC Science & Technology

    2010-04-01

    realize that these laws are not laws of physics. They don't allow for performance or effectiveness comparisons either as they don't have a common...the weapon's next software update. Software updates are done by physical connections like most legacy systems as well as by secure data link...Generally the land-based Air Force squadrons use physical connections due to the increased reliability, while sea-based squadrons use the wireless

  16. Aviation Proof of Concept: The Transition of Marine Corps Aviation Maintenance Computer Assets and Systems into the Navy Marine Corps Intranet

    DTIC Science & Technology

    2010-03-01

    Blackberry Services Increment 1 SLAPC: 103.8 SERVICE NAME: HELP DESK SLA: 104 Performance Category: Average Speed of Answer - Telephone Calls...legacy VPNs for existing tunnels, until MCNOSC agrees to shut down of legacy B-1's. o Shut down of legacy B-1's will transfer VPN's tunnels to

  17. Legacy Sediments in U.S. River Environments: Atrazine and Aggradation to Zinc and Zoobenthos

    NASA Astrophysics Data System (ADS)

    Wohl, E.

    2014-12-01

    Legacy sediments are those that are altered by human activities. Alterations include (i) human-caused aggradation (and subsequent erosion), such as sediment accumulating upstream from relict or contemporary dams, (ii) human-caused lack of continuing deposition that results in changing moisture and nutrient levels within existing sediments, such as on floodplains that no longer receive lateral or vertical accretion deposits because of levees, bank stabilization, and other channel engineering, and (iii) human-generated contaminants such as PCBs and pesticides that adsorb to fine sediment. Existing estimates of human alterations of river systems suggest that legacy sediments are ubiquitous. Only an estimated 2% of river miles in the United States are not affected by flow regulation that alters sediment transport, for example, and less than half of major river basins around the world are minimally altered by flow regulation. Combined with extensive but poorly documented reduction in floodplain sedimentation, as well as sediment contamination by diverse synthetic compounds, excess nutrients, and heavy metals, these national and global estimates suggest that legacy sediments now likely constitute a very abundant type of fluvial sediment. Because legacy sediments can alter river form and function for decades to centuries after the cessation of the human activity that created the legacy sediments, river management and restoration must be informed by accurate knowledge of the distribution and characteristics of legacy sediments. Geomorphologists can contribute understanding of sediment dynamics, including: the magnitude, frequency, and duration of flows that mobilize sediments with adsorbed contaminants; sites where erosion and deposition are most likely to occur under specified flow and sediment supply; residence time of sediments; and the influence of surface and subsurface water fluxes on sediment stability and geochemistry.

  18. Coupling Legacy and Contemporary Deterministic Codes to Goldsim for Probabilistic Assessments of Potential Low-Level Waste Repository Sites

    NASA Astrophysics Data System (ADS)

    Mattie, P. D.; Knowlton, R. G.; Arnold, B. W.; Tien, N.; Kuo, M.

    2006-12-01

    Sandia National Laboratories (Sandia), a U.S. Department of Energy National Laboratory, has over 30 years' experience in radioactive waste disposal and is providing assistance internationally in a number of areas relevant to the safety assessment of radioactive waste disposal systems. International technology transfer efforts are often hampered by small budgets, time schedule constraints, and a lack of experienced personnel in countries with small radioactive waste disposal programs. In an effort to surmount these difficulties, Sandia has developed a system that utilizes a combination of commercially available codes and existing legacy codes for probabilistic safety assessment modeling that facilitates the technology transfer and maximizes limited available funding. Numerous codes developed and endorsed by the United States Nuclear Regulatory Commission and codes developed and maintained by the United States Department of Energy are generally available to foreign countries after addressing import/export control and copyright requirements. From a programmatic view, it is easier to utilize existing codes than to develop new codes. From an economic perspective, it is not possible for most countries with small radioactive waste disposal programs to maintain complex software that meets the rigors of both domestic regulatory requirements and international peer review. Therefore, re-vitalization of deterministic legacy codes, as well as an adaptation of contemporary deterministic codes, provides a creditable and solid computational platform for constructing probabilistic safety assessment models. External model linkage capabilities in Goldsim and the techniques applied to facilitate this process will be presented using example applications, including Breach, Leach, and Transport-Multiple Species (BLT-MS), a U.S. NRC-sponsored code simulating release and transport of contaminants from a subsurface low-level waste disposal facility, used in a cooperative technology transfer project between Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research (INER) for the preliminary assessment of several candidate low-level waste repository sites. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
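
    The sketch below illustrates, in a generic way, how a driver can wrap a legacy deterministic code as an external model: write an input deck, run the executable, and parse the quantity of interest from its output. The executable name, file names, and output convention are hypothetical, and Goldsim's actual external-element linkage (a compiled library interface) is not shown here.

    // Generic wrapper around a legacy deterministic code; all file and program names invented.
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;

    public class LegacyCodeWrapper {
        /** Write an input deck, run the legacy code, and return the peak value it reports. */
        static double runOnce(double infiltrationRate) throws IOException, InterruptedException {
            Path input = Path.of("blt_case.inp");                     // hypothetical input deck
            Files.writeString(input, "INFILTRATION " + infiltrationRate + "\n");

            Process p = new ProcessBuilder("./legacy_transport", input.toString())
                    .redirectErrorStream(true)
                    .start();
            if (p.waitFor() != 0) throw new IOException("legacy code failed");

            // Hypothetical output convention: last line of the output file is "PEAK_CONC <value>".
            List<String> lines = Files.readAllLines(Path.of("blt_case.out"));
            String last = lines.get(lines.size() - 1);
            return Double.parseDouble(last.split("\\s+")[1]);
        }

        public static void main(String[] args) throws Exception {
            // A probabilistic driver would call runOnce() for each sampled parameter set.
            System.out.println("peak concentration = " + runOnce(0.05));
        }
    }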

  19. iVirus: facilitating new insights in viral ecology with software and community data sets imbedded in a cyberinfrastructure.

    PubMed

    Bolduc, Benjamin; Youens-Clark, Ken; Roux, Simon; Hurwitz, Bonnie L; Sullivan, Matthew B

    2017-01-01

    Microbes affect nutrient and energy transformations throughout the world's ecosystems, yet they do so under viral constraints. In complex communities, viral metagenome (virome) sequencing is transforming our ability to quantify viral diversity and impacts. Although some bottlenecks, for example, few reference genomes and nonquantitative viromics, have been overcome, the void of centralized data sets and specialized tools now prevents viromics from being broadly applied to answer fundamental ecological questions. Here we present iVirus, a community resource that leverages the CyVerse cyberinfrastructure to provide access to viromic tools and data sets. The iVirus Data Commons contains both raw and processed data from 1866 samples and 73 projects derived from global ocean expeditions, as well as existing and legacy public repositories. Through the CyVerse Discovery Environment, users can interrogate these data sets using existing analytical tools (software applications known as 'Apps') for assembly, open reading frame prediction and annotation, as well as several new Apps specifically developed for analyzing viromes. Because Apps are web based and powered by CyVerse supercomputing resources, they enable scalable analyses for a broad user base. Finally, a use-case scenario documents how to apply these advances toward new data. This growing iVirus resource should help researchers utilize viromics as yet another tool to elucidate viral roles in nature.

  20. iVirus: facilitating new insights in viral ecology with software and community data sets imbedded in a cyberinfrastructure

    PubMed Central

    Bolduc, Benjamin; Youens-Clark, Ken; Roux, Simon; Hurwitz, Bonnie L; Sullivan, Matthew B

    2017-01-01

    Microbes affect nutrient and energy transformations throughout the world's ecosystems, yet they do so under viral constraints. In complex communities, viral metagenome (virome) sequencing is transforming our ability to quantify viral diversity and impacts. Although some bottlenecks, for example, few reference genomes and nonquantitative viromics, have been overcome, the void of centralized data sets and specialized tools now prevents viromics from being broadly applied to answer fundamental ecological questions. Here we present iVirus, a community resource that leverages the CyVerse cyberinfrastructure to provide access to viromic tools and data sets. The iVirus Data Commons contains both raw and processed data from 1866 samples and 73 projects derived from global ocean expeditions, as well as existing and legacy public repositories. Through the CyVerse Discovery Environment, users can interrogate these data sets using existing analytical tools (software applications known as ‘Apps') for assembly, open reading frame prediction and annotation, as well as several new Apps specifically developed for analyzing viromes. Because Apps are web based and powered by CyVerse supercomputing resources, they enable scalable analyses for a broad user base. Finally, a use-case scenario documents how to apply these advances toward new data. This growing iVirus resource should help researchers utilize viromics as yet another tool to elucidate viral roles in nature. PMID:27420028

  1. Virtualization of Legacy Instrumentation Control Computers for Improved Reliability, Operational Life, and Management.

    PubMed

    Katz, Jonathan E

    2017-01-01

    Laboratories tend to be amenable environments for long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel for many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, it is a burdensome process to reinstall and is fraught with "gotchas" that can derail the process -- lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have running legacy instrumentation, the computer is the ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, easy to back up and easy to redeploy. I have used this multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer with all the software installed ready to control your hardware should your original computer ever be decommissioned.

  2. Operational Use of GPS Navigation for Space Shuttle Entry

    NASA Technical Reports Server (NTRS)

    Goodman, John L.; Propst, Carolyn A.

    2008-01-01

    The STS-118 flight of the Space Shuttle Endeavour was the first shuttle mission flown with three Global Positioning System (GPS) receivers in place of the three legacy Tactical Air Navigation (TACAN) units. This marked the conclusion of a 15 year effort involving procurement, missionization, integration, and flight testing of a GPS receiver and a parallel effort to formulate and implement shuttle computer software changes to support GPS. The use of GPS data from a single receiver in parallel with TACAN during entry was successfully demonstrated by the orbiters Discovery and Atlantis during four shuttle missions in 2006 and 2007. This provided the confidence needed before flying the first all GPS, no TACAN flight with Endeavour. A significant number of lessons were learned concerning the integration of a software intensive navigation unit into a legacy avionics system. These lessons have been taken into consideration during vehicle design by other flight programs, including the vehicle that will replace the Space Shuttle, Orion.

  3. Adding Processing Functionality to the Sensor Web

    NASA Astrophysics Data System (ADS)

    Stasch, Christoph; Pross, Benjamin; Jirka, Simon; Gräler, Benedikt

    2017-04-01

    The Sensor Web allows discovering, accessing and tasking different kinds of environmental sensors in the Web, ranging from simple in-situ sensors to remote sensing systems. However, (geo-)processing functionality needs to be applied to integrate data from different sensor sources and to generate higher level information products. Yet, a common standardized approach for processing sensor data in the Sensor Web is still missing and the integration differs from application to application. Standardizing not only the provision of sensor data, but also the processing facilitates sharing and re-use of processing modules, enables reproducibility of processing results, and provides a common way to integrate external scalable processing facilities or legacy software. In this presentation, we provide an overview of ongoing research projects that develop concepts for coupling standardized geoprocessing technologies with Sensor Web technologies. First, different architectures for coupling sensor data services with geoprocessing services are presented. Afterwards, OGC Web Processing Service profiles for linear regression and spatio-temporal interpolation are introduced that consume sensor data from, and upload predictions to, Sensor Observation Services. The profiles are implemented in processing services for the hydrological domain. Finally, we illustrate how the R software can be coupled with existing OGC Sensor Web and Geoprocessing Services and present an example of how a Web app can be built that allows exploring the results of environmental models in an interactive way using the R Shiny framework. All of the software presented is available as Open Source Software.
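
    As a minimal illustration of the standardized service interfaces involved, the snippet below requests the capabilities document of a Sensor Observation Service using the standard OGC key-value-pair binding (service=SOS&request=GetCapabilities). The endpoint URL is a placeholder, and a Web Processing Service would be queried analogously with service=WPS.

    // Minimal SOS GetCapabilities request over HTTP; the endpoint URL is hypothetical.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SosCapabilitiesDemo {
        public static void main(String[] args) throws Exception {
            String endpoint = "https://example.org/sos/service";        // placeholder SOS endpoint
            String url = endpoint + "?service=SOS&request=GetCapabilities";

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());

            // The XML capabilities document lists offerings, observed properties and operations.
            System.out.println("HTTP " + response.statusCode());
            System.out.println(response.body().substring(0, Math.min(200, response.body().length())));
        }
    }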

  4. Archetype-based conversion of EHR content models: pilot experience with a regional EHR system.

    PubMed

    Chen, Rong; Klein, Gunnar O; Sundvall, Erik; Karlsson, Daniel; Ahlfeldt, Hans

    2009-07-01

    Exchange of Electronic Health Record (EHR) data between systems from different suppliers is a major challenge. EHR communication based on archetype methodology has been developed by openEHR and CEN/ISO. The experience of using archetypes in deployed EHR systems is quite limited today. Currently deployed EHR systems with large user bases have their own proprietary way of representing clinical content using various models. This study was designed to investigate the feasibility of representing EHR content models from a regional EHR system as openEHR archetypes and inversely to convert archetypes to the proprietary format. The openEHR EHR Reference Model (RM) and Archetype Model (AM) specifications were used. The template model of the Cambio COSMIC, a regional EHR product from Sweden, was analyzed and compared to the openEHR RM and AM. This study was focused on the convertibility of the EHR semantic models. A semantic mapping between the openEHR RM/AM and the COSMIC template model was produced and used as the basis for developing prototype software that performs automated bi-directional conversion between openEHR archetypes and COSMIC templates. Automated bi-directional conversion between openEHR archetype format and COSMIC template format has been achieved. Several archetypes from the openEHR Clinical Knowledge Repository have been imported into COSMIC, preserving most of the structural and terminology related constraints. COSMIC templates from a large regional installation were successfully converted into the openEHR archetype format. The conversion from the COSMIC templates into archetype format preserves nearly all structural and semantic definitions of the original content models. A strategy of gradually adding archetype support to legacy EHR systems was formulated in order to allow sharing of clinical content models defined using different formats. The openEHR RM and AM are expressive enough to represent the existing clinical content models from the template based EHR system tested and legacy content models can automatically be converted to archetype format for sharing of knowledge. With some limitations, internationally available archetypes could be converted to the legacy EHR models. Archetype support can be added to legacy EHR systems in an incremental way allowing a migration path to interoperability based on standards.
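
    To make the idea of model-to-model conversion concrete, the sketch below maps a drastically simplified "template node" onto an archetype-style constraint node with occurrence constraints. The classes are invented stand-ins, not the openEHR RM/AM classes or the COSMIC template model, and a real converter also carries terminology bindings and many further constraint types.

    // Simplified template-to-archetype mapping sketch; all classes and fields are stand-ins.
    import java.util.ArrayList;
    import java.util.List;

    /** Stand-in for a node of a proprietary EHR template. */
    record TemplateNode(String id, String label, boolean mandatory, List<TemplateNode> children) { }

    /** Stand-in for an archetype-style constraint node with occurrence constraints. */
    record ConstraintNode(String nodeId, String text, int minOccurs, int maxOccurs,
                          List<ConstraintNode> children) { }

    public class TemplateToArchetypeDemo {
        static ConstraintNode convert(TemplateNode t) {
            List<ConstraintNode> kids = new ArrayList<>();
            for (TemplateNode child : t.children()) kids.add(convert(child));
            int min = t.mandatory() ? 1 : 0;             // mandatory -> occurrences {1..1}
            return new ConstraintNode("at-" + t.id(), t.label(), min, 1, kids);
        }

        public static void main(String[] args) {
            TemplateNode bp  = new TemplateNode("0004", "Systolic pressure", true, List.of());
            TemplateNode obs = new TemplateNode("0001", "Blood pressure", true, List.of(bp));
            System.out.println(convert(obs));
        }
    }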

  5. ON UPGRADING THE NUMERICS IN COMBUSTION CHEMISTRY CODES. (R824970)

    EPA Science Inventory

    A method of updating and reusing legacy FORTRAN codes for combustion simulations is presented using the DAEPACK software package. The procedure is demonstrated on two codes that come with the CHEMKIN-II package, CONP and SENKIN, for the constant-pressure batch reactor simulati...

  6. 75 FR 26906 - Connect America Fund, A National Broadband Plan for Our Future, High-Cost Universal Service Support

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-13

    ... recommending that the Commission adopt cost-cutting measures for existing voice support and create a Connect... specific common-sense reforms to cap growth and cut inefficient funding in the legacy high-cost support... other regulatory actions? 2. Specific Steps To Cut Legacy High-Cost Support 3. As discussed in more...

  7. Gene Transcription, Metabolite and Lipid Profiling in Eco-Indicator Daphnia magna Indicate Diverse Mechanisms of Toxicity by Legacy and Emerging Flame-Retardants

    EPA Science Inventory

    The use of chemical flame-retardants (FR) in consumer products has steadily increased over the last 30 years. Toxicity data exist for legacy FRs such as pentabromodiphenyl ether (pentaBDE), but less is known about effects of new formulations. To address this issue, the toxicity o...

  8. Distribution Feeder Modeling for Time-Series Simulation of Voltage Management Strategies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giraldez Miner, Julieta I; Gotseff, Peter; Nagarajan, Adarsh

    This paper presents techniques to create baseline distribution models using a utility feeder from Hawai'ian Electric Company. It describes the software-to-software conversion, steady-state, and time-series validations of a utility feeder model. It also presents a methodology to add secondary low-voltage circuit models to accurately capture the voltage at the customer meter level. This enables preparing models to perform studies that simulate how customer-sited resources integrate into legacy utility distribution system operations.

  9. Section 508 Electronic Information Accessibility Requirements for Software Development

    NASA Technical Reports Server (NTRS)

    Ellis, Rebecca

    2014-01-01

    Section 508 Subpart B 1194.21 outlines requirements for operating system and software development in order to create a product that is accessible to users with various disabilities. This portion of Section 508 contains a variety of standards to enable those using assistive technology and with visual, hearing, cognitive and motor difficulties to access all information provided in software. The focus on requirements was limited to the Microsoft Windows® operating system as it is the predominant operating system used at this center. Compliance with this portion of the requirements can be obtained by integrating the requirements into the software development cycle early and by remediating issues in legacy software if possible. There are certain circumstances with software that may arise necessitating an exemption from these requirements, such as design or engineering software using dynamically changing graphics or numbers to convey information. These exceptions can be discussed with the Section 508 Coordinator and another method of accommodation used.

  10. Development and implementation of software systems for imaging spectroscopy

    USGS Publications Warehouse

    Boardman, J.W.; Clark, R.N.; Mazer, A.S.; Biehl, L.L.; Kruse, F.A.; Torson, J.; Staenz, K.

    2006-01-01

    Specialized software systems have played a crucial role throughout the twenty-five year course of the development of the new technology of imaging spectroscopy, or hyperspectral remote sensing. By their very nature, hyperspectral data place unique and demanding requirements on the computer software used to visualize, analyze, process and interpret them. Often described as a marriage of the two technologies of reflectance spectroscopy and airborne/spaceborne remote sensing, imaging spectroscopy, in fact, produces data sets with unique qualities, unlike previous remote sensing or spectrometer data. Because of these unique spatial and spectral properties hyperspectral data are not readily processed or exploited with legacy software systems inherited from either of the two parent fields of study. This paper provides brief reviews of seven important software systems developed specifically for imaging spectroscopy.

  11. Information Loss: Exploring the Information Systems Management's Neglect Affecting Softcopy Reproduction of Heritage-Data

    ERIC Educational Resources Information Center

    Oskooie, Kamran Rezai

    2012-01-01

    This exploratory mixed methods study quantified and explored leadership interest in legacy-data conversion and information processing. Questionnaires were administered electronically to 92 individuals in design, manufacturing, and other professions from the manufacturing, processing, Internet, computing, software and technology divisions. Research…

  12. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
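
    The sketch below shows the kind of site-specific import tool the standard anticipates: parsing a small XML model description with the JDK's DOM parser and listing its variables and a gridded table. The element and attribute names here are illustrative only and do not follow the actual DAVE-ML schema, which a real import tool would target directly.

    // Reading an illustrative XML aero-model fragment with the standard JDK DOM parser.
    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class AeroModelImportDemo {
        public static void main(String[] args) throws Exception {
            String xml = """
                <aeroModel name="exampleVehicle">
                  <variable id="alpha" units="deg"/>
                  <variable id="CL" units="nd"/>
                  <table dependent="CL" independent="alpha">0.0 0.1 0.2 0.3</table>
                </aeroModel>""";

            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));

            NodeList vars = doc.getElementsByTagName("variable");
            for (int i = 0; i < vars.getLength(); i++) {
                Element v = (Element) vars.item(i);
                System.out.println("variable " + v.getAttribute("id")
                        + " [" + v.getAttribute("units") + "]");
            }
            Element table = (Element) doc.getElementsByTagName("table").item(0);
            System.out.println("table for " + table.getAttribute("dependent")
                    + ": " + table.getTextContent().trim());
        }
    }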

  13. STAR Online Meta-Data Collection Framework: Integration with the Pre-existing Controls Infrastructure

    NASA Astrophysics Data System (ADS)

    Arkhipkin, D.; Lauret, J.

    2017-10-01

    One of the STAR experiment's modular Messaging Interface and Reliable Architecture framework (MIRA) integration goals is to provide seamless and automatic connections with the existing control systems. After an initial proof of concept and operation of the MIRA system as a parallel data collection system for online use and real-time monitoring, the STAR Software and Computing group is now working on the integration of Experimental Physics and Industrial Control System (EPICS) with MIRA's interfaces. The goals of this integration are to allow functional interoperability and, later on, to replace the existing/legacy Detector Control System components at the service level. In this report, we describe the evolutionary integration process and, as an example, will discuss the EPICS Alarm Handler conversion. We review the complete upgrade procedure starting with the propagation of EPICS-originated alarm signals into MIRA, followed by the replacement of the existing operator interface based on Motif Editor and Display Manager (MEDM) with a modern, portable, web-based Alarm Handler interface. To achieve this aim, we have built an EPICS-to-MQTT [8] bridging service, and recreated the functionality of the original Alarm Handler using low-latency web messaging technologies. The integration of EPICS alarm handling into our messaging framework allowed STAR to improve the DCS alarm awareness of existing STAR DAQ and RTS services, which use MIRA as a primary source of experiment control information.
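
    The sketch below shows only the MQTT half of such a bridge, assuming the Eclipse Paho Java client (org.eclipse.paho.client.mqttv3) is on the classpath; the EPICS side is reduced to a placeholder call, whereas a real bridge would subscribe to alarm state changes via Channel Access or PV Access and forward them to publishAlarm(). Topic names, the broker address, and the PV name are invented.

    // MQTT half of an alarm-bridging service; assumes the Eclipse Paho Java client dependency.
    import org.eclipse.paho.client.mqttv3.MqttClient;
    import org.eclipse.paho.client.mqttv3.MqttException;
    import org.eclipse.paho.client.mqttv3.MqttMessage;
    import org.eclipse.paho.client.mqttv3.persist.MemoryPersistence;

    public class AlarmBridgeDemo {
        private final MqttClient client;

        AlarmBridgeDemo(String brokerUri) throws MqttException {
            client = new MqttClient(brokerUri, "alarm-bridge", new MemoryPersistence());
            client.connect();
        }

        /** Forward one alarm state change to a per-PV MQTT topic. */
        void publishAlarm(String pvName, String severity) throws MqttException {
            MqttMessage msg = new MqttMessage((pvName + " " + severity).getBytes());
            msg.setQos(1);                                    // at-least-once delivery
            client.publish("detector/alarms/" + pvName, msg);
        }

        public static void main(String[] args) throws MqttException {
            AlarmBridgeDemo bridge = new AlarmBridgeDemo("tcp://localhost:1883");
            bridge.publishAlarm("TPC:gas:pressure", "MAJOR"); // placeholder for an EPICS callback
        }
    }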

  14. The Legacy of the Teaching American History Grants: Statues on Easter Island?

    ERIC Educational Resources Information Center

    Olwell, Russell

    2007-01-01

    It is not too early to ask what legacy the Teaching American History grants will leave behind. Put another way, when all the money is spent, when all the seminars are done, when all the professional development has ended, what evidence will there be that the program ever existed? Will historians in the future look back at the evidence left behind…

  15. IRAF: Lessons for Project Longevity

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, M.

    2012-09-01

    Although sometimes derided as a product of the 80's (or more generously, as a legacy system), the fact that IRAF remains a productive work environment for many astronomers today is a testament to one of its core design principles, portability. This idea has meaning beyond a survey of platforms in use at the peak of a project's active development; for true longevity, a project must be able to weather completely unimagined OS, hardware, data, staffing and political environments. A lack of attention to the broader issues of portability, or the true lifespan of a software system (e.g. archival science may extend for years beyond a given mission, upgraded or similar instruments may be developed that require the same reduction/analysis techniques, etc) might require costly new software development instead of simple code re-use. Additionally, one under-appreciated benefit to having a long history in the community is the trust that users have established in the science results produced by a particular system. However a software system evolves architecturally, preserving this trust (and by implication, the applications themselves) is the key to continued success. In this paper, we will discuss how the system architecture has allowed IRAF to navigate the many changes in computing since it was first released. It is hoped that the lessons learned can be adopted by software systems being built today so that they too can survive long enough to one day earn the distinction of being called a legacy system.

  16. Modelling a hydropower plant with reservoir with the micropower optimisation model (HOMER)

    NASA Astrophysics Data System (ADS)

    Canales, Fausto A.; Beluco, Alexandre; Mendes, Carlos André B.

    2017-08-01

    Hydropower with water accumulation is an attractive option for hybrid systems because it helps to deal with the intermittency of renewable energy resources. The HOMER software (Legacy version) is widely used in research on such systems, but it does not include a specific option for modelling hydropower with a reservoir. This paper describes a method for modelling a hydropower plant with a reservoir in HOMER by adapting an existing procedure used for modelling pumped storage. An example with two scenarios in southern Brazil is presented to illustrate and validate the method. The results validate the method by showing a direct correspondence between an equivalent battery and the reservoir. The refilling of the reservoir, its power output as a function of flow rate, and the installed hydropower capacity are effectively simulated, indicating that an adequate representation of a hydropower plant with a reservoir is possible with HOMER.
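
    The correspondence between a reservoir and an equivalent battery can be made concrete with the standard hydropower energy relation. The sketch below illustrates only that relation, not HOMER's internal model; the function names, the efficiency value, and the example reservoir figures are assumptions.

```python
# Illustrative sketch (not HOMER's internal model): energy stored in a reservoir,
# expressed as an "equivalent battery" capacity, using the standard relation
# E = rho * g * V * H * eta.

RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def reservoir_equivalent_battery_kwh(volume_m3, head_m, efficiency=0.85):
    """Usable energy (kWh) of a reservoir with the given live volume and net head."""
    energy_joules = RHO_WATER * G * volume_m3 * head_m * efficiency
    return energy_joules / 3.6e6  # J -> kWh

def power_output_kw(flow_m3s, head_m, efficiency=0.85):
    """Instantaneous hydropower output (kW) as a function of turbined flow and net head."""
    return RHO_WATER * G * flow_m3s * head_m * efficiency / 1000.0

if __name__ == "__main__":
    # Hypothetical small reservoir: 200,000 m^3 of live storage, 25 m net head.
    print(f"Equivalent battery: {reservoir_equivalent_battery_kwh(2e5, 25.0):,.0f} kWh")
    print(f"Output at 2 m^3/s:  {power_output_kw(2.0, 25.0):,.0f} kW")
```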

  17. Mac OS X for Astronomy

    NASA Astrophysics Data System (ADS)

    Pierfederici, F.; Pirzkal, N.; Hook, R. N.

    Mac OS X is the new Unix based version of the Macintosh operating system. It combines a high performance DisplayPDF user interface with a standard BSD UNIX subsystem and provides users with simultaneous access to a broad range of applications which were not previously available on a single system such as Microsoft Office and Adobe Photoshop, as well as legacy X11-based scientific tools and packages like IRAF, SuperMongo, MIDAS, etc. The combination of a modern GUI layered on top of a familiar UNIX environment paves the way for new, more flexible and powerful astronomical tools to be developed while assuring compatibility with already existing, older programs. In this paper, we outline the strengths of the Mac OS X platform in a scientific environment, astronomy in particular, and point to the numerous astronomical software packages available for this platform; most notably the Scisoft collection which we have compiled.

  18. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.

  19. Developing CORBA-Based Distributed Scientific Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche; Kim, Chan; Lopez, Isaac

    2000-01-01

    Recent progress in distributed object technology has enabled software applications to be developed and deployed easily, such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that co-operates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, or Smalltalk, existing applications can be integrated into a new application, and hence the tasks of code rewriting and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with CORBA objects can increase the codes' reusability. For example, scientists could link their scientific applications to vintage Fortran programs, such as Partial Differential Equation (PDE) solvers, in a plug-and-play fashion. Unfortunately, a CORBA IDL-to-Fortran mapping has not been proposed, and there seems to be no direct method of generating CORBA objects from Fortran without resorting to manually written C/C++ wrappers. In this paper, we present an efficient methodology for integrating Fortran legacy programs into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into CORBA objects are discussed. Our goal is to keep the Fortran codes unmodified. The conversion-aided tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and IDL file for wrapping the Fortran code. Programmers must determine for themselves how to decompose the legacy application into several reusable components, based on the cohesion and coupling among the functions and subroutines. However, programming effort can still be greatly reduced because function headings and types have already been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of a large number of variables among several functions; the COMMON block plays a role similar to that of global variables in C. In a CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to this problem is to put the COMMON variables into the parameter list. We do not adopt this approach because it requires modifying the Fortran source code, which violates our design consideration. Instead, our approach is to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation results to the client. We have successfully tested the proposed conversion methodology based on the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet C++ and IDL syntax; for example, C++/IDL requires a tag in the structure type, while C does not. In this paper, we identify the changes to the f2c converter necessary to generate the C++ header and the IDL file directly. Our future work is to add a GUI to ease the decomposition task by simply dragging and dropping icons.
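
    To make the COMMON-block strategy concrete, the sketch below shows the kind of mapping such a conversion aid performs: a highly simplified COMMON-block description is turned into a C++ struct and a matching CORBA IDL struct. This is not the authors' tool, which is built around f2c; the function and type names are hypothetical, and real Fortran parsing is far more involved.

```python
# Minimal sketch (not the authors' tool): map a simplified Fortran COMMON block
# to a C++ struct and a CORBA IDL struct, mirroring the strategy of turning COMMON
# variables into a structure-typed attribute. Assumes declarations are already
# parsed into (type, name) pairs such as ("REAL", "U").

FORTRAN_TO_IDL = {"REAL": "float", "DOUBLE PRECISION": "double", "INTEGER": "long"}
FORTRAN_TO_CPP = {"REAL": "float", "DOUBLE PRECISION": "double", "INTEGER": "int"}

def convert_common_block(name, declarations):
    """declarations: list of (fortran_type, variable_name) pairs."""
    cpp = [f"struct {name} {{"]
    idl = [f"struct {name} {{"]
    for ftype, var in declarations:
        cpp.append(f"    {FORTRAN_TO_CPP[ftype]} {var.lower()};")
        idl.append(f"    {FORTRAN_TO_IDL[ftype]} {var.lower()};")
    cpp.append("};")
    idl.append("};")
    return "\n".join(cpp), "\n".join(idl)

if __name__ == "__main__":
    # Hypothetical COMMON /FLOW/ U, V, P, N
    cpp_code, idl_code = convert_common_block(
        "FlowCommon", [("REAL", "U"), ("REAL", "V"), ("REAL", "P"), ("INTEGER", "N")]
    )
    print(cpp_code)
    print(idl_code)
```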

  20. Preserving Long-Term Access to United States Government Documents in Legacy Digital Formats

    ERIC Educational Resources Information Center

    Woods, Kam A.

    2010-01-01

    Over the past several decades, millions of digital objects of significant scientific, economic, cultural, and historic value have been published and distributed to libraries and archives on removable media. Providing long-term access to these documents, media files, and software executables is an increasingly complex task because of dependencies…

  1. COVERT: A Framework for Finding Buffer Overflows in C Programs via Software Verification

    DTIC Science & Technology

    2010-08-01

is greater than the allocated size of B. In the case of a type-safe language or a language with runtime bounds checking (such as Java), an overflow...leads either to a (compile-time) type error or a (runtime) exception. In such languages, a buffer overflow can lead to a denial of service attack (i.e...of current and legacy software is written in unsafe languages (such as C or C++) that allow buffers to be overflowed with impunity. For reasons such as

  2. Recording the LHCb data and software dependencies

    NASA Astrophysics Data System (ADS)

    Trisovic, Ana; Couturier, Ben; Gibson, Val; Jones, Chris

    2017-10-01

    In recent years, awareness of the importance of preserving experimental data and scientific software at CERN has been rising. To support this effort, we present a novel approach to structuring the dependencies of the LHCb data and software to make them more accessible in the long-term future. In this paper, we detail the implementation of a graph database of these dependencies. We list the implications that can be deduced by mining the graph (such as searches for legacy software), with an emphasis on data preservation. Furthermore, we introduce a methodology for recreating the LHCb data, thus supporting reproducible research and data stewardship. Finally, we describe how this information is made available to users on a web portal that promotes data and analysis preservation and good practice in analysis documentation.
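
    The kind of graph mining mentioned above can be illustrated in a few lines. The sketch below models dataset-to-application dependencies as a plain in-memory graph and asks which software versions are needed only by retired datasets; the actual system uses a graph database, and all dataset and application names here are invented for illustration.

```python
# Illustrative sketch only: find software versions on which no active dataset depends,
# i.e. software that is kept solely for legacy data. Names are hypothetical.

from collections import defaultdict

edges = [
    # (dataset, application_version_used_to_produce_it)
    ("MC2012-sim", "Gauss-v45r3"),
    ("MC2012-reco", "Brunel-v43r2"),
    ("Run2-2016-reco", "Brunel-v50r1"),
    ("Run2-2016-strip", "DaVinci-v41r2"),
]
active_datasets = {"Run2-2016-reco", "Run2-2016-strip"}

used_by = defaultdict(set)
for dataset, app in edges:
    used_by[app].add(dataset)

legacy_software = sorted(
    app for app, datasets in used_by.items()
    if not datasets & active_datasets  # used only by retired datasets
)
print("Software needed only for legacy data:", legacy_software)
```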

  3. Managing MDO Software Development Projects

    NASA Technical Reports Server (NTRS)

    Townsend, J. C.; Salas, A. O.

    2002-01-01

    Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.

  4. A Transparent Translation from Legacy System Model into Common Information Model: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen

    Advances in the smart grid are pushing utilities towards better monitoring, control, and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information-exchange and data-sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems were not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and appropriately using the integrated legacy systems is even more intricate. This paper therefore develops an approach and an open-source tool for translating legacy system models into CIM format. The tool is tested on a commercial distribution management system, and simulation results demonstrate its effectiveness.
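
    As an illustration of what such a translation produces, the sketch below maps a single line record from a hypothetical legacy feeder export into a CIM RDF/XML fragment. This is not the authors' tool; the exact CIM class and attribute names and the namespace URI depend on the CIM profile in use, and the legacy record layout is an assumption.

```python
# Hedged sketch of legacy-to-CIM translation: emit a CIM RDF/XML fragment for one
# line segment. Class/attribute names and namespace depend on the CIM profile.

CIM_NS = "http://iec.ch/TC57/2013/CIM-schema-cim16#"   # assumed profile namespace
RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"

def line_to_cim(legacy_line):
    """legacy_line: dict from a hypothetical legacy export, e.g. {'id': 'L12', 'len_ft': 500, 'r_ohm': 0.11}."""
    length_m = legacy_line["len_ft"] * 0.3048          # legacy model uses feet; CIM uses metres
    return (
        f'<cim:ACLineSegment rdf:ID="{legacy_line["id"]}">\n'
        f'  <cim:IdentifiedObject.name>{legacy_line["id"]}</cim:IdentifiedObject.name>\n'
        f'  <cim:Conductor.length>{length_m:.2f}</cim:Conductor.length>\n'
        f'  <cim:ACLineSegment.r>{legacy_line["r_ohm"]}</cim:ACLineSegment.r>\n'
        f'</cim:ACLineSegment>'
    )

print(f'<rdf:RDF xmlns:cim="{CIM_NS}" xmlns:rdf="{RDF_NS}">')
print(line_to_cim({"id": "L12", "len_ft": 500, "r_ohm": 0.11}))
print("</rdf:RDF>")
```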

  5. Cadastral Database Positional Accuracy Improvement

    NASA Astrophysics Data System (ADS)

    Hashim, N. M.; Omar, A. H.; Ramli, S. N. M.; Omar, K. M.; Din, N.

    2017-10-01

    Positional Accuracy Improvement (PAI) is the process of refining the geometry of features in a geospatial dataset to improve their actual positions. The actual position relates both to the absolute position in a specific coordinate system and to the relation to neighbouring features. With the growth of spatially based technologies, especially Geographical Information Systems (GIS) and Global Navigation Satellite Systems (GNSS), a PAI campaign is inevitable, especially for legacy cadastral databases. Integrating a legacy dataset with a higher-accuracy dataset, such as GNSS observations, is a potential solution for improving the legacy dataset. However, merely merging both datasets will distort the relative geometry; the improved dataset should be further treated to minimize inherent errors and to fit it to the new, accurate dataset. The main focus of this study is to describe a method of angular-based Least Squares Adjustment (LSA) for the PAI of legacy datasets. The existing high-accuracy dataset, known as the National Digital Cadastral Database (NDCDB), is used as a benchmark to validate the results. It was found that the proposed technique is well suited to the positional accuracy improvement of legacy spatial datasets.
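
    The core computation behind an LSA-based PAI is a weighted least-squares solve for corrections to approximate coordinates. The sketch below shows only that generic machinery; it does not reproduce the paper's specific angular observation equations, and the design matrix, misclosures, and weights in the example are invented.

```python
# Generic weighted least-squares adjustment step (the machinery behind LSA-based PAI).
# Solves x_hat = (A^T P A)^{-1} A^T P l for corrections to approximate coordinates.

import numpy as np

def lsa_step(A, l, weights):
    """A: design matrix (n_obs x n_params), l: misclosure vector, weights: per-observation weights."""
    P = np.diag(weights)
    N = A.T @ P @ A                  # normal matrix
    x_hat = np.linalg.solve(N, A.T @ P @ l)
    residuals = A @ x_hat - l
    return x_hat, residuals

if __name__ == "__main__":
    # Toy example: two parameters observed by three weighted observations.
    A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    l = np.array([0.02, -0.01, 0.012])           # misclosures in metres
    w = np.array([1.0, 1.0, 4.0])                # GNSS-derived observation weighted higher
    x_hat, v = lsa_step(A, l, w)
    print("corrections:", x_hat, "residuals:", v)
```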

  6. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.

  7. Link Analysis in the Mission Planning Lab

    NASA Technical Reports Server (NTRS)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all of these processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise ratio according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats suitable for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software. The first interface is an HTML utility, developed in Visual Basic, which enhances the analysis for plume modeling and offers a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.

  8. Migration strategies for service-enabling ground control stations for unmanned systems

    NASA Astrophysics Data System (ADS)

    Kroculick, Joseph B.

    2011-06-01

    Future unmanned systems will be integrated into the Global Information Grid (GIG) and support net-centric data sharing, in which information in a domain is exposed to a wide variety of GIG stakeholders that can make use of it. Adopting a Service-Oriented Architecture (SOA) approach to package reusable UAV control station functionality into common control services provides a number of benefits, including enabling dynamic plug-and-play of components as mission requirements change, supporting information sharing across the enterprise, and integrating information from authoritative sources, such as mission planners, with the UAV control station's data model. It also allows the wider enterprise community to use the services provided by unmanned systems and improves data quality to support more effective decision-making. We explore current challenges in migrating UAV control systems that manage multiple types of vehicles to an SOA. Service-oriented analysis involves reviewing legacy systems and determining which components can be made into services. Existing UAV control stations provide audio/visual, navigation, and vehicle health and status information that is useful to C4I systems; however, many were designed as closed systems with proprietary software and hardware implementations, message formats, and specific mission requirements. An architecture analysis that reviews the legacy systems and identifies candidate services can be performed, and a phased SOA adoption approach can then be developed to improve system interoperability.

  9. Archetype-based conversion of EHR content models: pilot experience with a regional EHR system

    PubMed Central

    2009-01-01

    Background Exchange of Electronic Health Record (EHR) data between systems from different suppliers is a major challenge. EHR communication based on archetype methodology has been developed by openEHR and CEN/ISO. The experience of using archetypes in deployed EHR systems is quite limited today. Currently deployed EHR systems with large user bases have their own proprietary way of representing clinical content using various models. This study was designed to investigate the feasibility of representing EHR content models from a regional EHR system as openEHR archetypes and inversely to convert archetypes to the proprietary format. Methods The openEHR EHR Reference Model (RM) and Archetype Model (AM) specifications were used. The template model of the Cambio COSMIC, a regional EHR product from Sweden, was analyzed and compared to the openEHR RM and AM. This study was focused on the convertibility of the EHR semantic models. A semantic mapping between the openEHR RM/AM and the COSMIC template model was produced and used as the basis for developing prototype software that performs automated bi-directional conversion between openEHR archetypes and COSMIC templates. Results Automated bi-directional conversion between openEHR archetype format and COSMIC template format has been achieved. Several archetypes from the openEHR Clinical Knowledge Repository have been imported into COSMIC, preserving most of the structural and terminology related constraints. COSMIC templates from a large regional installation were successfully converted into the openEHR archetype format. The conversion from the COSMIC templates into archetype format preserves nearly all structural and semantic definitions of the original content models. A strategy of gradually adding archetype support to legacy EHR systems was formulated in order to allow sharing of clinical content models defined using different formats. Conclusion The openEHR RM and AM are expressive enough to represent the existing clinical content models from the template based EHR system tested and legacy content models can automatically be converted to archetype format for sharing of knowledge. With some limitations, internationally available archetypes could be converted to the legacy EHR models. Archetype support can be added to legacy EHR systems in an incremental way allowing a migration path to interoperability based on standards. PMID:19570196

  10. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    NASA Astrophysics Data System (ADS)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; Kalinkin, Alexander A.

    2017-02-01

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which would necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates a more incremental approach and is the culmination of several modernization efforts of the legacy code MFIX, an open-source computational fluid dynamics code that has evolved over several decades, is widely used in multiphase flows, and is still being developed by the National Energy Technology Laboratory. Two different modernization approaches, 'bottom-up' and 'top-down', are illustrated. Preliminary results show up to an 8.5x improvement at the selected kernel level with the first approach, and up to a 50% improvement in total simulated time with the second, for the demonstration cases and target HPC systems employed.

  11. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  12. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Major Accomplishments and Lessons Learned Detail Historical Timeline Analysis

    NASA Technical Reports Server (NTRS)

    Orr, James K.

    2010-01-01

    This presentation focuses on the Space Shuttle Primary Avionics Software System (PASS) and the people who developed and maintained it. One theme is to provide quantitative data on software quality and reliability over a 30-year period; the data that are consistent over this period relate to code-break discrepancies. Requirements were supplied by external sources, and requirement inspections and measurements were not implemented until later, beginning in 1985. A second theme is the people and organization of PASS. Many individuals have supported the PASS project over the entire period while transitioning from company to company and contract to contract, and major events and transitions have affected morale (both positively and negatively) across the life of the project.

  13. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy -Major Accomplishments and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Orr, James K.

    2010-01-01

    This presentation has shown the accomplishments of the PASS project over three decades and highlighted the lessons learned. Over the entire time, our goal has been to continuously improve our process, implement automation for both quality and increased productivity, and identify and remove all defects due to prior execution of a flawed process in addition to improving our processes following identification of significant process escapes. Morale and workforce instability have been issues, most significantly during 1993 to 1998 (period of consolidation in aerospace industry). The PASS project has also consulted with others, including the Software Engineering Institute, so as to be an early evaluator, adopter, and adapter of state-of-the-art software engineering innovations.

  14. Design and implementation of a general main axis controller for the ESO telescopes

    NASA Astrophysics Data System (ADS)

    Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas

    2012-09-01

    Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general purpose VMEbus electronics, and running applications that were coded by hand, mostly using the C programming language under VxWorks. As we are moving towards more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, and used the control algorithm of a standard telescope main axis as a first example. We wanted to have a clear work-flow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host, and where the testing time spent by debugging on target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools. It should be a target-independent approach so that the result could be deployed on various platforms. We have selected the Mathworks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used in order to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis controller application we have successfully integrated the result into the legacy platform of the existing VLT software, as well as demonstrated how to use the same design for a new development with a completely different environment.

  15. Regulating the path from legacy recognition, through recovery to release from regulatory control.

    PubMed

    Sneve, Malgorzata Karpow; Smith, Graham

    2015-04-01

    Past development of processes and technologies using radioactive material led to construction of many facilities worldwide. Some of these facilities were built and operated before the regulatory infrastructure was in place to ensure adequate control of radioactive material during operation and decommissioning. In other cases, controls were in place but did not meet modern standards, leading to what is now considered to have been inadequate control. Accidents and other events have occurred resulting in loss of control of radioactive material and unplanned releases to the environment. The legacy from these circumstances is that many countries have areas or facilities at which abnormal radiation conditions exist at levels that give rise to concerns about environmental and human health of potential interest to regulatory authorities. Regulation of these legacy situations is complex. This paper examines the regulatory challenges associated with such legacy management and brings forward suggestions for finding the path from: legacy recognition; implementation, as necessary, of urgent mitigation measures; development of a longer-term management strategy, through to release from regulatory control. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. A methodology based on openEHR archetypes and software agents for developing e-health applications reusing legacy systems.

    PubMed

    Cardoso de Moraes, João Luís; de Souza, Wanderley Lopes; Pires, Luís Ferreira; do Prado, Antonio Francisco

    2016-10-01

    In Pervasive Healthcare, novel information and communication technologies are applied to support the provision of health services anywhere, at anytime and to anyone. Since health systems may offer their health records in different electronic formats, the openEHR Foundation prescribes the use of archetypes for describing clinical knowledge in order to achieve semantic interoperability between these systems. Software agents have been applied to simulate human skills in some healthcare procedures. This paper presents a methodology, based on the use of openEHR archetypes and agent technology, which aims to overcome the weaknesses typically found in legacy healthcare systems, thereby adding value to the systems. This methodology was applied in the design of an agent-based system, which was used in a realistic healthcare scenario in which a medical staff meeting to prepare a cardiac surgery has been supported. We conducted experiments with this system in a distributed environment composed by three cardiology clinics and a center of cardiac surgery, all located in the city of Marília (São Paulo, Brazil). We evaluated this system according to the Technology Acceptance Model. The case study confirmed the acceptance of our agent-based system by healthcare professionals and patients, who reacted positively with respect to the usefulness of this system in particular, and with respect to task delegation to software agents in general. The case study also showed that a software agent-based interface and a tools-based alternative must be provided to the end users, which should allow them to perform the tasks themselves or to delegate these tasks to other people. A Pervasive Healthcare model requires efficient and secure information exchange between healthcare providers. The proposed methodology allows designers to build communication systems for the message exchange among heterogeneous healthcare systems, and to shift from systems that rely on informal communication of actors to a more automated and less error-prone agent-based system. Our methodology preserves significant investment of many years in the legacy systems and allows developers to extend them adding new features to these systems, by providing proactive assistance to the end-users and increasing the user mobility with an appropriate support. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. American Association of Orthodontists Foundation Craniofacial Growth Legacy Collection in the orthodontic literature-use and trends: A systematic review.

    PubMed

    Al-Jewair, Thikriat; Stellrecht, Elizabeth; Lewandowski, Lauren; Chakaki, Rania

    2018-01-01

    The American Association of Orthodontists Foundation (AAOF) Craniofacial Growth Legacy Collection is a digital repository of records from 9 craniofacial growth study collections in the United States and Canada. The purposes of this article were to describe the use of materials from the AAOF Craniofacial Growth Legacy Collection in the orthodontic literature in comparative and follow-up studies, and to analyze trends before and after the project's launch in 2009. An electronic search without date or language restriction was conducted in the following databases: PubMed, Embase, Evidence-Based Medicine Reviews, and CINAHL. Grey literature resources and the bibliographies of the selected studies were also consulted. Three independent reviewers assessed the studies for inclusion. The criteria were human subjects of any age, sex, and ethnicity; at least 1 of the 9 AAOF legacy collections used as either the main sample population or the comparison or control; and orthodontic outcomes assessed. Data were analyzed using STATA software (version 14.2; StataCorp, College Station, Tex). A total of 199 studies (127 follow-up, 72 comparative) were included. The most commonly used collection in comparative studies was the Michigan Growth sample. The number of published studies more than doubled after the AAOF Legacy Collection project testing and launch in 2009. The increase continued through 2010 to 2014, during which there was a trend to use multiple collections. The Burlington Growth collection was the most commonly used collection for follow-up studies. The overall use of the legacy collection showed a small increase in published studies after 2009. The overall numbers of published studies in the comparative and follow-up categories increased after 2009, reflecting the efforts of the AAOF team and collection curators to make the records available worldwide. Further research should consider studying each collection to identify utilization predictors. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  18. Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission

    NASA Technical Reports Server (NTRS)

    Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan

    2010-01-01

    The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and to assess retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints. The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
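
    The wrapping idea can be sketched in a few lines: expose a legacy command-line modeling component behind an HTTP endpoint so a workflow engine can chain it with the next component. This is not the OSSE implementation; the executable name, port, and byte-stream I/O convention are assumptions.

```python
# Minimal sketch of wrapping a legacy modeling component as a service (illustrative only).
# The upstream component POSTs its output (e.g. a reflectance spectrum); this service
# feeds it to a hypothetical legacy binary on stdin and returns the binary's stdout.

import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

LEGACY_EXE = "./radiative_transfer"   # hypothetical legacy Fortran/MATLAB-derived binary

class ComponentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        spectrum = self.rfile.read(int(self.headers["Content-Length"]))
        result = subprocess.run([LEGACY_EXE], input=spectrum, capture_output=True)
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.end_headers()
        self.wfile.write(result.stdout)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ComponentHandler).serve_forever()
```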

  19. ER2OWL: Generating OWL Ontology from ER Diagram

    NASA Astrophysics Data System (ADS)

    Fahad, Muhammad

    Ontologies are a fundamental part of the Semantic Web. The goal of the W3C is to bring the web to its full potential as a semantic web while reusing previous systems and artifacts. Most legacy systems have been documented using structured analysis and structured design (SASD), especially with simple or extended Entity-Relationship Diagrams (ERDs). Such systems need upgrading to become part of the Semantic Web. In this paper, we present concrete transformation rules from ERDs to OWL-DL ontologies. These rules facilitate an easy and understandable transformation from an ERD to OWL, and the rule set is tested on a structured analysis and design example. The framework provides the OWL ontologies on which the Semantic Web is founded, and it helps software engineers to upgrade the structured analysis and design artifact, the ERD, into components of the Semantic Web. Moreover, our transformation tool, ER2OWL, reduces the cost and time of building OWL ontologies by reusing existing entity-relationship models.
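
    A minimal sketch of one such rule is shown below: each ER entity becomes an owl:Class and each attribute becomes a datatype property whose domain is that entity, emitted as Turtle. This illustrates only the style of rule, not the ER2OWL rule set itself, which also covers relationships, cardinalities, and keys; the entity and attribute names are invented.

```python
# Sketch of an ERD-to-OWL rule (illustrative, not the authors' tool): entities become
# OWL classes, attributes become datatype properties with the entity as domain.

def erd_to_owl(entities, base="http://example.org/onto#"):
    """entities: dict mapping entity name -> list of attribute names (all hypothetical)."""
    lines = [
        "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
        "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
        f"@prefix : <{base}> .",
        "",
    ]
    for entity, attributes in entities.items():
        lines.append(f":{entity} a owl:Class .")
        for attr in attributes:
            lines.append(f":{attr} a owl:DatatypeProperty ; rdfs:domain :{entity} .")
    return "\n".join(lines)

print(erd_to_owl({"Student": ["studentId", "name"], "Course": ["courseCode", "title"]}))
```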

  20. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method; rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to note the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state-explosion problem and to become usable enough for small-satellite software engineers to apply them regularly to software verification. Lastly, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or that are produced after launch through the effects of ionizing radiation.

  1. Model-Based Development of Automotive Electronic Climate Control Software

    NASA Astrophysics Data System (ADS)

    Kakade, Rupesh; Murugesan, Mohan; Perugu, Bhupal; Nair, Mohanan

    With the increasing complexity of software in today's products, writing and maintaining thousands of lines of code by hand is a tedious task, and an alternative methodology must be employed. Model-based development is one candidate that offers several benefits and allows engineers to focus on their domain of expertise rather than on writing large amounts of code. In this paper, we discuss the application of model-based development to vehicle electronic climate control software. A back-to-back testing approach is presented that ensures a flawless and smooth transition from legacy designs to model-based development. The use of the Simulink Report Generator to create design documents from the models is presented, along with its use to run the simulation model and capture the results in the test report. Test automation with a model-based development tool, which supports the use of a unique set of test cases across several testing levels and a test procedure that is independent of the software and hardware platform, is also presented.

  2. Three Years of Global Positioning System Experience on International Space Station

    NASA Technical Reports Server (NTRS)

    Gomez, Susan

    2005-01-01

    The International Space Station global positioning system (GPS) receiver was activated in April 2002. Since that time, numerous software anomalies surfaced that had to be worked around. Some of the software problems required waivers, such as the time function, while others required extensive operator intervention, such as numerous power cycles. Eventually, enough anomalies surfaced that the three pieces of code included in the GPS unit were re-written and the GPS units were upgraded. The technical aspects of the problems are discussed, as well as the underlying causes that led to the delivery of a product that has had numerous problems. The technical aspects of the problems included physical phenomena that were not well understood, such as the effect that the ionosphere would have on the GPS measurements. The underlying causes were traced to inappropriate use of legacy software, changing requirements, inadequate software processes, unrealistic schedules, incorrect contract type, and unclear ownership responsibilities.

  3. Three Years of Global Positioning System Experience on International Space Station

    NASA Technical Reports Server (NTRS)

    Gomez, Susan

    2006-01-01

    The International Space Station global positioning system (GPS) receiver was activated in April 2002. Since that time, numerous software anomalies surfaced that had to be worked around. Some of the software problems required waivers, such as the time function, while others required extensive operator intervention, such as numerous power cycles. Eventually, enough anomalies surfaced that the three pieces of code included in the GPS unit were re-written and the GPS units upgraded. The technical aspects of the problems are discussed, as well as the underlying causes that led to the delivery of a product that has had so many problems. The technical aspects of the problems included physical phenomena that were not well understood, such as the effect that the ionosphere would have on the GPS measurements. The underlying causes were traced to inappropriate use of legacy software, changing requirements, inadequate software processes, unrealistic schedules, incorrect contract type, and unclear ownership responsibilities.

  4. Scalable computing for evolutionary genomics.

    PubMed

    Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert

    2012-01-01

    Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can be deployed flexibly on multiple computers, in a local network (e.g., on existing desktop PCs), and even in the cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel by running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster and pipeline in a few steps, allowing researchers to scale up computations from their desktop, using available hardware, whenever required. BioNode is based on Debian Linux and can run on networked PCs and in the cloud. Over 200 bioinformatics and statistical software packages of interest to evolutionary biology are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open-source bioinformatics software through one central project, BioNode encourages creating free and open-source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the cloud. Alongside the downloadable BioNode images, we provide online tutorials that enable bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives on creating and building such images.
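
    A minimal local sketch of the "poor man's parallelization" idea is shown below: the whole legacy program is run once per input file as an independent process, with no changes to the program itself. On a real cluster this would normally be handed to a job scheduler; the BLAST invocation and file paths here are illustrative only.

```python
# "Poor man's parallelization" sketch: run a whole legacy program once per input file,
# as independent processes, instead of modifying the program itself.

import glob
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_one(fasta_file):
    out_file = fasta_file + ".blast.out"
    subprocess.run(
        ["blastp", "-query", fasta_file, "-db", "nr", "-out", out_file],
        check=True,
    )
    return out_file

if __name__ == "__main__":
    inputs = sorted(glob.glob("sequences/*.fasta"))
    with ProcessPoolExecutor(max_workers=4) as pool:
        for result in pool.map(run_one, inputs):
            print("finished", result)
```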

  5. Four aspects to make science open "by design" and not as an after-thought.

    PubMed

    Halchenko, Yaroslav O; Hanke, Michael

    2015-01-01

    Unrestricted dissemination of methodological developments in neuroimaging became the propelling force in advancing our understanding of brain function. However, despite such a rich legacy, it remains not uncommon to encounter software and datasets that are distributed under unnecessarily restricted terms, or that violate terms of third-party products (software or data). With this brief correspondence we would like to recapitulate four important aspects of scientific research practice, which should be taken into consideration as early as possible in the course of any project. Keeping these in check will help neuroimaging to stay at the forefront of the open science movement.

  6. Integrated Software Systems for Crew Management During Extravehicular Activity in Planetary Terrain Exploration

    NASA Technical Reports Server (NTRS)

    Kuznetz, Lawrence; Nguen, Dan; Jones, Jeffrey; Lee, Pascal; Merrell, Ronald; Rafiq, Azhar

    2008-01-01

    Initial planetary explorations under the Apollo program had a veritable ground support army monitoring the safety and health of the 12 astronauts who performed lunar surface extravehicular activities (EVAs). Given the distances involved, this will not be possible on Mars: a spacesuit for Mars must be smart enough to replace that army. The next generation of suits can do so using two software systems serving as virtual companions, LEGACI (Life support, Exploration Guidance Algorithm and Consumable Interrogator) and VIOLET (Voice Initiated Operator for Life support and Exploration Tracking). The system presented in this study integrates data inputs from a suite of sensors into the MIII suit's communications, avionics, and informatics hardware for distribution to remote managers and data analysis. If successful, the system has application not only for Mars but also for nearer-term missions to the Moon and for the next generation of suits used on the ISS. Field tests were conducted to assess capabilities for next-generation spacesuits at Johnson Space Center (JSC) and at a Mars and lunar analog site (Devon Island, Canada). LEGACI integrates data inputs from a suite of noninvasive biosensors in the suit and on the astronaut (heart rate, suit inlet/outlet LCG temperature and flow rate, suit outlet gas and dewpoint temperature, pCO2, suit O2 pressure, state vector (accelerometry), and others). In the Integrated Walkback Suit Tests held at NASA JSC and the HMP tests at Devon Island, communication and informatics capabilities were tested, including routing by satellite from the suit at Devon Island to JSC in Houston via secure servers at VCU in Richmond, VA. Results: the inputs from all the sensors enable LEGACI to compute multiple independent assessments of metabolic rate, from which a "best" rate is chosen using statistical methods. From this rate, detailed information about the suit, crew, and EVA performance can be derived using test-derived algorithms. VIOLET gives LEGACI voice-activation capability, allowing the crew to query the suit and receive feedback and alerts that lead to corrective action. LEGACI and VIOLET can also automatically control the astronaut's cooling and consumable use rate without crew input, if desired. These findings suggest that noninvasive physiological and environmental sensors, supported by data analysis, can allow more effective management of mission task performance during EVA. An integrated remote and local view of the data metrics allows a crewmember to receive real-time feedback in sync with mission control, preventing performance shortfalls during EVAs on exploration missions.
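
    The abstract does not spell out the statistical method used to pick the "best" metabolic rate, so the sketch below shows just one plausible approach: inverse-variance weighting of the independent estimates, so that noisier estimators contribute less. All names and numbers are illustrative.

```python
# One plausible way (not necessarily LEGACI's) to fuse several independent metabolic-rate
# estimates into a single "best" value: inverse-variance weighting.

def fuse_estimates(estimates):
    """estimates: list of (value_watts, variance) pairs from independent sensor-derived methods."""
    weights = [1.0 / var for _, var in estimates]
    best = sum(w * v for (v, _), w in zip(estimates, weights)) / sum(weights)
    fused_variance = 1.0 / sum(weights)
    return best, fused_variance

if __name__ == "__main__":
    # e.g. heart-rate-based, LCG heat-removal-based, and CO2-production-based estimates
    readings = [(420.0, 60.0**2), (390.0, 35.0**2), (405.0, 45.0**2)]
    best, var = fuse_estimates(readings)
    print(f"best metabolic rate: {best:.0f} W (sd ~ {var**0.5:.0f} W)")
```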

  7. A Mobile Asset Tracking System Architecture under Mobile-Stationary Co-Existing WSNs

    PubMed Central

    Kim, Tae Hyon; Jo, Hyeong Gon; Lee, Jae Shin; Kang, Soon Ju

    2012-01-01

    The tracking of multiple wireless mobile nodes is not easy with current legacy WSN technologies, due to their inherent technical complexity, especially when heavy traffic and frequent movement of mobile nodes are encountered. To enable mobile asset tracking under these legacy WSN systems, it is necessary to design a specific system architecture that can manage numerous mobile nodes attached to mobile assets. In this paper, we present a practical system architecture including a communication protocol, a three-tier network, and server-side middleware for mobile asset tracking in legacy WSNs consisting of mobile-stationary co-existing infrastructures, and we prove the functionality of this architecture through careful evaluation in a test bed. Evaluation was carried out in a microwave anechoic chamber as well as on a straight road near our office. We evaluated communication mobility performance between mobile and stationary nodes, location-awareness performance, system stability under numerous mobile node conditions, and the successful packet transfer rate according to the speed of the mobile nodes. The results indicate that the proposed architecture is sufficiently robust for application in realistic mobile asset tracking services that require a large number of mobile nodes. PMID:23242277

  8. The Legacy Archive for Microwave Background Data Analysis (LAMBDA)

    NASA Astrophysics Data System (ADS)

    Miller, Nathan; LAMBDA

    2018-01-01

    The Legacy Archive for Microwave Background Data Analysis (LAMBDA) provides CMB researchers with archival data from cosmology missions, software tools, and links to other sites of interest. LAMBDA is one-stop shopping for CMB researchers: it hosts data from WMAP along with many suborbital experiments. Over the past year, LAMBDA has acquired new data from SPTpol, SPIDER, and ACTPol. In addition to the primary CMB, LAMBDA also provides foreground data. LAMBDA has several ongoing efforts to provide tools for CMB researchers, including a web interface for CAMB and a web interface for a CMB survey footprint database and plotting tool. Additionally, we have recently developed a Docker container with standard CMB analysis tools and demonstrations in the form of Jupyter notebooks. These containers will be publicly available through Docker's container repository, and the source will be available on GitHub.

  9. Mitigating PQ Problems in Legacy Data Centers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilinets, Boris; /SLAC

    2011-06-01

    The conclusions of this presentation are: (1) problems with PQ in legacy data centers still exist and need to be mitigated; (2) harmonics generated by non-linear IT loads can be lowered by passive, active, and hybrid cancellation methods; (3) a harmonic study is necessary to find the best way to treat PQ problems; (4) AHFs and harmonic cancellation transformers proved to be very efficient in mitigating PQ problems; and (5) it is important that IT leaders partner with electrical engineering on appropriate ROI statements justifying many of these expenditures.

  10. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE PAGES

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha; ...

    2017-03-20

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates using a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, ‘bottom-up’ and ‘top-down’, are illustrated. Here, preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter were achieved for the demonstration cases and target HPC systems employed.

  11. Modernization and optimization of a legacy open-source CFD code for high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gel, Aytekin; Hu, Jonathan; Ould-Ahmed-Vall, ElMoustapha

    Legacy codes remain a crucial element of today's simulation-based engineering ecosystem due to the extensive validation process and investment in such software. The rapid evolution of high-performance computing architectures necessitates the modernization of these codes. One approach to modernization is a complete overhaul of the code. However, this could require extensive investments, such as rewriting in modern languages, new data constructs, etc., which will necessitate systematic verification and validation to re-establish the credibility of the computational models. The current study advocates using a more incremental approach and is a culmination of several modernization efforts of the legacy code MFIX, which is an open-source computational fluid dynamics code that has evolved over several decades, widely used in multiphase flows and still being developed by the National Energy Technology Laboratory. Two different modernization approaches, ‘bottom-up’ and ‘top-down’, are illustrated. Here, preliminary results show up to 8.5x improvement at the selected kernel level with the first approach, and up to 50% improvement in total simulated time with the latter were achieved for the demonstration cases and target HPC systems employed.

  12. Catchment Legacies and Time Lags: A Parsimonious Watershed Model to Predict the Effects of Legacy Storage on Nitrogen Export

    PubMed Central

    Van Meter, Kimberly J.; Basu, Nandita B.

    2015-01-01

    Nutrient legacies in anthropogenic landscapes, accumulated over decades of fertilizer application, lead to time lags between implementation of conservation measures and improvements in water quality. Quantification of such time lags has remained difficult, however, due to an incomplete understanding of controls on nutrient depletion trajectories after changes in land-use or management practices. In this study, we have developed a parsimonious watershed model for quantifying catchment-scale time lags based on both soil nutrient accumulations (biogeochemical legacy) and groundwater travel time distributions (hydrologic legacy). The model accurately predicted the time lags observed in an Iowa watershed that had undergone a 41% conversion of area from row crop to native prairie. We explored the time scales of change for stream nutrient concentrations as a function of both natural and anthropogenic controls, from topography to spatial patterns of land-use change. Our results demonstrate that the existence of biogeochemical nutrient legacies increases time lags beyond those due to hydrologic legacy alone. In addition, we show that the maximum concentration reduction benefits vary according to the spatial pattern of intervention, with preferential conversion of land parcels having the shortest catchment-scale travel times providing proportionally greater concentration reductions as well as faster response times. In contrast, a random pattern of conversion results in a 1:1 relationship between percent land conversion and percent concentration reduction, irrespective of denitrification rates within the landscape. Our modeling framework allows for the quantification of tradeoffs between costs associated with implementation of conservation measures and the time needed to see the desired concentration reductions, making it of great value to decision makers regarding optimal implementation of watershed conservation measures. PMID:25985290
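
    As a rough illustration of the hydrologic-legacy idea (not the authors' model), the sketch below convolves annual nutrient inputs with an exponential groundwater travel-time distribution so that a reduction in surface inputs reaches the stream only after a lag. The loads and mean travel times are hypothetical, and the biogeochemical (soil) legacy is omitted.

```python
import numpy as np

def stream_load(annual_inputs, mean_travel_time_yr):
    """Catchment output as the convolution of annual inputs with an
    exponential groundwater travel-time distribution (hydrologic legacy only)."""
    n = len(annual_inputs)
    lags = np.arange(n)
    ttd = np.exp(-lags / mean_travel_time_yr)
    ttd /= ttd.sum()                                 # normalise the discrete distribution
    return np.convolve(annual_inputs, ttd)[:n]

years = np.arange(1960, 2041)
inputs = np.where(years < 2000, 100.0, 60.0)         # hypothetical 40% input cut in 2000

for tau in (5.0, 25.0):                              # short vs long mean travel time
    load = stream_load(inputs, tau)
    snapshot = {yr: load[years == yr][0] for yr in (1999, 2005, 2020, 2040)}
    print(f"mean travel time {tau:>4.0f} yr:",
          "  ".join(f"{yr}: {val:5.1f}" for yr, val in snapshot.items()))
```

    Running the sketch shows the stream load approaching the new, lower level quickly for the short travel time and only slowly for the long one, which is the time-lag behaviour the abstract describes.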

  13. Catchment legacies and time lags: a parsimonious watershed model to predict the effects of legacy storage on nitrogen export.

    PubMed

    Van Meter, Kimberly J; Basu, Nandita B

    2015-01-01

    Nutrient legacies in anthropogenic landscapes, accumulated over decades of fertilizer application, lead to time lags between implementation of conservation measures and improvements in water quality. Quantification of such time lags has remained difficult, however, due to an incomplete understanding of controls on nutrient depletion trajectories after changes in land-use or management practices. In this study, we have developed a parsimonious watershed model for quantifying catchment-scale time lags based on both soil nutrient accumulations (biogeochemical legacy) and groundwater travel time distributions (hydrologic legacy). The model accurately predicted the time lags observed in an Iowa watershed that had undergone a 41% conversion of area from row crop to native prairie. We explored the time scales of change for stream nutrient concentrations as a function of both natural and anthropogenic controls, from topography to spatial patterns of land-use change. Our results demonstrate that the existence of biogeochemical nutrient legacies increases time lags beyond those due to hydrologic legacy alone. In addition, we show that the maximum concentration reduction benefits vary according to the spatial pattern of intervention, with preferential conversion of land parcels having the shortest catchment-scale travel times providing proportionally greater concentration reductions as well as faster response times. In contrast, a random pattern of conversion results in a 1:1 relationship between percent land conversion and percent concentration reduction, irrespective of denitrification rates within the landscape. Our modeling framework allows for the quantification of tradeoffs between costs associated with implementation of conservation measures and the time needed to see the desired concentration reductions, making it of great value to decision makers regarding optimal implementation of watershed conservation measures.

  14. Modular Software Interfaces for Revolutionary Flexibility in Space Operations

    NASA Technical Reports Server (NTRS)

    Glass, Brian; Braham, Stephen; Pollack, Jay

    2005-01-01

    To make revolutionary improvements in exploration, space systems need to be flexible, real-time reconfigurable, and able to trade data transparently among themselves and mission operations. Onboard operations systems, space assembly coordination and EVA systems in exploration and construction all require real-time modular reconfigurability and data sharing. But NASA's current exploration systems are still largely legacies from hastily-developed, one-off Apollo-era practices. Today's rovers, vehicles, spacesuits, space stations, and instruments are not able to plug-and-play, Lego-like, into different combinations. Point-to-point integration dominates: individual suit to individual vehicle, individual instrument to rover. All are locally optimized, all unique, and each data interface has been recoded for each possible combination. This will be an operations and maintenance nightmare in the much larger Project Constellation system of systems. This legacy approach does not scale to the hundreds of networked space components needed for space construction and for new, space-based approaches to Earth-Moon operations. By comparison, battlefield information management systems, which are considered critical to military force projection, have long since abandoned a point-to-point approach to systems integration. From a system-of-systems viewpoint, a clean-sheet redesign of the interfaces of all exploration systems is a necessary prerequisite before designing the interfaces of the individual exploration systems. Existing communications, Global Information Grid, and middleware technologies are probably sufficient for command and control and information interfaces, with some hardware and time-delay modifications for space environments. NASA's future advanced space operations must also be information and data compatible with aerospace operations and surveillance systems being developed by other US Government agencies such as the Department of Homeland Security, Federal Aviation Administration and Department of Defense. This paper discusses fundamental system-of-systems infrastructure: approaches and architectures for modular plug-and-play software interfaces for revolutionary improvements in flexibility, modularity, robustness, ease of maintenance, reconfigurability, safety and productivity. Starting with middleware, databases, and mobile communications technologies, our technical challenges will be to apply these ideas to the requirements of constellations of space systems and to implement them initially on prototype space hardware. This is necessary to demonstrate an integrated information sharing architecture and services. It is a bottom-up approach, one that solves the problem of space operations data integration. Exploration demands uniform software mechanisms for application information interchange, and the corresponding uniformly available software services to enhance these mechanisms. We will examine the issues in plug-and-play, real-time-configurable systems, including common definition and management and tracking of data and information among many different space systems. Different field test approaches are discussed, including the use of the International Space Station and terrestrial analog mission operations at field sites.

  15. Summary of Research 2000, Department of Systems Management

    DTIC Science & Technology

    2001-12-01

    Postgraduate School, June 2000. Fryzlewicz, J., "Analysis of Measures of Performance and Continuous Improvement at the Naval Dental Center Pearl Harbor," Masters...mart driven relational system. Fourth, using the prototype relational data mart as a source system, a contemporary OLAP application is used to prove the...warehouse solution to integrating legacy systems are discussed. DoD KEY TECHNOLOGY AREA: Computing and Software KEYWORDS: OLAP , Data Warehouse

  16. Enabling data science in the Gaia mission archive: The present-day mass function and age distribution

    NASA Astrophysics Data System (ADS)

    Tapiador, D.; Berihuete, A.; Sarro, L. M.; Julbe, F.; Huedo, E.

    2017-04-01

    Recent advances in large scale computing architectures enable new opportunities to extract value out of the vast amounts of data being currently generated. However, their successful adoption is not straightforward in areas like science, as there are still some barriers that need to be overcome. Those comprise (i) the existence of legacy code that needs to be ported, (ii) the lack of high-level and use case specific frameworks that facilitate a smoother transition, or (iii) the scarcity of profiles with the balanced skill sets between the technological and scientific domains. The European Space Agency's Gaia mission will create the largest and most precise three dimensional chart of our galaxy (the Milky Way), providing unprecedented position, parallax and proper motion measurements for about one billion stars. The successful exploitation of this data archive will depend on the ability to offer the proper infrastructure upon which scientists will be able to do exploration and modelling with this huge data set. In this paper, we present and contextualize these challenges by building two probabilistic models using Hierarchical Bayesian Modelling. These models represent a key challenge in astronomy and are of paramount importance for the Gaia mission itself. Moreover, we approach the implementation by leveraging a generic distributed processing engine through an existing software package for Markov chain Monte Carlo sampling. The two computationally intensive models are then validated with simulated data in different scenarios under specific restrictions, and their performance is assessed to prove their scalability. We argue that this approach will not only serve for the models in hand but also for exemplifying how to address similar problems in science, which may need to both scale to bigger data sets and reuse existing software as much as possible. This will lead to shorter time to science in massive data archives.
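
    The abstract's hierarchical Bayesian models and distributed sampler are not reproduced here. As a minimal illustration of the Markov chain Monte Carlo kernel that such a processing engine would evaluate many times in parallel, the sketch below runs a random-walk Metropolis sampler on a toy two-parameter model with synthetic data; the model and priors are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=5.0, scale=2.0, size=500)    # synthetic stand-in for a derived quantity

def log_posterior(mu, log_sigma):
    """Normal likelihood with broad priors (toy model, not the Gaia models)."""
    sigma = np.exp(log_sigma)
    log_lik = np.sum(-0.5 * ((data - mu) / sigma) ** 2 - np.log(sigma))
    log_prior = -0.5 * (mu / 100.0) ** 2 - 0.5 * (log_sigma / 10.0) ** 2
    return log_lik + log_prior

# Random-walk Metropolis: the kind of kernel a distributed sampler would run
# for many chains (or data partitions) in parallel.
theta = np.array([0.0, 0.0])                       # (mu, log_sigma)
current_lp = log_posterior(*theta)
samples = []
for _ in range(20_000):
    proposal = theta + rng.normal(scale=0.05, size=2)
    proposal_lp = log_posterior(*proposal)
    if np.log(rng.uniform()) < proposal_lp - current_lp:
        theta, current_lp = proposal, proposal_lp
    samples.append(theta.copy())

samples = np.array(samples[5_000:])                # drop burn-in
print("posterior mean of mu   :", samples[:, 0].mean())
print("posterior mean of sigma:", np.exp(samples[:, 1]).mean())
```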

  17. Implicit Particle Filter for Power System State Estimation with Large Scale Renewable Power Integration.

    NASA Astrophysics Data System (ADS)

    Uzunoglu, B.; Hussaini, Y.

    2017-12-01

    The implicit particle filter is a sequential Monte Carlo method for data assimilation that guides the particles to high-probability regions by an implicit step. It optimizes a nonlinear cost function which can be inherited from legacy assimilation routines. Dynamic state estimation for near-real-time applications in power systems is becoming increasingly important with the integration of variable wind and solar power generation. New advanced state estimation tools that will replace the old generation of state estimators should, in addition to providing a general framework for these complexities, be able to address the legacy software and integrate the old software within a mathematical framework, accommodating the power industry's need for a cautious, evolutionary change rather than a complete revolutionary approach, while addressing nonlinearity and non-normal behaviour. This work implements the implicit particle filter as a state estimation tool for the states of a power system and presents the first implicit particle filter application study on power system state estimation. The implicit particle filter is introduced into power systems and simulations are presented for a three-node benchmark power system. The performance of the filter on the presented problem is analyzed and the results are presented.
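
    For readers unfamiliar with sequential Monte Carlo state estimation, the sketch below implements a plain bootstrap particle filter on a classic univariate nonlinear benchmark. It is not a power-system model and not the implicit variant described in the abstract, which additionally moves each particle toward the high-probability region via an optimization step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Classic univariate nonlinear benchmark often used for particle filters:
# nonlinear dynamics and a quadratic observation.
def f(x, t):
    return 0.5 * x + 25.0 * x / (1.0 + x ** 2) + 8.0 * np.cos(1.2 * t)

def h(x):
    return x ** 2 / 20.0

T, n_particles = 60, 5000
q_sd, r_sd = np.sqrt(10.0), 1.0                     # process / measurement noise std dev

# Simulate the "true" state and noisy measurements.
x, x_true, z_obs = 0.1, [], []
for t in range(T):
    x = f(x, t) + rng.normal(scale=q_sd)
    x_true.append(x)
    z_obs.append(h(x) + rng.normal(scale=r_sd))

# Bootstrap particle filter: propagate, weight by measurement likelihood, resample.
particles = rng.normal(scale=2.0, size=n_particles)
estimates = []
for t, z in enumerate(z_obs):
    particles = f(particles, t) + rng.normal(scale=q_sd, size=n_particles)
    log_w = -0.5 * ((z - h(particles)) / r_sd) ** 2
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    particles = particles[rng.choice(n_particles, size=n_particles, p=w)]
    estimates.append(particles.mean())

rmse = np.sqrt(np.mean((np.array(estimates) - np.array(x_true)) ** 2))
print(f"bootstrap particle filter RMSE over {T} steps: {rmse:.2f}")
```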

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadayappan, Ponnuswamy

    Exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today's machines. Systems software for exascale machines must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. We propose a new approach to the data and work distribution model provided by system software based on the unifying formalism of an abstract file system. The proposed hierarchical data model provides simple, familiar visibility and access to data structures through the file system hierarchy, while providing fault tolerance through selective redundancy. The hierarchical task model features work queues whose form and organization are represented as file system objects. Data and work are both first class entities. By exposing the relationships between data and work to the runtime system, information is available to optimize execution time and provide fault tolerance. The data distribution scheme provides replication (where desirable and possible) for fault tolerance and efficiency, and it is hierarchical to make it possible to take advantage of locality. The user, tools, and applications, including legacy applications, can interface with the data, work queues, and one another through the abstract file model. This runtime environment will provide multiple interfaces to support traditional Message Passing Interface applications, languages developed under DARPA's High Productivity Computing Systems program, as well as other, experimental programming models. We will validate our runtime system with pilot codes on existing platforms and will use simulation to validate for exascale-class platforms. In this final report, we summarize research results from the work done at the Ohio State University towards the larger goals of the project listed above.

  19. Integration of an open interface PC scene generator using COTS DVI converter hardware

    NASA Astrophysics Data System (ADS)

    Nordland, Todd; Lyles, Patrick; Schultz, Bret

    2006-05-01

    Commercial-Off-The-Shelf (COTS) personal computer (PC) hardware is increasingly capable of computing high dynamic range (HDR) scenes for military sensor testing at high frame rates. New electro-optical and infrared (EO/IR) scene projectors feature electrical interfaces that can accept the DVI output of these PC systems. However, military Hardware-in-the-loop (HWIL) facilities such as those at the US Army Aviation and Missile Research Development and Engineering Center (AMRDEC) utilize a sizeable inventory of existing projection systems that were designed to use the Silicon Graphics Incorporated (SGI) digital video port (DVP, also known as DVP2 or DD02) interface. To mate the new DVI-based scene generation systems to these legacy projection systems, CG2 Inc., a Quantum3D Company (CG2), has developed a DVI-to-DVP converter called Delta DVP. This device takes progressive scan DVI input, converts it to digital parallel data, and combines and routes color components to derive a 16-bit wide luminance channel replicated on a DVP output interface. The HWIL Functional Area of AMRDEC has developed a suite of modular software to perform deterministic real-time, wave band-specific rendering of sensor scenes, leveraging the features of commodity graphics hardware and open source software. Together, these technologies enable sensor simulation and test facilities to integrate scene generation and projection components with diverse pedigrees.
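
    The exact bit routing performed by the Delta DVP is defined by the hardware; purely as an illustrative sketch, the code below packs two 8-bit colour components of a frame into a 16-bit luminance word under an assumed red-high/green-low mapping.

```python
import numpy as np

def pack_luminance16(rgb_frame):
    """Combine two 8-bit colour components of a frame into one 16-bit
    luminance word per pixel.

    The mapping (red = high byte, green = low byte) is an assumed, illustrative
    scheme; the actual Delta DVP component routing is fixed in hardware.
    """
    red = rgb_frame[..., 0].astype(np.uint16)
    green = rgb_frame[..., 1].astype(np.uint16)
    return (red << 8) | green

# Hypothetical 4x4 test pattern, 8 bits per colour channel.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[..., 0] = 0x12            # high byte of the desired luminance
frame[..., 1] = 0x34            # low byte
lum = pack_luminance16(frame)
print(hex(int(lum[0, 0])))      # 0x1234, one 16-bit luminance sample
```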

  20. The Offline Software Framework of the NA61/SHINE Experiment

    NASA Astrophysics Data System (ADS)

    Sipos, Roland; Laszlo, Andras; Marcinek, Antoni; Paul, Tom; Szuba, Marek; Unger, Michael; Veberic, Darko; Wyszynski, Oskar

    2012-12-01

    NA61/SHINE (SHINE = SPS Heavy Ion and Neutrino Experiment) is an experiment at the CERN SPS using the upgraded NA49 hadron spectrometer. Among its physics goals are precise hadron production measurements for improving calculations of the neutrino beam flux in the T2K neutrino oscillation experiment as well as for more reliable simulations of cosmic-ray air showers. Moreover, p+p, p+Pb and nucleus+nucleus collisions will be studied extensively to allow for a study of the properties of the onset of deconfinement and a search for the critical point of strongly interacting matter. Currently NA61/SHINE uses the old NA49 software framework for reconstruction, simulation and data analysis. The core of this legacy framework was developed in the early 1990s. It is written in different programming and scripting languages (C, pgi-Fortran, shell) and provides several concurrent data formats for the event data model, which also includes obsolete parts. In this contribution we will introduce the new software framework, called Shine, that is written in C++ and designed to comprise three principal parts: a collection of processing modules which can be assembled and sequenced by the user via XML files, an event data model which contains all simulation and reconstruction information based on STL and ROOT streaming, and a detector description which provides data on the configuration and state of the experiment. To assure a quick migration to the Shine framework, wrappers were introduced that allow legacy code parts to be run as modules in the new framework, and we will present first results on the cross-validation of the two frameworks.
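
    The Shine framework itself is written in C++; purely as an analogy to its XML-assembled module sequences, the following Python sketch builds a toy processing chain from an XML description. The module names, attributes, and event structure are invented for illustration and do not reflect the real Shine schema.

```python
import xml.etree.ElementTree as ET

# Toy XML description of a module sequence (invented, not the Shine schema).
CONFIG = """
<ModuleSequence>
  <Module name="Calibrate" gain="1.02"/>
  <Module name="Reconstruct"/>
  <Module name="WriteOutput"/>
</ModuleSequence>
"""

class Calibrate:
    def __init__(self, gain="1.0"):
        self.gain = float(gain)
    def run(self, event):
        event["signal"] = [s * self.gain for s in event["signal"]]
        return event

class Reconstruct:
    def run(self, event):
        event["track_estimate"] = sum(event["signal"]) / len(event["signal"])
        return event

class WriteOutput:
    def run(self, event):
        print("event:", event)
        return event

REGISTRY = {"Calibrate": Calibrate, "Reconstruct": Reconstruct, "WriteOutput": WriteOutput}

# Assemble and sequence the modules from the XML description, then run one event.
modules = [REGISTRY[m.get("name")](**{k: v for k, v in m.attrib.items() if k != "name"})
           for m in ET.fromstring(CONFIG)]
event = {"signal": [1.0, 2.0, 3.0]}
for module in modules:
    event = module.run(event)
```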

  1. FPGA Coprocessor for Accelerated Classification of Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Scharenbroich, Lucas J.; Werne, Thomas A.

    2008-01-01

    An effort related to that described in the preceding article focuses on developing a spaceborne processing platform for fast and accurate onboard classification of image data, a critical part of modern satellite image processing. The approach again has been to exploit the versatility of the recently developed hybrid Virtex-4FX field-programmable gate array (FPGA) to run diverse science applications on embedded processors while taking advantage of the reconfigurable hardware resources of the FPGAs. In this case, the FPGA serves as a coprocessor that implements legacy C-language support-vector-machine (SVM) image-classification algorithms to detect and identify natural phenomena such as flooding, volcanic eruptions, and sea-ice break-up. The FPGA provides hardware acceleration, yielding greater onboard processing capability than previously demonstrated in software. The original C-language program demonstrated on an imaging instrument aboard the Earth Observing-1 (EO-1) satellite implements a linear-kernel SVM algorithm for classifying parts of the images as snow, water, ice, land, or cloud, or as unclassified. Current onboard processors, such as on EO-1, have limited computing power and extremely limited active storage capability, and are no longer considered state-of-the-art. Using commercially available software that translates C-language programs into hardware description language (HDL) files, the legacy C-language program, and two newly formulated programs for a more capable expanded-linear-kernel and a more accurate polynomial-kernel SVM algorithm, have been implemented in the Virtex-4FX FPGA. In tests, the FPGA implementations have exhibited significant speedups over conventional software implementations running on general-purpose hardware.
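
    As an illustration of the linear-kernel decision stage that the FPGA accelerates, the sketch below applies a one-vs-rest linear classifier to a few pixels. The band values, weight vectors, biases, and threshold are hypothetical, not the trained EO-1 coefficients.

```python
import numpy as np

# One-vs-rest linear decision stage: each class has a weight vector and bias,
# and a pixel's class is the argmax of the linear scores. The dot products are
# the hot loop a hardware coprocessor would accelerate. All numbers are made up.
CLASSES = ["snow", "water", "ice", "land", "cloud"]
W = np.array([                       # one row per class, one column per band
    [ 0.9, -0.2,  0.4],
    [-0.6,  0.1, -0.8],
    [ 0.7, -0.1,  0.2],
    [-0.2,  0.8,  0.1],
    [ 0.3,  0.0,  0.9],
])
B = np.array([-0.1, 0.2, 0.0, -0.3, -0.5])

def classify(pixels, threshold=0.0):
    """pixels: (n, n_bands) array of band values. Pixels whose best score is
    below `threshold` are reported as unclassified (index -1)."""
    scores = pixels @ W.T + B
    labels = scores.argmax(axis=1)
    labels[scores.max(axis=1) < threshold] = -1
    return labels

# Three hypothetical pixels (per-band values).
pixels = np.array([[0.80, 0.10, 0.60],
                   [0.05, 0.10, 0.02],
                   [0.20, 0.90, 0.30]])
for p, idx in zip(pixels, classify(pixels)):
    print(p, "->", "unclassified" if idx < 0 else CLASSES[idx])
```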

  2. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), which can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable from in-house solutions. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective but also shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
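
    A minimal example of the client-side workflow, written against the standard h5py API. Per the abstract's compatibility claim, an object-storage-backed client library exposing the same API could be substituted for the import without changing the calls; the dataset name and values below are invented.

```python
import numpy as np
import h5py

# Write and read a small gridded dataset with the standard HDF5 Python API.
with h5py.File("demo_surface_temp.h5", "w") as f:
    grid = f.create_dataset("surface_temperature",
                            shape=(365, 180, 360),
                            dtype="f4",
                            chunks=(1, 180, 360),       # chunking maps naturally to objects
                            compression="gzip")
    grid[0] = np.random.default_rng(0).normal(288.0, 10.0, size=(180, 360))
    grid.attrs["units"] = "K"

with h5py.File("demo_surface_temp.h5", "r") as f:
    day0 = f["surface_temperature"][0]                   # read one time slice
    print(f["surface_temperature"].attrs["units"], float(day0.mean()))
```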

  3. Legacy model integration for enhancing hydrologic interdisciplinary research

    NASA Astrophysics Data System (ADS)

    Dozier, A.; Arabi, M.; David, O.

    2013-12-01

    Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
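
    A minimal sketch (not the authors' system) of the two-way feedback idea using mpi4py: two MPI ranks stand in for a hydrologic model and a socio-economic model exchanging variables each step. The water-balance and demand rules are toy placeholders, and the script name is hypothetical.

```python
# Run with:  mpiexec -n 2 python exchange_demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    # Stand-in for a hydrologic model: sends streamflow, receives irrigation demand.
    streamflow = 42.0
    for step in range(3):
        comm.send(streamflow, dest=1, tag=step)
        demand = comm.recv(source=1, tag=step)
        streamflow = max(streamflow - demand, 0.0) + 5.0     # toy water balance
        print(f"[hydrology] step {step}: demand={demand:.1f}, streamflow={streamflow:.1f}")
else:
    # Stand-in for a socio-economic model: receives streamflow, returns demand.
    for step in range(3):
        streamflow = comm.recv(source=0, tag=step)
        demand = 0.2 * streamflow                            # toy demand rule
        comm.send(demand, dest=0, tag=step)
```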

  4. Classical Biological Control of Invasive Legacy Crop Pests: New Technologies Offer Opportunities to Revisit Old Pest Problems in Perennial Tree Crops

    PubMed Central

    Hoddle, Mark S.; Warner, Keith; Steggall, John; Jetter, Karen M.

    2014-01-01

    Advances in scientific disciplines that support classical biological control have provided “new tools” that could have important applications for biocontrol programs for some long-established invasive arthropod pests. We suggest that these previously unavailable tools should be used in biological control programs targeting “legacy pests”, even if they have been targets of previously unsuccessful biocontrol projects. Examples of “new tools” include molecular analyses to verify species identities and likely geographic area of origin, climate matching and ecological niche modeling, preservation of natural enemy genetic diversity in quarantine, the use of theory from invasion biology to maximize establishment likelihoods for natural enemies, and improved understanding of the interactions between natural enemy and target pest microbiomes. This review suggests that opportunities exist for revisiting old pest problems, and that funding research programs that use “new tools” to develop biological control programs for “legacy pests” could provide permanent suppression of some seemingly intractable pest problems. As a case study, we use citricola scale, Coccus pseudomagnoliarum, an invasive legacy pest of California citrus, to demonstrate the potential of new tools to support a new classical biological control program targeting this insect. PMID:26463063

  5. Astrolabe: Curating, Linking, and Computing Astronomy’s Dark Data

    NASA Astrophysics Data System (ADS)

    Heidorn, P. Bryan; Stahlman, Gretchen R.; Steffen, Julie

    2018-05-01

    Where appropriate repositories are not available to support all relevant astronomical data products, data can fall into darkness: unseen and unavailable for future reference and reuse. Some data in this category are legacy or old data, but newer data sets are also often uncurated and could remain dark. This paper provides a description of the design motivation and development of Astrolabe, a cyberinfrastructure project that addresses a set of community recommendations for locating and ensuring the long-term curation of dark or otherwise at-risk data and integrated computing. This paper also describes the outcomes of the series of community workshops that informed creation of Astrolabe. According to participants in these workshops, much astronomical dark data currently exist that are not curated elsewhere, as well as software that can only be executed by a few individuals and therefore becomes unusable because of changes in computing platforms. Astronomical research questions and challenges would be better addressed with integrated data and computational resources that fall outside the scope of existing observatory and space mission projects. As a solution, the design of the Astrolabe system is aimed at developing new resources for management of astronomical data. The project is based in CyVerse cyberinfrastructure technology and is a collaboration between the University of Arizona and the American Astronomical Society. Overall, the project aims to support open access to research data by leveraging existing cyberinfrastructure resources and promoting scientific discovery by making potentially useful data available to the astronomical community, in a computable format.

  6. Challenges in Achieving Trajectory-Based Operations

    NASA Technical Reports Server (NTRS)

    Cate, Karen Tung

    2012-01-01

    In the past few years much of the global ATM research community has proposed advanced systems based on Trajectory-Based Operations (TBO). The concept of TBO uses four-dimensional aircraft trajectories as the base information for managing safety and capacity. Both the US and European advanced ATM programs call for the sharing of trajectory data across different decision support tools for successful operations. However, the actual integration of TBO systems presents many challenges. Trajectory predictors are built to meet the specific needs of a particular system and are not always compatible with others. Two case studies are presented which examine the challenges of introducing a new concept into two legacy systems in regards to their trajectory prediction software. The first case describes the issues with integrating a new decision support tool with a legacy operational system which overlap in domain space. These tools perform similar functions but are driven by different requirements. The difference in the resulting trajectories can lead to conflicting advisories. The second case looks at integrating this same new tool with a legacy system originally developed as an integrated system, but diverged many years ago. Both cases illustrate how the lack of common architecture concepts for the trajectory predictors added cost and complexity to the integration efforts.

  7. SwaMURAy - Swapping Memory Unit for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Winberg, Simon

    2016-03-01

    This paper concerns design and performance testing of an HDL module called SwaMURAy that is a configurable, high-speed data sequencing and flow control module serving as an intermediary between data acquisition and subsequent processing stages. While a FIFO suffices for many applications, our case needed a more elaborate solution to overcome legacy design limitations. The SwaMURAy is designed around a system where a block of sampled data is acquired at a fast rate and is then distributed among multiple processing paths to achieve a desired overall processing rate. This architecture provides an effective design pattern around which various software defined radio (SDR) and radio astronomy applications can be built. This solution was partly in response to legacy design restrictions of the SDR platform we used, a difficulty likely experienced by many developers whereby new sampling peripherals are inhibited by legacy characteristics of an underlying reconfigurable platform. Our SDR platform had a planned lifetime of at least five years as a complete redesign and refabrication would be too costly. While the SwaMURAy overcame some performance problems, other problems arose. This paper overviews the SwaMURAy design, performance improvements achieved in an SDR case study, and discusses remaining limitations and workarounds we expect will achieve further improvements.

  8. XML Flight/Ground Data Dictionary Management

    NASA Technical Reports Server (NTRS)

    Wright, Jesse; Wiklow, Colette

    2007-01-01

    A computer program generates Extensible Markup Language (XML) files that effect coupling between the command- and telemetry-handling software running aboard a spacecraft and the corresponding software running in ground support systems. The XML files are produced by use of information from the flight software and from flight-system engineering. The XML files are converted to legacy ground-system data formats for command and telemetry, transformed into Web-based and printed documentation, and used in developing new ground-system data-handling software. Previously, the information about telemetry and command was scattered in various paper documents that were not synchronized. The process of searching and reading the documents was time-consuming and introduced errors. In contrast, the XML files contain all of the information in one place. XML structures can evolve in such a manner as to enable the addition, to the XML files, of the metadata necessary to track the changes and the associated documentation. The use of this software has reduced the extent of manual operations in developing a ground data system, thereby saving considerable time and removing errors that previously arose in the translation and transcription of software information from the flight to the ground system.
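
    The actual flight/ground XML schema and legacy formats are mission-specific; the sketch below only illustrates the general pattern of keeping the data dictionary in XML and emitting a fixed-width legacy-style record from it, using an invented miniature schema and channel names.

```python
import xml.etree.ElementTree as ET

# Hypothetical miniature telemetry dictionary in XML (not the real schema).
XML_DICT = """
<telemetry>
  <channel id="THRM-001" type="float" units="degC" description="Battery temperature"/>
  <channel id="PWR-010" type="uint16" units="mA" description="Bus current"/>
</telemetry>
"""

def to_legacy_rows(xml_text):
    """Emit one fixed-width line per channel: id (12), type (8), units (6), description."""
    rows = []
    for ch in ET.fromstring(xml_text).findall("channel"):
        rows.append(f"{ch.get('id'):<12}{ch.get('type'):<8}{ch.get('units'):<6}"
                    f"{ch.get('description')}")
    return rows

for row in to_legacy_rows(XML_DICT):
    print(row)
```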

  9. Benefits of Matching Domain Structure for Planning Software: The Right Stuff

    NASA Technical Reports Server (NTRS)

    Billman, Dorrit Owen; Arsintescu, Lucica; Feary, Michael S.; Lee, Jessica Chia-Rong; Smith, Asha Halima; Tiwary, Rachna

    2011-01-01

    We investigated the role of domain structure in software design. We compared two planning applications for a Mission Control group (International Space Station) and measured users' speed and accuracy. Based on our needs analysis, we identified domain structure and used this to develop new prototype software that matched domain structure better than the legacy system. We took a high-fidelity analog of the natural task into the laboratory and found (large) performance differences, favoring the system that matched domain structure. Our task design enabled us to attribute the better performance to the better match of domain structure. We ran through the whole development cycle, in miniature, from needs analysis through design, development, and evaluation. Doing so enabled inferences not just about the particular systems compared, but also provided evidence for the viability of the design process (particularly needs analysis) that we are exploring.

  10. Resurrecting Legacy Code Using Ontosoft Knowledge-Sharing and Digital Object Management to Revitalize and Reproduce Software for Groundwater Management Research

    NASA Astrophysics Data System (ADS)

    Kwon, N.; Gentle, J.; Pierce, S. A.

    2015-12-01

    Software code developed for research is often used for a relatively short period of time before it is abandoned, lost, or becomes outdated. This unintentional abandonment of code is a valid problem in the 21st century scientific process, hindering widespread reusability and increasing the effort needed to develop research software. Potentially important assets, these legacy codes may be resurrected and documented digitally for long-term reuse, often with modest effort. Furthermore, the revived code may be openly accessible in a public repository for researchers to reuse or improve. For this study, the research team has begun to revive the codebase for Groundwater Decision Support System (GWDSS), originally developed for participatory decision making to aid urban planning and groundwater management, though it may serve multiple use cases beyond those originally envisioned. GWDSS was designed as a java-based wrapper with loosely federated commercial and open source components. If successfully revitalized, GWDSS will be useful for both practical applications as a teaching tool and case study for groundwater management, as well as informing theoretical research. Using the knowledge-sharing approaches documented by the NSF-funded Ontosoft project, digital documentation of GWDSS is underway, from conception to development, deployment, characterization, integration, composition, and dissemination through open source communities and geosciences modeling frameworks. Information assets, documentation, and examples are shared using open platforms for data sharing and assigned digital object identifiers. Two instances of GWDSS version 3.0 are being created: 1) a virtual machine instance for the original case study to serve as a live demonstration of the decision support tool, assuring the original version is usable, and 2) an open version of the codebase, executable installation files, and developer guide available via an open repository, assuring the source for the application is accessible with version control and potential for new branch developments. Finally, metadata about the software has been completed within the OntoSoft portal to provide descriptive curation, make GWDSS searchable, and complete documentation of the scientific software lifecycle.

  11. A HUMAN FACTORS ENGINEERING PROCESS TO SUPPORT HUMAN-SYSTEM INTERFACE DESIGN IN CONTROL ROOM MODERNIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, C.; Joe, J.; Boring, R.

    The primary objective of the United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to sustain operation of the existing commercial nuclear power plants (NPPs) through a multi-pathway approach in conducting research and development (R&D). The Advanced Instrumentation, Information, and Control (II&C) System Technologies pathway conducts targeted R&D to address aging and reliability concerns with legacy instrumentation and control (I&C) and other information systems in existing U.S. NPPs. Control room modernization is an important part following this pathway, and human factors experts at Idaho National Laboratory (INL) have been involved in conducting R&D to support migration of new digital main control room (MCR) technologies from legacy analog and legacy digital I&C. This paper describes a human factors engineering (HFE) process that supports human-system interface (HSI) design in MCR modernization activities, particularly with migration of old digital to new digital I&C. The process described in this work is an expansion from the LWRS Report INL/EXT-16-38576, and is a requirements-driven approach that aligns with NUREG-0711 requirements. The work described builds upon the existing literature by adding more detail around key tasks and decisions to make when transitioning from HSI Design into Verification and Validation (V&V). The overall objective of this process is to inform HSI design and elicit specific, measurable, and achievable human factors criteria for new digital technologies. Upon following this process, utilities should have greater confidence with transitioning from HSI design into V&V.

  12. Reconfigurable Transceiver and Software-Defined Radio Architecture and Technology Evaluated for NASA Space Communications

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Kacpura, Thomas J.

    2004-01-01

    The NASA Glenn Research Center is investigating the development and suitability of a software-based open architecture for space-based reconfigurable transceivers (RTs) and software-defined radios (SDRs). The main objectives of this project are to enable advanced operations and reduce mission costs. SDRs are becoming more common because of the capabilities of reconfigurable digital signal processing technologies such as field programmable gate arrays and digital signal processors, which place radio functions in firmware and software that were traditionally performed with analog hardware components. Features of interest of this communications architecture include nonproprietary open standards and application programming interfaces to enable software reuse and portability, independent hardware and software development, and hardware and software functional separation. The goals for RT and SDR technologies for NASA space missions include prelaunch and on-orbit frequency and waveform reconfigurability and programmability, high data rate capability, and overall communications and processing flexibility. These operational advances over current state-of-the-art transceivers will be provided to reduce the power, mass, and cost of RTs and SDRs for space communications. The open architecture for NASA communications will support existing (legacy) communications needs and capabilities while providing a path to more capable, advanced waveform development and mission concepts (e.g., ad hoc constellations with self-healing networks and high-rate science data return). A study was completed to assess the state of the art in RT architectures, implementations, and technologies. In-house researchers conducted literature searches and analysis, interviewed Government and industry contacts, and solicited information and white papers from industry on space-qualifiable RTs and SDRs and their associated technologies for space-based NASA applications. The white papers were evaluated, compiled, and used to assess RT and SDR system architectures and core technology elements to determine an appropriate investment strategy to advance these technologies to meet future mission needs. The use of these radios in the space environment represents a challenge because of the space radiation suitability of the components, which drastically reduces the processing capability. The radios available for space are considered to be RTs (as opposed to SDRs), which are digitally programmable radios with selectable changes from an architecture combining analog and digital components. The limited flexibility of this design contrasts with the desire to have a power-efficient solution and open architecture.

  13. System and method for deriving a process-based specification

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael Gerard (Inventor); Rouff, Christopher A. (Inventor); Rash, James Larry (Inventor)

    2009-01-01

    A system and method for deriving a process-based specification for a system is disclosed. The process-based specification is mathematically inferred from a trace-based specification. The trace-based specification is derived from a non-empty set of traces or natural language scenarios. The process-based specification is mathematically equivalent to the trace-based specification. Code is generated, if applicable, from the process-based specification. A process, or phases of a process, using the features disclosed can be reversed and repeated to allow for an interactive development and modification of legacy systems. The process is applicable to any class of system, including, but not limited to, biological and physical systems, electrical and electro-mechanical systems in addition to software, hardware and hybrid hardware-software systems.

  14. When the New Application Smell Is Gone: Traditional Intranet Best Practices and Existing Web 2.0 Intranet Infrastructures

    ERIC Educational Resources Information Center

    Yoose, Becky

    2010-01-01

    With the growth of Web 2.0 library intranets in recent years, many libraries are leaving behind legacy, first-generation intranets. As Web 2.0 intranets multiply and mature, how will traditional intranet best practices--especially in the areas of planning, implementation, and evaluation--translate into an existing Web 2.0 intranet infrastructure?…

  15. Leading Efforts to Increase Organ Donation Through Professionalization of Organ Procurement Organizations and Establishment of Organ and Tissue Donor Registries.

    PubMed

    Vertanous, T; Czer, L S C; de Robertis, M; Kiankhooy, A; Kobashigawa, J; Esmailian, F; Trento, A

    2016-01-01

    The influence of new donor registrations through the California Organ and Tissue Donor Registry on the local OneLegacy Organ Procurement Organization (OPO) was examined during a 6-year period. Publicly available data from Donate Life America for California were examined for the 6 calendar years of 2009-2014. Performance data from OneLegacy for the same 6 years for organ donors and number of transplants were also examined. The donor designation rate (DDR) was defined as the rate at which new individuals joined the state donor registry as a percentage of all driver licenses and ID cards issued within a calendar year. The total donor designation (TDD) was defined as the sum of the new and existing people who were registered organ donors. Donor designation share (DDS) was the total number of designated donors as a percentage of all residents of the state who were ≥18 years old. The business practices and educational efforts of the OneLegacy OPO were examined as well. In California, from 2009 through 2014, the DDR was 25.5%-28%. When added to the existing donor registrations, the TDD and DDS increased each year from 2009 through 2014. With the current level of growth, it is projected that California will be able to reach a DDS of 50% by 2017. For the OneLegacy OPO, designated donors from the California Organ and Tissue Donor Registry made up 15% of the total donations in 2009, and 39% of the total donations in 2014, increasing by ∼5% each year since 2009. By increasing professionalization and transparency, and widening its educational and training efforts, OneLegacy was able to take advantage of an increasing percentage of donors who were designated donors and to increase the overall number of donors and organs transplanted, becoming one of the largest OPOs in the nation. This can be a model for OPOs in other donor service areas, and it may set the stage for the United States to serve as an example to the global community in the practice of organ donation. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. DSN Beowulf Cluster-Based VLBI Correlator

    NASA Technical Reports Server (NTRS)

    Rogstad, Stephen P.; Jongeling, Andre P.; Finley, Susan G.; White, Leslie A.; Lanyi, Gabor E.; Clark, John E.; Goodhart, Charles E.

    2009-01-01

    The NASA Deep Space Network (DSN) requires a broadband VLBI (very long baseline interferometry) correlator to process data routinely taken as part of the VLBI source Catalogue Maintenance and Enhancement task (CAT M&E) and the Time and Earth Motion Precision Observations task (TEMPO). The data provided by these measurements are a crucial ingredient in the formation of precision deep-space navigation models. In addition, a VLBI correlator is needed to provide support for other VLBI related activities for both internal and external customers. The JPL VLBI Correlator (JVC) was designed, developed, and delivered to the DSN as a successor to the legacy Block II Correlator. The JVC is a full-capability VLBI correlator that uses software processes running on multiple computers to cross-correlate two-antenna broadband noise data. Components of this new system (see Figure 1) consist of Linux PCs integrated into a Beowulf Cluster, an existing Mark5 data storage system, a RAID array, an existing software correlator package (SoftC) originally developed for Delta DOR Navigation processing, and various custom-developed software processes and scripts. Parallel processing on the JVC is achieved by assigning slave nodes of the Beowulf cluster to process separate scans in parallel until all scans have been processed. Due to the single stream sequential playback of the Mark5 data, some ramp-up time is required before all nodes can have access to required scan data. Core functions of each processing step are accomplished using optimized C programs. The coordination and execution of these programs across the cluster is accomplished using Perl scripts, PostgreSQL commands, and a handful of miscellaneous system utilities. Mark5 data modules are loaded on Mark5 data system playback units, one per station. Data processing is started when the operator scans the Mark5 systems and runs a script that reads various configuration files and then creates an experiment-dependent status database used to delegate parallel tasks between nodes and storage areas (see Figure 2). This script forks into three processes: extract, translate, and correlate. Each of these processes iterates on available scan data and updates the status database as the work for each scan is completed. The extract process coordinates and monitors the transfer of data from each of the Mark5s to the Beowulf RAID storage systems. The translate process monitors and executes the data conversion processes on available scan files, and writes the translated files to the slave nodes. The correlate process monitors the execution of SoftC correlation processes on the slave nodes for scans that have completed translation. A comparison of the JVC and the legacy Block II correlator outputs reveals that they agree to well within the formal error, and that the data are comparable with respect to their use in flight navigation. The processing speed of the JVC is improved over the Block II correlator by a factor of 4, largely due to the elimination of the reel-to-reel tape drives used in the Block II correlator.
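
    As a small-scale analogue (not the JVC scripts themselves) of delegating independent scans to worker nodes, the sketch below farms per-scan work out to a local process pool; the per-scan processing is a placeholder for the extract/translate/correlate steps.

```python
from multiprocessing import Pool
import time

# Stand-in for the per-scan work; the real system delegates scans to Beowulf
# slave nodes via scripts and an experiment-dependent status database.
def process_scan(scan_id):
    time.sleep(0.1)                       # placeholder for translation + correlation
    return f"scan_{scan_id:03d} correlated"

if __name__ == "__main__":
    scans = range(16)
    t0 = time.perf_counter()
    with Pool(processes=4) as pool:       # four "slave nodes"
        for status in pool.imap_unordered(process_scan, scans):
            print(status)
    print(f"elapsed: {time.perf_counter() - t0:.2f} s"
          f" (vs ~{0.1 * len(scans):.1f} s serially)")
```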

  17. Polyfluorinated substances in abiotic standard reference materials

    EPA Science Inventory

    The National Institute of Standards and Technology (NIST) has a wide range of Standard Reference Materials (SRMs) which have values assigned for legacy organic pollutants and toxic elements. Existing SRMs serve as homogenous materials that can be used for method development, meth...

  18. The Legacy of Space Shuttle Flight Software

    NASA Technical Reports Server (NTRS)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  19. Three legacies of humanitarianism in China.

    PubMed

    Hirono, Miwa

    2013-10-01

    The rise of China has altered the context of the international humanitarian community of donors and aid agencies. China is becoming one of the key actors in this grouping, undertaking infrastructure projects in areas in which paramount humanitarian challenges exist. The literature discusses how the Chinese approach differs from that of Western donors, but it does not pay much attention to why China concentrates on its state-centric and infrastructure-based approach. This paper seeks to shed some light on this subject by examining the historical evolution of the concept of humanitarianism in China. This evolution has produced three legacies: (i) the ideal of a well-ordered state; (ii) anti-Western sentiment; and (iii) the notion of comprehensive development based on a human-oriented approach. China's policies and discourses on assistance in humanitarian crises today rest on these three legacies. Traditional donors would be well advised to consider carefully the implications of the Chinese understanding of humanitarianism when engaging with the country. © 2013 The Author(s). Disasters © Overseas Development Institute, 2013.

  20. Quasi 1D Modeling of Mixed Compression Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Kopasakis, George; Connolly, Joseph W.; Paxson, Daniel E.; Woolwine, Kyle J.

    2012-01-01

    The AeroServoElasticity task under the NASA Supersonics Project is developing dynamic models of the propulsion system and the vehicle in order to conduct research for integrated vehicle dynamic performance. As part of this effort, a nonlinear quasi 1-dimensional model of the 2-dimensional bifurcated mixed compression supersonic inlet is being developed. The model utilizes computational fluid dynamics for both the supersonic and subsonic diffusers. The oblique shocks are modeled utilizing compressible flow equations. This model also implements variable geometry required to control the normal shock position. The model is flexible and can also be utilized to simulate other mixed compression supersonic inlet designs. The model was validated both in time and in the frequency domain against the legacy LArge Perturbation INlet code, which has been previously verified using test data. This legacy code written in FORTRAN is quite extensive and complex in terms of the amount of software and number of subroutines. Further, the legacy code is not suitable for closed loop feedback controls design, and the simulation environment is not amenable to systems integration. Therefore, a solution is to develop an innovative, more simplified, mixed compression inlet model with the same steady state and dynamic performance as the legacy code that also can be used for controls design. The new nonlinear dynamic model is implemented in MATLAB Simulink. This environment allows easier development of linear models for controls design for shock positioning. The new model is also well suited for integration with a propulsion system model to study inlet/propulsion system performance, and integration with an aero-servo-elastic system model to study integrated vehicle ride quality, vehicle stability, and efficiency.
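
    The inlet model couples CFD diffusers with compressible-flow shock relations. As a self-contained illustration of the latter, the sketch below evaluates the standard normal-shock relations for a calorically perfect gas; the paper's oblique-shock and variable-geometry treatment is not reproduced here.

```python
import math

GAMMA = 1.4   # ratio of specific heats for calorically perfect air

def normal_shock(m1, gamma=GAMMA):
    """Standard normal-shock relations: downstream Mach number and the static
    pressure, temperature, and density ratios across the shock."""
    m2 = math.sqrt((1 + 0.5 * (gamma - 1) * m1 ** 2) /
                   (gamma * m1 ** 2 - 0.5 * (gamma - 1)))
    p_ratio = 1 + 2 * gamma / (gamma + 1) * (m1 ** 2 - 1)
    rho_ratio = (gamma + 1) * m1 ** 2 / ((gamma - 1) * m1 ** 2 + 2)
    t_ratio = p_ratio / rho_ratio
    return m2, p_ratio, t_ratio, rho_ratio

for m1 in (1.3, 2.0, 2.5):
    m2, p_r, t_r, rho_r = normal_shock(m1)
    print(f"M1={m1:.1f}: M2={m2:.3f}, p2/p1={p_r:.2f}, "
          f"T2/T1={t_r:.3f}, rho2/rho1={rho_r:.3f}")
```

    For M1 = 2.0 this reproduces the familiar tabulated values (M2 ≈ 0.577, p2/p1 = 4.5, T2/T1 ≈ 1.687), which is a quick sanity check on the relations.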

  1. The Computational Infrastructure for Geodynamics: An Example of Software Curation and Citation in the Geodynamics Community

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Kellogg, L. H.

    2017-12-01

    Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and the need to reduce duplication of effort and to sustain community codes. CIG curates software across 6 domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many are also accessible through the CIG community repository on github. CIG has now developed abc - attribution builder for citation to enable software users to give credit to software developers. abc uses zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then requested the primary developers to verify. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered is based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual as developers are forward-looking, rarely willing to go back and archive prior releases in zenodo. Going forward all actively developed packages will utilize the zenodo and github integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and assigning roles to software remain open issues.

  2. Integrating Salmon Recovery, Clean Water Act Compliance ...

    EPA Pesticide Factsheets

    "The South Fork Nooksack River (SFNR) is an important tributary to the Nooksack River, Bellingham Bay, and the Salish Sea. The South Fork Nooksack River comprises one of the 22 independent populations of spring Chinook in the Puget Sound Chinook Evolutionarily Significant Unit (ESU), which are listed as threatened under the Endangered Species Act (ESA). The population is considered essential for recovery of the ESU. The SFNR has suffered from legacy impacts, temperature exceedances and fine sediment, due to forestry, agriculture, flood control, and transportation facilities. The temperature exceedances threaten spring Chinook salmon survival and as such under the Clean Water Act, this pollution must be addressed through a total maximum daily load (TMDL) regulatory program. Further, climate change is projected to cumulatively add to the existing legacy impacts. Millions of dollars are spent on salmon habitat restoration in the SFNR that primarily addresses these legacy impacts, but few if any restoration actions take climate change into direct consideration. The Nooksack Indian Tribe and USEPA-ORD jointly completed a climate change pilot research project that addresses legacy impacts, ESA recovery actions, CWA regulatory compliance, and salmon habitat restoration in one comprehensive project. The project evaluates how land use impacts, including altered hydrology, stream temperature, sediment dynamics, and flooding of adjacent river floodplains, combined with pr

  3. Migrating from a legacy fixed-format measure to CAT administration: calibrating the PHQ-9 to the PROMIS depression measures

    PubMed Central

    Feldman, Betsy J.; Crane, Heidi M.; Mugavero, Michael; Willig, James H.; Patrick, Donald; Schumacher, Joseph; Saag, Michael; Kitahata, Mari M.; Crane, Paul K.

    2011-01-01

    Purpose We provide detailed instructions for analyzing patient-reported outcome (PRO) data collected with an existing (legacy) instrument so that scores can be calibrated to the PRO Measurement Information System (PROMIS) metric. This calibration facilitates migration to computerized adaptive test (CAT) PROMIS data collection, while facilitating research using historical legacy data alongside new PROMIS data. Methods A cross-sectional convenience sample (n = 2,178) from the Universities of Washington and Alabama at Birmingham HIV clinics completed the PROMIS short form and Patient Health Questionnaire (PHQ-9) depression symptom measures between August 2008 and December 2009. We calibrated the tests using item response theory. We compared measurement precision of the PHQ-9, the PROMIS short form, and simulated PROMIS CAT. Results Dimensionality analyses confirmed the PHQ-9 could be calibrated to the PROMIS metric. We provide code used to score the PHQ-9 on the PROMIS metric. The mean standard errors of measurement were 0.49 for the PHQ-9, 0.35 for the PROMIS short form, and 0.37, 0.28, and 0.27 for 3-, 8-, and 9-item-simulated CATs. Conclusions The strategy described here facilitated migration from a fixed-format legacy scale to PROMIS CAT administration and may be useful in other settings. PMID:21409516
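    To make the migration concrete, the sketch below scores a legacy PHQ-9 response on a PROMIS-style T-score metric through a raw-score crosswalk lookup; the crosswalk values are invented placeholders rather than the calibration published by the authors, whose scoring code should be used for real analyses.

        # Placeholder crosswalk: raw PHQ-9 sum (0-27) -> (T-score, standard error).
        # These numbers are illustrative only, NOT the published PHQ-9/PROMIS values.
        from typing import Sequence, Tuple

        CROSSWALK = {raw: (38.0 + 1.3 * raw, 3.0) for raw in range(28)}

        def phq9_to_promis(item_responses: Sequence[int]) -> Tuple[float, float]:
            """Sum the nine 0-3 item responses and look up a PROMIS-metric score."""
            if len(item_responses) != 9 or not all(0 <= r <= 3 for r in item_responses):
                raise ValueError("PHQ-9 expects nine items scored 0-3")
            return CROSSWALK[sum(item_responses)]

        t_score, sem = phq9_to_promis([1, 2, 0, 1, 1, 0, 2, 1, 0])
        print(f"T-score {t_score:.1f} (SEM {sem:.1f})")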

  4. Migrating from a legacy fixed-format measure to CAT administration: calibrating the PHQ-9 to the PROMIS depression measures.

    PubMed

    Gibbons, Laura E; Feldman, Betsy J; Crane, Heidi M; Mugavero, Michael; Willig, James H; Patrick, Donald; Schumacher, Joseph; Saag, Michael; Kitahata, Mari M; Crane, Paul K

    2011-11-01

    We provide detailed instructions for analyzing patient-reported outcome (PRO) data collected with an existing (legacy) instrument so that scores can be calibrated to the PRO Measurement Information System (PROMIS) metric. This calibration facilitates migration to computerized adaptive test (CAT) PROMIS data collection, while facilitating research using historical legacy data alongside new PROMIS data. A cross-sectional convenience sample (n = 2,178) from the Universities of Washington and Alabama at Birmingham HIV clinics completed the PROMIS short form and Patient Health Questionnaire (PHQ-9) depression symptom measures between August 2008 and December 2009. We calibrated the tests using item response theory. We compared measurement precision of the PHQ-9, the PROMIS short form, and simulated PROMIS CAT. Dimensionality analyses confirmed the PHQ-9 could be calibrated to the PROMIS metric. We provide code used to score the PHQ-9 on the PROMIS metric. The mean standard errors of measurement were 0.49 for the PHQ-9, 0.35 for the PROMIS short form, and 0.37, 0.28, and 0.27 for 3-, 8-, and 9-item-simulated CATs. The strategy described here facilitated migration from a fixed-format legacy scale to PROMIS CAT administration and may be useful in other settings.

  5. Inspiring Collaboration: The Legacy of Theo Colborn's Transdisciplinary Research on Fracking.

    PubMed

    Wylie, Sara; Schultz, Kim; Thomas, Deborah; Kassotis, Chris; Nagel, Susan

    2016-09-13

    This article describes Dr Theo Colborn's legacy of inspiring complementary and synergistic environmental health research and advocacy. Colborn, a founder of endocrine disruption research, also stimulated study of hydraulic fracturing (fracking). In 2014, the United States led the world in oil and gas production, with fifteen million Americans living within one mile of an oil or gas well. Colborn pioneered efforts to understand and control the impacts of this sea change in energy production. In 2005, her research organization The Endocrine Disruption Exchange (TEDX) developed a database of chemicals used in natural gas extraction and their health effects. This database stimulated novel scientific and social scientific research and informed advocacy by (1) connecting communities' diverse health impacts to chemicals used in natural gas development, (2) inspiring social science research on open-source software and hardware for citizen science, and (3) posing new scientific questions about the endocrine-disrupting properties of fracking chemicals. © The Author(s) 2016.

  6. Design and Testing of Space Telemetry SCA Waveform

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Handler, Louis M.; Quinn, Todd M.

    2006-01-01

    A Software Communications Architecture (SCA) Waveform for space telemetry is being developed at the NASA Glenn Research Center (GRC). The space telemetry waveform is implemented in a laboratory testbed consisting of general purpose processors, field programmable gate arrays (FPGAs), analog-to-digital converters (ADCs), and digital-to-analog converters (DACs). The radio hardware is integrated with an SCA Core Framework and other software development tools. The waveform design is described from both the bottom-up signal processing and top-down software component perspectives. Simulations and model-based design techniques used for signal processing subsystems are presented. Testing with legacy hardware-based modems verifies proper design implementation and dynamic waveform operations. The waveform development is part of an effort by NASA to define an open architecture for space based reconfigurable transceivers. Use of the SCA as a reference has increased understanding of software defined radio architectures. However, since space requirements put a premium on size, mass, and power, the SCA may be impractical for today s space ready technology. Specific requirements for an SCA waveform and other lessons learned from this development are discussed.

  7. Old and New Insurgency Forms

    DTIC Science & Technology

    2016-03-01

    undertaken to analyze and synthesize the post-Cold War insurgency form writings that have emerged over the last 2 decades. It is apropos that these...implosion of the Soviet Union, post-Cold War insurgency typologies began to emerge because a need existed to understand where this component of the...provide a literature review of the post-Cold War insurgency typologies that exist, create a proposed insurgency typology divided into legacy

  8. VOClient: Application Integration in the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Fitzpatrick, Michael J.; Tody, D.

    2007-12-01

    We present VOClient, a new software package that provides a high-level, easy-to-use, programmable interface between desktop applications and the distributed VO framework, providing access to remote VO data and services, reference implementations for VO data-providers and end-user applications. Applications have traditionally been written to deal directly with local images, catalogs or spectra; VOClient allows these applications to use remote VO data and services without requiring a developer to know the details of the underlying and evolving VO technologies. The programmable interface provides equivalent functionality for a wide variety of both legacy and modern development languages and environments and can be easily extended to add new functionality. The server component of the project provides a reference implementation and toolkit which can be used to build VO data services, and the commandline tools provide ready-to-use applications to access VO data and services from the desktop or scripting environment. The use of VOClient to integrate VO technologies with legacy systems such as IRAF is examined as a case-study, and the use of these techniques in other environments, especially their applicability to legacy code and systems, is also discussed. VOClient is meant both for the astronomer wishing to revive an old and trusted task with new VO capabilities, as well as the institutional project providing data or services to the Virtual Observatory.

  9. Legacy sediment storage in New England river valleys: anthropogenic processes in a postglacial landscape

    NASA Astrophysics Data System (ADS)

    Snyder, N. P.; Johnson, K. M.; Waltner, M.; Hopkins, A. J.; Dow, S.; Ames, E.; Merritts, D. J.; Walter, R. C.; Rahnis, M. A.

    2016-12-01

    Walter and Merritts (2008, and subsequent papers) show that legacy sediment associated with deposition in millponds is a common feature in river valleys of the Mid-Atlantic Piedmont region, with 1-5 m of fine sand and silt overlying Holocene soil and Pleistocene periglacial deposits. For this project, we seek to test the hypothesis that these field relationships are seen in New England, a formerly glaciated region with similar history and intensity of forest clearing and milldam construction during the 17-19th centuries. We study three watersheds, using field observations of bank stratigraphy, radiocarbon dating, and mapping of terraces and floodplains using lidar digital elevation models and other GIS datasets. The 68 km2 South River watershed in western Massachusetts exhibits the most extensive evidence for legacy sediment storage. We visited 17 historic dam sites in the watershed and found field evidence for fine sand and silt legacy sediment storage at 14, up to 2.2 m thick. In the 558 km2 Sheepscot River watershed in coastal Maine, we visited 12 historic dam sites, and found likely legacy sediment at six, up to 2.3 m thick. In the 171 km2 upper Charles River watershed in eastern Massachusetts, we investigated 14 dam sites, and found legacy sediment at two, up to 1.8 m thick. Stratigraphically, we identified the base of legacy sediment from a change in grain size to gravel at most sites, or to Pleistocene marine clay at some Sheepscot River sites. In the Sheepscot River, we observed cut timbers underlying historic sediment at several locations, likely associated with sawmill activities. Only at the Charles River were we able to radiocarbon date the underlying gravel (1281-1391 calibrated CE). At no site did we find a buried Holocene soil, in contrast to the field relations commonly observed in the Mid-Atlantic region. This may indicate that the New England sites have eroded to the pre-historic river bed, not floodplain surfaces. We attribute the variation in thickness and presence of legacy sediment at the New England sites to the existence or absence of upstream sediment supply in the form of thick (>5 m) glacial deposits. Of the three study watersheds, the South River has the most extensive glacial sediments, having been occupied by one or more ice-dammed lakes during the late Pleistocene, and the most legacy sediment storage.

  10. A Plan for Academic Biobank Solvency-Leveraging Resources and Applying Business Processes to Improve Sustainability.

    PubMed

    Uzarski, Diane; Burke, James; Turner, Barbara; Vroom, James; Short, Nancy

    2015-10-01

    Researcher-initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2-year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start-up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short-term Brain Bank stabilization have been successfully attained, and the evaluation of long-term sustainability metrics is ongoing. © 2015 Wiley Periodicals, Inc.

  11. A Plan for Academic Biobank Solvency—Leveraging Resources and Applying Business Processes to Improve Sustainability

    PubMed Central

    Burke, James; Turner, Barbara; Vroom, James; Short, Nancy

    2015-01-01

    Researcher‐initiated biobanks based at academic institutions contribute valuable biomarker and translational research advances to medicine. With many legacy banks once supported by federal funding, reductions in fiscal support threaten the future of existing and new biobanks. When the Brain Bank at Duke University's Bryan Alzheimer's Disease Center (ADRC) faced a funding crisis, a collaborative, multidisciplinary team embarked on a 2‐year biobank sustainability project utilizing a comprehensive business strategy, dedicated project management, and a systems approach involving many Duke University entities. By synthesizing and applying existing knowledge, Duke Translational Medicine Institute created and launched a business model that can be adjusted and applied to legacy and start‐up academic biobanks. This model provides a path to identify new funding mechanisms, while also emphasizing improved communication, business development, and a focus on collaborating with industry to improve access to biospecimens. Benchmarks for short‐term Brain Bank stabilization have been successfully attained, and the evaluation of long‐term sustainability metrics is ongoing. PMID:25996355

  12. Sustainable legacies for the 2012 Olympic Games.

    PubMed

    Shipway, Richard

    2007-05-01

    The London 2012 Olympic and Paralympic Games have the unique potential to deliver sustainable sporting, social, cultural, economic and environmental legacies, not just for London as the host city, but for the whole of Britain. This article focuses primarily on the first three of these potential Olympics legacies. The first area explored is the social legacy as it impacts on host communities; second, the potential educational and cultural legacy of the 2012 Games are examined; and finally, there follows an overview of the health benefits that could result from a sustained increase in mass participation in sport, physical activity and exercise. This appraisal is undertaken through a review of existing Olympic literature and examples are drawn from previous summer and winter Games. This preliminary exploration is followed by the identification of some key challenges to be overcome if the opportunities available to a wide and diverse range of stakeholders are to be fully optimized. The article suggests that the 2012 Games can act as a catalyst for sports development throughout Britain, while also assisting with government cross-cutting agendas such as tackling crime, antisocial behaviour, developing healthy and active communities, improving educational attainment, and combating barriers to participation. In doing so, this article argues that priority should be placed at supporting grassroots sport through greater access to sport in the community, and not solely elite level sports development. The article concludes by suggesting that the 2012 Games provide opportunities to deliver real and tangible changes and most importantly, to afford a higher priority to sport, along with the obvious associated health benefits for Britain as a whole. The underlying challenge as we move towards 2012 is to achieve a positive step change in the attitudes towards sport and physical activity in British society. Achieving this would possibly be the greatest legacy of the 2012 Olympic and Paralympic Games.

  13. After the Fall: The RHESSI Legacy Archive

    NASA Astrophysics Data System (ADS)

    Schwartz, Richard A.; Zarro, Dominic M.; Tolbert, Anne K.

    2017-08-01

    Launched in 2002, the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) continues to observe the Sun with a nearly 50% duty cycle. During that time the instrument has recorded ~100,000 solar flares at energies from 4 keV to over 10 MeV, with durations of 10s to 1000s of seconds. However, whether because of the declining solar cycle, possible failure of the instrument, or the absence of funding, the operational phase will end someday. We describe here our plans to continue to serve this dataset in raw, processed, and analyzed forms to the worldwide solar community, continuing our legacy of a stream of rich scientific results. We have been providing, and continue to provide, quicklook lightcurves, spectra, and images that we mainly serve through a web interface, as well as the data in raw form to be fully analyzed within our own branch of Solar Software written in IDL. We are in the process of creating higher quality images for flares in multiple energy bands on relevant timescales for those whose needs can be met without further processing. For users with IDL licenses we expect this software to be available far into the unknowable future. Together with a database of AIA cutouts during all SDO-era flares, along with software to recover saturated images by using the AIA diffraction fringes, these will be a highly used resource. We are also developing additional tools and databases that will increase the utility of RHESSI data for members of the community with or without IDL licenses or full access to the RHESSI database. We will provide a database of RHESSI X-ray visibilities obtained during flares at a >4-second cadence over a broad range of detectable energies. With our IDL software these can be rendered as images for times and energies of nearly the analyst's choosing. Going beyond that, we are converting our imaging procedures to the Python language to eliminate the need for an IDL license. We are also developing methods to allow the customization of these visibilities in time and energy by access from a non-local server which has full access to all of the IDL software and database files.
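    For readers unfamiliar with visibility-based imaging, the sketch below back-projects a set of (u, v) visibilities into a dirty map by direct Fourier summation; it is a generic illustration of how stored visibilities can be rendered as images, not the RHESSI team's IDL or Python pipeline, and all inputs are toy values.

        # Conceptual back-projection of complex visibilities onto an image grid.
        import numpy as np

        def backproject(u, v, vis, npix=64, fov_arcsec=64.0):
            """u, v in 1/arcsec, vis complex; returns a dirty map on an npix grid."""
            half = fov_arcsec / 2.0
            x = np.linspace(-half, half, npix)   # arcsec, east-west
            y = np.linspace(-half, half, npix)   # arcsec, north-south
            X, Y = np.meshgrid(x, y)
            image = np.zeros((npix, npix))
            for uk, vk, Vk in zip(u, v, vis):
                image += np.real(Vk * np.exp(2j * np.pi * (uk * X + vk * Y)))
            return image / len(vis)

        # Toy example: a point source at the map center has flat visibilities.
        u = np.random.uniform(-0.1, 0.1, 50)
        v = np.random.uniform(-0.1, 0.1, 50)
        vis = np.ones(50, dtype=complex)
        dirty_map = backproject(u, v, vis)
        print(dirty_map.shape, dirty_map.max())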

  14. LMS Transitioning to "Moodle": A Surprising Case of Successful, Emergent Change Management

    ERIC Educational Resources Information Center

    Lawler, Alan

    2011-01-01

    During 2009-10 the University of Ballarat implemented the open-source learning management system (LMS) "Moodle" alongside its existing legacy LMS, "Blackboard". While previous IT implementations have been troublesome at the university, notably the student information and finance management systems in 2008-09, the…

  15. Inequality, Segregation and Poor Performance: The Education System in Northern Ireland

    ERIC Educational Resources Information Center

    Borooah, Vani K.; Knox, Colin

    2017-01-01

    Northern Ireland is now a post-conflict society but one of the legacies of the "troubles" is an education system which is defined by religious affiliation/identity. A parallel system of schools continues to exist where Catholics largely attend "maintained" schools and Protestants "controlled" or state schools. While…

  16. The American Indian Holocaust: Healing Historical Unresolved Grief.

    ERIC Educational Resources Information Center

    Brave Heart, Maria Yellow Horse; DeBruyn, Lemyra M.

    1998-01-01

    Argues for the existence of historical unresolved grief among American Indians. Outlines the historical legacy of war, genocide, and boarding schools resulting in intergenerational trauma and a host of associated social problems. Suggests healing strategies that integrate modern and traditional approaches to healing at the individual, family, and…

  17. Environmental Assessment for Selection and Operation of the Proposed Field Research Centers for the Natural and Accelerated Bioremediation Research (NABIR) Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N /A

    2000-04-18

    The US Department of Energy (DOE) Office of Biological and Environmental Research (OBER), within the Office of Science (SC), proposes to add a Field Research Center (FRC) component to the existing Natural and Accelerated Bioremediation Research (NABIR) Program. The NABIR Program is a ten-year fundamental research program designed to increase the understanding of fundamental biogeochemical processes that would allow the use of bioremediation approaches for cleaning up DOE's contaminated legacy waste sites. An FRC would be integrated with the existing and future laboratory and field research and would provide a means of examining the fundamental biogeochemical processes that influence bioremediation under controlled small-scale field conditions. The NABIR Program would continue to perform fundamental research that might lead to promising bioremediation technologies that could be demonstrated by other means in the future. For over 50 years, DOE and its predecessor agencies have been responsible for the research, design, and production of nuclear weapons, as well as other energy-related research and development efforts. DOE's weapons production and research activities generated hazardous, mixed, and radioactive waste products. Past disposal practices have led to the contamination of soils, sediments, and groundwater with complex and exotic mixtures of compounds. This contamination and its associated costs and risks represent a major concern to DOE and the public. The high costs, long duration, and technical challenges associated with remediating the subsurface contamination at DOE sites present a significant need for fundamental research in the biological, chemical, and physical sciences that will contribute to new and cost-effective solutions. One possible low-cost approach for remediating the subsurface contamination of DOE sites is through the use of a technology known as bioremediation. Bioremediation has been defined as the use of microorganisms to biodegrade or biotransform hazardous organic contaminants to environmentally safe levels in soils, subsurface materials, water, sludges, and residues. While bioremediation technology is promising, DOE managers and non-DOE scientists have recognized that the fundamental scientific information needed to develop effective bioremediation technologies for cleanup of the legacy waste sites is lacking in many cases. DOE believes that field-based research is needed to realize the full potential of bioremediation. The Department of Energy faces a unique set of challenges associated with cleaning up waste at its former weapons production and research sites. These sites contain complex mixtures of contaminants in the subsurface, including radioactive compounds. In many cases, the fundamental field-based scientific information needed to develop safe and effective remediation and cleanup technologies is lacking. DOE needs fundamental research on the use of microorganisms and their products to assist DOE in the decontamination and cleanup of its legacy waste sites. The existing NABIR Program to date has focused on fundamental scientific research in the laboratory. Because subsurface hydrologic and geologic conditions at contaminated DOE sites cannot easily be duplicated in a laboratory, however, the DOE needs a field component to permit existing and future laboratory research results to be field-tested on a small scale in a controlled outdoor setting. Such field-testing needs to be conducted under actual legacy waste field conditions representative of those that DOE is most in need of remediating. Ideally, these field conditions should be as representative as practicable of the types of subsurface contamination conditions that resulted from legacy wastes from the nuclear weapons program activities. They should also be representative of the types of hydrologic and geologic conditions that exist across the DOE complex.

  18. Navy Enterprise Resource Planning Program: Governance Challenges in Deploying an Enterprise-Wide Information Technology System in the Department of the Navy

    DTIC Science & Technology

    2010-12-01

    ...signaling the move away from mainframe systems. However, it was the year 2000 (Y2K) dilemma that ushered in unprecedented growth in the development of ERP...software and IT systems of the 1990s. The possibility of non-Y2K-compliant legacy systems failing at the turn of the century resulted in the

  19. Design and Implementation of a Secure Modbus Protocol

    NASA Astrophysics Data System (ADS)

    Fovino, Igor Nai; Carcano, Andrea; Masera, Marcelo; Trombetta, Alberto

    The interconnectivity of modern and legacy supervisory control and data acquisition (SCADA) systems with corporate networks and the Internet has significantly increased the threats to critical infrastructure assets. Meanwhile, traditional IT security solutions such as firewalls, intrusion detection systems and antivirus software are relatively ineffective against attacks that specifically target vulnerabilities in SCADA protocols. This paper describes a secure version of the Modbus SCADA protocol that incorporates integrity, authentication, non-repudiation and anti-replay mechanisms. Experimental results using a power plant testbed indicate that the augmented protocol provides good security functionality without significant overhead.
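    To give a flavor of the mechanisms listed, the sketch below wraps a Modbus PDU with a sequence number and an HMAC tag to provide integrity, authentication, and anti-replay protection; it is a generic illustration in Python, not the frame layout or key-management scheme defined in the paper.

        # Authenticated, replay-protected wrapping of a Modbus PDU (sketch only).
        import hmac, hashlib, struct

        SHARED_KEY = b"example-preshared-key"   # placeholder key

        def protect(pdu: bytes, seq: int) -> bytes:
            """Prepend a 4-byte sequence number and append an HMAC-SHA256 tag."""
            body = struct.pack(">I", seq) + pdu
            tag = hmac.new(SHARED_KEY, body, hashlib.sha256).digest()
            return body + tag

        def verify(frame: bytes, last_seq: int) -> bytes:
            """Check the tag and reject replayed or tampered frames."""
            body, tag = frame[:-32], frame[-32:]
            expected = hmac.new(SHARED_KEY, body, hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expected):
                raise ValueError("authentication failed")
            seq = struct.unpack(">I", body[:4])[0]
            if seq <= last_seq:
                raise ValueError("replayed frame")
            return body[4:]

        frame = protect(b"\x01\x03\x00\x00\x00\x0a", seq=42)  # read-holding-registers PDU
        print(verify(frame, last_seq=41))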

  20. The Lightcurve Legacy of COS and STIS

    NASA Astrophysics Data System (ADS)

    Ely, Justin

    2014-10-01

    The Cosmic Origins Spectrograph (COS) and Space Telescope Imaging Spectrograph (STIS) have been advancing astronomy with high-quality spectroscopic products for years, and in the case of STIS, more than a decade. Though already incredibly productive, there remains an untapped potential of discovery in the data of these instruments. Due to their detector designs, both of these instruments can operate in a mode where each individual photon's arrival time is recorded and saved. Though this TIME-TAG ability is typically utilized to provide second-by-second calibrations to the final spectral data, this mode can also be exploited to re-examine the data in the time domain, turning spectra into lightcurves. With the appropriate knowledge and software, the time-resolved spectra can instead be extracted into photometric lightcurves with high temporal and spectral resolution. We propose here to expand our current software tool into a community-ready pipeline and to deliver a collection of high-level science lightcurves for the entire COS and STIS TIME-TAG archives. By providing this tool and data archive to the community we will lower the barrier to time domain research with these two instruments. This will demonstrate to the community not only the potential contained in re-analysis of existing datasets, but also the exquisite time-series capabilities of the instruments available for future cycles. The enabling and demonstration of this so far underutilized technique should be done now. At a time when HST and its UV capabilities are nearing their end, it's vital that all possible avenues for exploration are made readily available to the scientific community.
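    The core of the TIME-TAG idea can be shown with a toy example that bins individual photon arrival times into a lightcurve; real COS/STIS corrtag processing (calibration, dead-time correction, wavelength filtering) is far more involved, and the arrival times below are simulated.

        # Bin photon arrival times into a count-rate lightcurve (toy example).
        import numpy as np

        def make_lightcurve(photon_times_s, bin_width_s=1.0):
            """Return bin centers and count rates from a list of arrival times."""
            t0, t1 = photon_times_s.min(), photon_times_s.max()
            edges = np.arange(t0, t1 + bin_width_s, bin_width_s)
            counts, edges = np.histogram(photon_times_s, bins=edges)
            centers = 0.5 * (edges[:-1] + edges[1:])
            return centers, counts / bin_width_s   # counts per second

        # Simulated exposure: 10,000 photons over a 900 s exposure.
        times = np.sort(np.random.uniform(0.0, 900.0, 10_000))
        centers, rate = make_lightcurve(times, bin_width_s=5.0)
        print(centers[:3], rate[:3])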

  1. Myths and realities: Defining re-engineering for a large organization

    NASA Technical Reports Server (NTRS)

    Yin, Sandra; Mccreary, Julia

    1992-01-01

    This paper describes the background and results of three studies concerning software reverse engineering, re-engineering, and reuse (R3) hosted by the Internal Revenue Service in 1991 and 1992. The situation at the Internal Revenue Service--aging, piecemeal computer systems and outdated technology maintained by a large staff--is familiar to many institutions, especially among management information systems. The IRS is distinctive for the sheer magnitude and diversity of its problems; the country's tax records are processed using assembly language and COBOL and spread across tape and network DBMS files. How do we proceed with replacing legacy systems? The three software re-engineering studies examined methods and CASE tool support, and carried out a prototype project using re-engineering methods and tools. During the course of these projects, we discovered critical issues broader than the mechanical definitions of methods and tool technology.

  2. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis had been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative to assess possible environmental impacts, obtain launch approval, and for launch contingency planning. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify the additional dynamics modeling and capabilities for the analysis tool with the objectives to (1) have the capability to predict impact point and footprint, (2) increase the fidelity in the prediction of vehicle breakup, and (3) reduce the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of impact footprint. The functions to increase the fidelity in the prediction of vehicle breakup included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.
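    As a rough illustration of the 3-degrees-of-freedom capability described above, the sketch below propagates a point mass with drag to surface impact over a flat, non-rotating Earth with an exponential atmosphere; the physics, integrator, and all parameter values are simplified placeholders rather than the dissertation's models.

        # Simplified 3-DOF (point-mass) propagation to ground impact.
        import numpy as np

        G = 9.80665                      # m/s^2
        RHO0, H_SCALE = 1.225, 7200.0    # sea-level density (kg/m^3), scale height (m)

        def propagate_to_impact(pos, vel, mass=50.0, cd=1.0, area=0.05, dt=0.05):
            """Euler-integrate until altitude reaches zero; return impact (x, y) in m."""
            pos, vel = np.array(pos, float), np.array(vel, float)
            while pos[2] > 0.0:
                rho = RHO0 * np.exp(-max(pos[2], 0.0) / H_SCALE)
                drag = -0.5 * rho * cd * area * np.linalg.norm(vel) * vel / mass
                acc = drag + np.array([0.0, 0.0, -G])
                vel += acc * dt
                pos += vel * dt
            return pos[:2]

        impact_xy = propagate_to_impact(pos=[0.0, 0.0, 30_000.0],
                                        vel=[800.0, 0.0, -200.0])
        print("Impact point (m downrange, crossrange):", impact_xy)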

  3. The design and performance of a scintillating-fibre tracker for the cosmic-ray muon tomography of legacy nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnstone, J. R.; Kaiser, R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-05-01

    Tomographic imaging techniques using the Coulomb scattering of cosmic-ray muons are increasingly being exploited for the non-destructive assay of shielded containers in a wide range of applications. One such application is the characterisation of legacy nuclear waste materials stored within industrial containers. The design, assembly and performance of a prototype muon tomography system developed for this purpose are detailed in this work. This muon tracker comprises four detection modules, each containing orthogonal layers of Saint-Gobain BCF-10 2 mm-pitch plastic scintillating fibres. Identification of the two struck fibres per module allows the reconstruction of a space point, and subsequently, the incoming and Coulomb-scattered muon trajectories. These allow the container content, with respect to the atomic number Z of the scattering material, to be determined through reconstruction of the scattering location and magnitude. On each detection layer, the light emitted by the fibre is detected by a single Hamamatsu H8500 MAPMT with two fibres coupled to each pixel via dedicated pairing schemes developed to ensure the identification of the struck fibre. The PMT signals are read out to standard charge-to-digital converters and interpreted via custom data acquisition and analysis software. The design and assembly of the detector system are detailed and presented alongside results from performance studies with data collected after construction. These results reveal high stability during extended collection periods with detection efficiencies in the region of 80% per layer. Minor misalignments of millimetre order have been identified and corrected in software. A first image reconstructed from a test configuration of materials has been obtained using software based on the Maximum Likelihood Expectation Maximisation algorithm. The results highlight the high spatial resolution provided by the detector system. Clear discrimination between the low, medium and high-Z materials assayed is also observed.
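    The central reconstructed quantity in such a tracker is the angle between the incoming and Coulomb-scattered muon trajectories, each defined by two space points from the orthogonal fibre layers; the sketch below computes that angle for toy geometry and is only a conceptual illustration, not the detector's reconstruction software.

        # Scattering angle between incoming and outgoing muon tracks (toy geometry).
        import numpy as np

        def track_direction(p1, p2):
            d = np.asarray(p2, float) - np.asarray(p1, float)
            return d / np.linalg.norm(d)

        def scattering_angle_mrad(incoming_pts, outgoing_pts):
            """Angle (milliradians) between the incoming and outgoing tracks."""
            d_in = track_direction(*incoming_pts)
            d_out = track_direction(*outgoing_pts)
            cosang = np.clip(np.dot(d_in, d_out), -1.0, 1.0)
            return 1e3 * np.arccos(cosang)

        # Space points (x, y, z in mm) from the upper and lower detector module pairs.
        theta = scattering_angle_mrad(((0, 0, 0), (2, 1, 500)),
                                      ((2, 1, 500), (7, 3, 1000)))
        print(f"scattering angle: {theta:.1f} mrad")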

  4. SparkMed: a framework for dynamic integration of multimedia medical data into distributed m-Health systems.

    PubMed

    Constantinescu, Liviu; Kim, Jinman; Feng, David Dagan

    2012-01-01

    With the advent of 4G and other long-term evolution (LTE) wireless networks, the traditional boundaries of patient record propagation are diminishing as networking technologies extend the reach of hospital infrastructure and provide on-demand mobile access to medical multimedia data. However, due to legacy and proprietary software, storage and decommissioning costs, and the price of centralization and redevelopment, it remains complex, expensive, and often unfeasible for hospitals to deploy their infrastructure for online and mobile use. This paper proposes the SparkMed data integration framework for mobile healthcare (m-Health), which significantly benefits from the enhanced network capabilities of LTE wireless technologies, by enabling a wide range of heterogeneous medical software and database systems (such as the picture archiving and communication systems, hospital information system, and reporting systems) to be dynamically integrated into a cloud-like peer-to-peer multimedia data store. Our framework allows medical data applications to share data with mobile hosts over a wireless network (such as WiFi and 3G), by binding to existing software systems and deploying them as m-Health applications. SparkMed integrates techniques from multimedia streaming, rich Internet applications (RIA), and remote procedure call (RPC) frameworks to construct a Self-managing, Pervasive Automated netwoRK for Medical Enterprise Data (SparkMed). Further, it is resilient to failure, and able to use mobile and handheld devices to maintain its network, even in the absence of dedicated server devices. We have developed a prototype of the SparkMed framework for evaluation on a radiological workflow simulation, which uses SparkMed to deploy a radiological image viewer as an m-Health application for telemedical use by radiologists and stakeholders. We have evaluated our prototype using ten devices over WiFi and 3G, verifying that our framework meets its two main objectives: 1) interactive delivery of medical multimedia data to mobile devices; and 2) attaching to non-networked medical software processes without significantly impacting their performance. Consistent response times of under 500 ms and graphical frame rates of over 5 frames per second were observed under intended usage conditions. Further, overhead measurements displayed linear scalability and low resource requirements.

  5. ESO Reflex: a graphical workflow engine for data reduction

    NASA Astrophysics Data System (ADS)

    Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo

    ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access Virtual Observatory web services have been successfully performed. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.
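    For readers unfamiliar with how CPL recipes are driven outside a workflow engine, the sketch below writes a set-of-frames (SOF) file and invokes a recipe through the standard esorex command-line launcher; the recipe name ("uves_cal_mbias"), frame tag ("BIAS_BLUE"), and file paths are illustrative assumptions and are not taken from the paper.

        # Run a CPL-based recipe via esorex (assumes esorex is installed).
        import subprocess
        from pathlib import Path

        def run_recipe(recipe: str, frames: dict, workdir: str = ".") -> None:
            """Write a set-of-frames (SOF) file and invoke the recipe via esorex."""
            sof = Path(workdir) / f"{recipe}.sof"
            sof.write_text("\n".join(f"{path} {tag}" for path, tag in frames.items()) + "\n")
            subprocess.run(["esorex", recipe, str(sof.resolve())], check=True, cwd=workdir)

        run_recipe("uves_cal_mbias",
                   {"raw/bias_01.fits": "BIAS_BLUE", "raw/bias_02.fits": "BIAS_BLUE"})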

  6. Managing Heterogeneous Information Systems through Discovery and Retrieval of Generic Concepts.

    ERIC Educational Resources Information Center

    Srinivasan, Uma; Ngu, Anne H. H.; Gedeon, Tom

    2000-01-01

    Introduces a conceptual integration approach to heterogeneous databases or information systems that exploits the similarity in metalevel information and performs metadata mining on database objects to discover a set of concepts that serve as a domain abstraction and provide a conceptual layer above existing legacy systems. Presents results of…

  7. What is the Best Use(s) and Mission(s) of the C-27J?

    DTIC Science & Technology

    2010-06-01

    cargo compartment at a position most advantageous to the aircraft’s weight and balance – unlike limitations which exist in the legacy C-130 Hercules...ENS/GMO/04-01. School of Engineering and Management, Air Force Institute of Technology (AU), Wright-Patterson Air Force Base OH, May 2004

  8. Organizational Approaches to Managing Tacit Knowledge Loss of Legacy System Information Technology Professionals

    ERIC Educational Resources Information Center

    Bitner, Michael

    2012-01-01

    Information Technology (IT) employment shortages may be related to employee retirement or normal attrition. Within IT job shortages, tacit knowledge is lost when employees retire or leave organizations. Tacit knowledge is unwritten or unspoken knowledge that is not easily articulated, and exists only with the individuals who obtain the knowledge…

  9. How Parental and School Responses to Choice Policies Reconfigure a Rural Education Market in Victoria, Australia

    ERIC Educational Resources Information Center

    Morgan, Rosemary; Blackmore, Jill

    2013-01-01

    Market principles now dominate the education and social policies of many Anglophone countries, including Australia, but articulate differentially within specific contexts. Existing historical legacies, local economic and social conditions, and geographical settings interact with federal and state funding and transport policies to shape the nature…

  10. National Vocational Qualifications in the United Kingdom: Their Origins and Legacy

    ERIC Educational Resources Information Center

    Young, Michael

    2011-01-01

    National Vocational Qualifications (NVQs) were launched in the United Kingdom (excluding Scotland) in 1987 as a framework for rationalising what was described at the time as the "jungle" of existing vocational qualifications. They were never intended to be the basis for a comprehensive NQF for all qualifications; however, successive…

  11. ODP Legacy

    Science.gov Websites

    ODP Legacy website sections: Overview, Program Administration, Scientific Results, Engineering and Science Operations, Samples & Databases, and Outreach.

  12. The HEASARC in 2016: 25 Years and Counting

    NASA Astrophysics Data System (ADS)

    Drake, Stephen Alan; Smale, Alan P.

    2016-04-01

    The High Energy Astrophysics Archival Research Center or HEASARC (http://heasarc.gsfc.nasa.gov/) has been the NASA astrophysics discipline archive supporting multi-mission cosmic X-ray and gamma-ray astronomy research for 25 years, and, through its LAMBDA (Legacy Archive for Microwave Background Data Analysis: http://lambda.gsfc.nasa.gov/) component, the archive for cosmic microwave background data for the last 8 years. The HEASARC is the designated archive which supports NASA's Physics of the Cosmos theme (http://pcos.gsfc.nasa.gov/). The HEASARC provides a unified archive and software structure aimed at 'legacy' high-energy missions such as Einstein, EXOSAT, ROSAT, RXTE, and Suzaku, contemporary missions such as Fermi, Swift, XMM-Newton, Chandra, NuSTAR, etc., and upcoming missions, such as Astro-H and NICER. The HEASARC's high-energy astronomy archive has grown so that it presently contains more than 80 terabytes (TB) of data from 30 past and present orbital missions. The user community downloaded 160 TB of high-energy data from the HEASARC last year, i.e., an amount equivalent to twice the size of the archive. We discuss some of the upcoming new initiatives and developments for the HEASARC, including the arrival of public data from the JAXA/NASA Astro-H mission, expected to have been launched in February 2016, and the NASA mission of opportunity Neutron Star Interior Composition Explorer (NICER), expected to be deployed in late summer 2016. We also highlight some of the new software and web initiatives of the HEASARC, and discuss our plans for the next 3 years.

  13. Exploring the factors influencing the cloud computing adoption: a systematic study on cloud migration.

    PubMed

    Rai, Rashmi; Sahoo, Gadadhar; Mehfuz, Shabana

    2015-01-01

    Today, most organizations rely on their age-old legacy applications to support their business-critical systems. However, there are several critical concerns, such as maintainability and scalability issues, associated with legacy systems. Against this background, cloud services offer a more agile and cost-effective platform to support business applications and IT infrastructure. The adoption of cloud services has been increasing recently, and so has academic research in cloud migration; however, there is a genuine need for a secondary study to further strengthen this research. The primary objective of this paper is to scientifically and systematically identify, categorize, and compare the existing research work in the area of legacy-to-cloud migration. The paper also endeavors to consolidate the research on security issues, a prime factor hindering cloud adoption, by classifying the studies on secure cloud migration. A systematic literature review (SLR) of thirty selected papers published from 2009 to 2014 was conducted to properly understand the nuances of the security framework. To categorize the selected studies, the authors propose a conceptual model for cloud migration, which has resulted in a resource base of existing solutions for cloud migration. This study concludes that cloud migration research is at a seminal stage but is also evolving and maturing, with increasing participation from academics and industry alike. The paper also identifies the need for a secure migration model that can fortify organizations' trust in cloud migration and facilitate the necessary tool support to automate the migration process.

  14. Challenges to EHR implementation in electronic- versus paper-based office practices.

    PubMed

    Zandieh, Stephanie O; Yoon-Flannery, Kahyun; Kuperman, Gilad J; Langsam, Daniel J; Hyman, Daniel; Kaushal, Rainu

    2008-06-01

    Challenges in implementing electronic health records (EHRs) have received some attention, but less is known about the process of transitioning from legacy EHRs to newer systems. To determine how ambulatory leaders differentiate implementation approaches between practices that are currently paper-based and those with a legacy EHR system (EHR-based). Qualitative study. Eleven practice managers and 12 medical directors all part of an academic ambulatory care network of a large teaching hospital in New York City in January to May of 2006. Qualitative approach comparing and contrasting perceived benefits and challenges in implementing an ambulatory EHR between practice leaders from paper- and EHR-based practices. Content analysis was performed using grounded theory and ATLAS.ti 5.0. We found that paper-based leaders prioritized the following: sufficient workstations and printers, a physician information technology (IT) champion at the practice, workflow education to ensure a successful transition to a paperless medical practice, and a high existing comfort level of practitioners and support staff with IT. In contrast, EHR-based leaders prioritized: improved technical training and ongoing technical support, sufficient protection of patient privacy, and open recognition of physician resistance, especially for those who were loyal to a legacy EHR. Unlike paper-based practices, EHR-based leadership believed that comfort level with IT and adjustments to workflow changes would not be difficult challenges to overcome. Leadership at paper- and EHR-based practices in 1 academic network has different priorities for implementing a new EHR. Ambulatory practices upgrading their legacy EHR have unique challenges.

  15. Two-step web-mining approach to study geology/geophysics-related open-source software projects

    NASA Astrophysics Data System (ADS)

    Behrends, Knut; Conze, Ronald

    2013-04-01

    Geology/geophysics is a highly interdisciplinary science, overlapping with, for instance, physics, biology and chemistry. In today's software-intensive work environments, geoscientists often encounter new open-source software from scientific fields that are only remotely related to their own field of expertise. We show how web-mining techniques can help to carry out systematic discovery and evaluation of such software. In a first step, we downloaded ~500 abstracts (each consisting of ~1 kb UTF-8 text) from agu-fm12.abstractcentral.com. This web site hosts the abstracts of all publications presented at AGU Fall Meeting 2012, the world's largest annual geology/geophysics conference. All abstracts belonged to the category "Earth and Space Science Informatics", an interdisciplinary label cross-cutting many disciplines such as "deep biosphere", "atmospheric research", and "mineral physics". Each publication was represented by a highly structured record with ~20 short data attributes, the largest of these being the unstructured "abstract" field. We processed texts of the abstracts with the statistics software "R" to build a corpus and a term-document matrix. Using R package "tm", we applied text-mining techniques to filter data and develop hypotheses about software-development activities happening in various geology/geophysics fields. By analyzing the term-document matrix with basic techniques (e.g., word frequencies, co-occurrences, weighting) as well as more complex methods (clustering, classification), we extracted several key pieces of information. For example, text-mining can be used to identify scientists who are also developers of open-source scientific software, and the names of their programming projects and codes can also be identified. In a second step, based on the intermediate results found by processing the conference abstracts, any new hypotheses can be tested in another web-mining subproject: by merging the dataset with open data from github.com and stackoverflow.com. These popular, developer-centric websites have powerful application programming interfaces and follow an open-data policy. In this regard, these sites offer a web-accessible reservoir of information that can be tapped to study questions such as: Which open-source software projects are eminent in the various geoscience fields? What are the most popular programming languages? How are they trending? Are there any interesting temporal patterns in committer activities? How large are programming teams and how do they change over time? What free software packages exist in the vast realms of related fields? Does the software from these fields have capabilities that might still be useful to me as a researcher, or that can help me perform my work better? Are there any open-source projects that might be commercially interesting? This evaluation strategy reveals programming projects that tend to be new. As many important legacy codes are not hosted on open-source code repositories, the presented search method might overlook some older projects.
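    An analogous minimal sketch of the first step (corpus to term-document matrix to term frequencies) is shown below in Python with scikit-learn; the authors used R's tm package, and the two example "abstracts" are invented placeholders.

        # Build a term-document matrix and rank terms by total frequency.
        from sklearn.feature_extraction.text import CountVectorizer

        abstracts = [
            "Open-source Python software for borehole data management and visualisation",
            "A web service mining GitHub repositories for geophysics software projects",
        ]

        vectorizer = CountVectorizer(stop_words="english")
        tdm = vectorizer.fit_transform(abstracts)   # documents x terms (sparse)

        freqs = zip(vectorizer.get_feature_names_out(), tdm.sum(axis=0).A1)
        for term, count in sorted(freqs, key=lambda kv: -kv[1])[:5]:
            print(term, count)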

  16. BGen: A UML Behavior Network Generator Tool

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Reder, Leonard J.; Balian, Harry

    2010-01-01

    BGen software was designed for autogeneration of code based on a graphical representation of a behavior network used for controlling automatic vehicles. A common format used for describing a behavior network, such as that used in the JPL-developed behavior-based control system, CARACaS ["Control Architecture for Robotic Agent Command and Sensing" (NPO-43635), NASA Tech Briefs, Vol. 32, No. 10 (October 2008), page 40] includes a graph with sensory inputs flowing through the behaviors in order to generate the signals for the actuators that drive and steer the vehicle. A computer program to translate Unified Modeling Language (UML) Freeform Implementation Diagrams into a legacy C implementation of Behavior Network has been developed in order to simplify the development of C-code for behavior-based control systems. UML is a popular standard developed by the Object Management Group (OMG) to model software architectures graphically. The C implementation of a Behavior Network is functioning as a decision tree.

  17. Criminal history systems: new technology and new directions

    NASA Astrophysics Data System (ADS)

    Threatte, James

    1997-02-01

    Many forces are driving states to improve their current Criminal History and On-Line Criminal Justice Information Systems. The predominant factors compelling this movement are (1) the deterioration and cost of supporting older legacy systems, (2) current-generation, high-performance, low-cost hardware and system software, and (3) funding programs, such as the National Criminal History Improvement Program, which are targeted specifically at improving these important systems. In early 1996, SAIC established an Internal Research and Development project devoted to Computerized Criminal History Systems (CCH). This project began with an assessment of current hardware, operating system, and relational database technology. Application software design and development approaches were then reviewed with a focus on object-oriented approaches, three-tier client-server architectures, and tools that enable the 'right sizing' of systems. An operational prototype of a State CCH system was established based on the results of these investigations.

  18. Observing with Sibling and Twin Telescopes

    NASA Astrophysics Data System (ADS)

    Plank, Lucia; Lovell, Jim; McCallum, Jamie; Mayer, David

    2016-12-01

    With the transition to VGOS, co-located radio telescopes will be common at many sites. This can take the form of a sibling telescope, when a VGOS antenna is built next to a legacy one, or of a twin telescope with two identical VGOS antennas. The co-location of two antennas offers new possibilities in both operation and analysis. The immediate question for observing with sibling/twin telescopes is the applied observing strategy and its realization in the scheduling software. In this contribution we report on our efforts to implement new scheduling modes for sibling and twin telescopes in the Vienna VLBI Software. For the example of the sibling telescope in Hobart, several types of sessions will be discussed: an improved tag-along mode for the 26-m antenna (Ho), a proper implementation of the twin-mode using the antenna with the shorter slewing time, and an astrometric support mode enabling the observation of weak sources with the AuScope array.

  19. Fast Bayesian Inference of Copy Number Variants using Hidden Markov Models with Wavelet Compression

    PubMed Central

    Wiedenhoeft, John; Brugel, Eric; Schliep, Alexander

    2016-01-01

    By integrating Haar wavelets with Hidden Markov Models, we achieve drastically reduced running times for Bayesian inference using Forward-Backward Gibbs sampling. We show that this improves detection of genomic copy number variants (CNV) in array CGH experiments compared to the state-of-the-art, including standard Gibbs sampling. The method concentrates computational effort on chromosomal segments which are difficult to call, by dynamically and adaptively recomputing consecutive blocks of observations likely to share a copy number. This makes routine diagnostic use and re-analysis of legacy data collections feasible; to this end, we also propose an effective automatic prior. An open source software implementation of our method is available at http://schlieplab.org/Software/HaMMLET/ (DOI: 10.5281/zenodo.46262). This paper was selected for oral presentation at RECOMB 2016, and an abstract is published in the conference proceedings. PMID:27177143
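    The compression idea can be illustrated with a single Haar level: pairwise averages summarize the signal while the detail coefficients flag where it actually changes, so long runs of near-zero detail can be treated as one block. The sketch below is conceptual only and is not the HaMMLET implementation linked above.

        # One Haar level: per-pair averages and differences (length must be even).
        import numpy as np

        def haar_level(signal):
            x = np.asarray(signal, float).reshape(-1, 2)
            averages = x.mean(axis=1)
            details = (x[:, 0] - x[:, 1]) / 2.0
            return averages, details

        # Piecewise-constant toy "copy number" signal with noise.
        rng = np.random.default_rng(0)
        signal = np.concatenate([np.full(64, 2.0), np.full(64, 3.0)]) \
                 + rng.normal(0, 0.1, 128)
        avg, det = haar_level(signal)
        print("fraction of near-zero detail coefficients:",
              float(np.mean(np.abs(det) < 0.2)))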

  20. The biological evolution of guilt, shame and anxiety: A new theory of negative legacy emotions.

    PubMed

    Breggin, Peter R

    2015-07-01

    Human beings are the most social and the most violent creatures on Earth. The combination of cooperation and aggression enabled us to dominate our ecosystem. However, the existence of violent impulses would have made it difficult or impossible for humans to live in close-knit families and clans without destroying each other. Nature's answer was the development of guilt, shame and anxiety-internal emotional inhibitions or restraints specifically against aggressive self-assertion within the family and other close relationships. The theory of negative legacy emotions proposes the first unitary concept for the biopsychosocial function of guilt, shame and anxiety, and seeks their origin in biological evolution and natural selection. Natural selection favored individuals with built-in emotional restraints that reduced conflicts within their family and tribal unit, optimizing their capacity to survive and reproduce within the protection of their small, intimate societies, while maintaining their capacity for violence against outsiders. Unfortunately, these negative legacy emotions are rudimentary and often ineffective in their psychosocial and developmental function. As a result, they produce many unintended untoward effects, including the frequent breakdown of restraints in the family and the uninhibited unleashing of violence against outsiders. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. An autonomous sensor module based on a legacy CCTV camera

    NASA Astrophysics Data System (ADS)

    Kent, P. J.; Faulkner, D. A. A.; Marshall, G. F.

    2016-10-01

    A UK MoD-funded programme into autonomous sensor arrays (SAPIENT) has been developing new, highly capable sensor modules together with a scalable modular architecture for control and communication. As part of this system there is a desire to also utilise existing legacy sensors. The paper reports upon the development of a SAPIENT-compliant sensor module using a legacy Closed-Circuit Television (CCTV) pan-tilt-zoom (PTZ) camera. The PTZ camera sensor provides three modes of operation. In the first mode, the camera is automatically slewed to acquire imagery of a specified scene area, e.g. to provide "eyes-on" confirmation for a human operator or for forensic purposes. In the second mode, the camera is directed to monitor an area of interest, with zoom level automatically optimized for human detection at the appropriate range. Open-source algorithms (using OpenCV) are used to automatically detect pedestrians; their real-world positions are estimated and communicated back to the SAPIENT central fusion system. In the third mode of operation, a "follow" mode is implemented in which the camera maintains the detected person within the camera field-of-view without requiring an end-user to directly control the camera with a joystick.
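    A sketch of the open-source pedestrian-detection step is given below using OpenCV's built-in HOG plus linear-SVM people detector; the abstract does not state which detector or parameters the SAPIENT module actually uses, so the choices and the image path here are assumptions.

        # Detect pedestrians in a single frame with OpenCV's default HOG detector.
        import cv2

        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

        def detect_people(frame):
            """Return pedestrian bounding boxes (x, y, w, h) in image coordinates."""
            rects, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
            return [tuple(r) for r in rects]

        frame = cv2.imread("cctv_frame.jpg")   # placeholder image path
        if frame is not None:
            for (x, y, w, h) in detect_people(frame):
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cv2.imwrite("cctv_frame_detections.jpg", frame)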

  2. Empowering Our Students to Make a More Just World

    ERIC Educational Resources Information Center

    Dorfman, Shari; Rosenberg, Ruth

    2013-01-01

    Fifth-grade teachers Shari Dorfman and Ruth Rosenberg strive to help their students see the possibilities that exist within themselves, so that their students can begin to envision their own future. To this end, Dorfman and Rosenberg choose to celebrate the legacy of Reverend Dr. Martin Luther King, Jr. by sharing the stories of lesser-known…

  3. Proof of concept for the use of macroinvertebrates as indicators of polychlorinated biphenyls (PCB) contamination in Lake Hartwell

    Treesearch

    James M. Lazorchak; Michael B. Griffith; Marc Mills; Joseph Schubauer-Berigan; Frank McCormick; Richard Brenner; Craig Zeller

    2015-01-01

    The US Environmental Protection Agency (USEPA) develops methods and tools for evaluating risk management strategies for sediments contaminated with polychlorinated biphenyls (PCBs), polycyclic aromatic hydrocarbons (PAHs), and other legacy pollutants. Monitored natural recovery is a risk management alternative that relies on existing physical, chemical, and biological...

  4. Savannah, Georgia: The Lasting Legacy of Colonial City Planning. Teaching with Historic Places.

    ERIC Educational Resources Information Center

    Kratzer, Judson

    Strolling through the old city of Savannah, Georgia's rigid, grid pattern streets, down its linear brick walkways, past over 1,100 residential and public buildings of unparalleled architectural richness and diversity, visitors and residents come to appreciate the original plan that has existed intact since Savannah's founding in 1733. Twenty-four…

  5. AVE-SESAME program for the REEDA System

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.

    1981-01-01

    The REEDA system software was modified and improved to process the AVE-SESAME severe storm data. A random access file system for the AVE storm data was designed, tested, and implemented. The AVE/SESAME software was modified to incorporate the random access file input and to interface with new graphics hardware/software now available on the REEDA system. Software was developed to graphically display the AVE/SESAME data in the convention normally used by severe storm researchers. Software was converted to the AVE/SESAME software systems and interfaced with existing graphics hardware/software available on the REEDA System. Software documentation was provided for existing AVE/SESAME programs, outlining functional flow charts and interactive questions. All AVE/SESAME data sets in random access format were processed to allow developed software to access the entire AVE/SESAME data base. The existing software was modified to allow for processing of different AVE/SESAME data set types, including satellite, surface, and radar data.

  6. Validation of Patient-Reported Outcomes Measurement Information System (PROMIS) computerized adaptive tests in cervical spine surgery.

    PubMed

    Boody, Barrett S; Bhatt, Surabhi; Mazmudar, Aditya S; Hsu, Wellington K; Rothrock, Nan E; Patel, Alpesh A

    2018-03-01

    OBJECTIVE The Patient-Reported Outcomes Measurement Information System (PROMIS), which is funded by the National Institutes of Health, is a set of adaptive, responsive assessment tools that measures patient-reported health status. PROMIS measures have not been validated for surgical patients with cervical spine disorders. The objective of this project is to evaluate the validity (e.g., convergent validity, known-groups validity, responsiveness to change) of PROMIS computer adaptive tests (CATs) for pain behavior, pain interference, and physical function in patients undergoing cervical spine surgery. METHODS The legacy outcome measures Neck Disability Index (NDI) and SF-12 were used as comparisons with PROMIS measures. PROMIS CATs, NDI-10, and SF-12 measures were administered prospectively to 59 consecutive tertiary hospital patients who were treated surgically for degenerative cervical spine disorders. A subscore of NDI-5 was calculated from NDI-10 by eliminating the lifting, headaches, pain intensity, reading, and driving sections and multiplying the final score by 4. Assessments were administered preoperatively (baseline) and postoperatively at 6 weeks and 3 months. Patients presenting for revision surgery, tumor, infection, or trauma were excluded. Participants completed the measures in Assessment Center, an online data collection tool accessed by using a secure login and password on a tablet computer. Subgroup analysis was also performed based on a primary diagnosis of either cervical radiculopathy or cervical myelopathy. RESULTS Convergent validity for PROMIS CATs was supported with multiple statistically significant correlations with the existing legacy measures, NDI and SF-12, at baseline. Furthermore, PROMIS CATs demonstrated known-group validity and identified clinically significant improvements in all measures after surgical intervention. In the cervical radiculopathy and myelopathic cohorts, the PROMIS measures demonstrated similar responsiveness to the SF-12 and NDI scores in the patients who self-identified as having postoperative clinical improvement. PROMIS CATs required a mean total of 3.2 minutes for PROMIS pain behavior (mean ± SD 0.9 ± 0.5 minutes), pain interference (1.2 ± 1.9 minutes), and physical function (1.1 ± 1.4 minutes) and compared favorably with 3.4 minutes for NDI and 4.1 minutes for SF-12. CONCLUSIONS This study verifies that PROMIS CATs demonstrate convergent and known-groups validity and comparable responsiveness to change as existing legacy measures. The PROMIS measures required less time for completion than legacy measures. The validity and efficiency of the PROMIS measures in surgical patients with cervical spine disorders suggest an improvement over legacy measures and an opportunity for incorporation into clinical practice.

  7. An object-oriented framework for medical image registration, fusion, and visualization.

    PubMed

    Zhu, Yang-Ming; Cochoff, Steven M

    2006-06-01

    An object-oriented framework for image registration, fusion, and visualization was developed based on the classic model-view-controller paradigm. The framework employs many design patterns to facilitate legacy code reuse, manage software complexity, and enhance the maintainability and portability of the framework. Three sample applications built atop this framework are illustrated to show the effectiveness of this framework: the first one is for volume image grouping and re-sampling, the second one is for 2D registration and fusion, and the last one is for visualization of single images as well as registered volume images.

  8. Conversion of NIMROD simulation results for graphical analysis using VisIt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romero-Talamas, C A

    Software routines developed to prepare NIMROD [C. R. Sovinec et al., J. Comp. Phys. 195, 355 (2004)] results for three-dimensional visualization from simulations of the Sustained Spheromak Physics Experiment (SSPX) [E. B. Hooper et al., Nucl. Fusion 39, 863 (1999)] are presented here. The visualization is done by first converting the NIMROD output to a format known as legacy VTK and then loading it to VisIt, a graphical analysis tool that includes three-dimensional rendering and various mathematical operations for large data sets. Sample images obtained from the processing of NIMROD data with VisIt are included.
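
    A minimal sketch of producing a legacy (ASCII) VTK file that VisIt can load directly, assuming a regular grid and a single scalar field; the field name, spacing, and layout are placeholders rather than the actual NIMROD conversion routines:

      import numpy as np

      def write_legacy_vtk(filename, field, spacing=(1.0, 1.0, 1.0), name="pressure"):
          nx, ny, nz = field.shape
          with open(filename, "w") as f:
              f.write("# vtk DataFile Version 3.0\n")
              f.write("sample field converted for VisIt\n")
              f.write("ASCII\nDATASET STRUCTURED_POINTS\n")
              f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
              f.write("ORIGIN 0.0 0.0 0.0\n")
              f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
              f.write(f"POINT_DATA {nx * ny * nz}\n")
              f.write(f"SCALARS {name} float 1\nLOOKUP_TABLE default\n")
              # Legacy VTK expects the x index to vary fastest, then y, then z.
              for value in field.transpose(2, 1, 0).ravel():
                  f.write(f"{value:.6e}\n")

      write_legacy_vtk("sample.vtk", np.random.rand(16, 16, 16))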

  9. Assessing Resistance to Change During Shifting from Legacy to Open Web-Based Systems in the Air Transport Industry

    NASA Astrophysics Data System (ADS)

    Brewer, Denise

    The air transport industry (ATI) is a dynamic, communal, international, and intercultural environment in which the daily operations of airlines, airports, and service providers are dependent on information technology (IT). Many of the IT legacy systems are more than 30 years old, and current regulations and the globally distributed workplace have brought profound changes to the way the ATI community interacts. The purpose of the study was to identify the areas of resistance to change in the ATI community and the corresponding factors in change management requirements that minimize product development delays and lead to a successful and timely shift from legacy to open web-based systems in upgrading ATI operations. The research questions centered on product development team processes as well as the members' perceived need for acceptance of change. A qualitative case study approach rooted in complexity theory was employed using a single case of an intercultural product development team dispersed globally. Qualitative data gathered from questionnaires were organized using Nvivo software, which coded the words and themes. Once coded, themes emerged identifying the areas of resistance within the product development team. Results of follow-up interviews with team members suggest that intercultural relationship building prior to and during project execution; focus on common team goals; and development of relationships to enhance interpersonal respect, understanding and overall communication help overcome resistance to change. Positive social change in the form of intercultural group effectiveness evidenced in increased team functioning during major project transitions is likely to result when global managers devote time to cultural understanding.

  10. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; di Francesco, J.; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or it can cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  11. A new Scheme for ATLAS Trigger Simulation using Legacy Code

    NASA Astrophysics Data System (ADS)

    Galster, Gorm; Stelzer, Joerg; Wiedenmann, Werner

    2014-06-01

    Analyses at the LHC which search for rare physics processes or determine with high precision Standard Model parameters require accurate simulations of the detector response and the event selection processes. The accurate determination of the trigger response is crucial for the determination of overall selection efficiencies and signal sensitivities. For the generation and the reconstruction of simulated event data, the most recent software releases are usually used to ensure the best agreement between simulated data and real data. For the simulation of the trigger selection process, however, ideally the same software release that was deployed when the real data were taken should be used. This potentially requires running software dating many years back. Having a strategy for running old software in a modern environment thus becomes essential when data simulated for past years start to represent a sizable fraction of the total. We examined the requirements and possibilities for such a simulation scheme within the ATLAS software framework and successfully implemented a proof-of-concept simulation chain. One of the greatest challenges was the choice of a data format which promises long term compatibility with old and new software releases. Over the time periods envisaged, data format incompatibilities are also likely to emerge in databases and other external support services. Software availability may become an issue when, for example, support for the underlying operating system stops. In this paper we present the encountered problems and developed solutions, and discuss proposals for future development. Some ideas reach beyond the retrospective trigger simulation scheme in ATLAS as they also touch on more general aspects of data preservation.

  12. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, E.J.; McNeilly, G.S.

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg model uses legacy NCAR (including even emulated CALCOMP subroutines) graphics to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  13. NASA Tech Briefs, September 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics include: Oxygen-Partial-Pressure Sensor for Aircraft Oxygen Mask; Three-Dimensional Venturi Sensor for Measuring Extreme Winds; Swarms of Micron-Sized Sensors; Monitoring Volcanoes by Use of Air-Dropped Sensor Packages; Capacitive Sensors for Measuring Masses of Cryogenic Fluids; UHF Microstrip Antenna Array for Synthetic- Aperture Radar; Multimode Broad-Band Patch Antennas; 164-GHz MMIC HEMT Frequency Doubler; GPS Position and Heading Circuitry for Ships; Software for Managing Parametric Studies; Software Aids Visualization of Computed Unsteady Flow; Software for Testing Electroactive Structural Components; Advanced Software for Analysis of High-Speed Rolling-Element Bearings; Web Program for Development of GUIs for Cluster Computers; XML-Based Generator of C++ Code for Integration With GUIs; Oxide Protective Coats for Ir/Re Rocket Combustion Chambers; Simplified Waterproofing of Aerogels; Improved Thermal-Insulation Systems for Low Temperatures; Device for Automated Cutting and Transfer of Plant Shoots; Extension of Liouville Formalism to Postinstability Dynamics; Advances in Thrust-Based Emergency Control of an Airplane; Ultrasonic/Sonic Mechanisms for Drilling and Coring; Exercise Device Would Exert Selectable Constant Resistance; Improved Apparatus for Measuring Distance Between Axles; Six Classes of Diffraction-Based Optoelectronic Instruments; Modernizing Fortran 77 Legacy Codes; Active State Model for Autonomous Systems; Shields for Enhanced Protection Against High-Speed Debris; Scaling of Two-Phase Flows to Partial-Earth Gravity; Neutral-Axis Springs for Thin-Wall Integral Boom Hinges.

  14. Using CORBA to integrate manufacturing cells to a virtual enterprise

    NASA Astrophysics Data System (ADS)

    Pancerella, Carmen M.; Whiteside, Robert A.

    1997-01-01

    It is critical in today's enterprises that manufacturing facilities are not isolated from design, planning, and other business activities and that information flows easily and bidirectionally between these activities. It is also important and cost-effective that COTS software, databases, and corporate legacy codes are well integrated in the information architecture. Further, much of the information generated during manufacturing must be dynamically accessible to engineering and business operations both in a restricted corporate intranet and on the internet. The software integration strategy in the Sandia Agile Manufacturing Testbed supports these enterprise requirements. We are developing a CORBA-based distributed object software system for manufacturing. Each physical machining device is a CORBA object and exports a common IDL interface to allow for rapid and dynamic insertion, deletion, and upgrading within the manufacturing cell. Cell management CORBA components access manufacturing devices without knowledge of any device-specific implementation. To support information flow from design to planning, data is accessible to machinists on the shop floor. CORBA allows manufacturing components to be easily accessible to the enterprise. Dynamic clients can be created using web browsers and portable Java GUIs. A CORBA-OLE adapter allows integration to PC desktop applications. Other commercial software can access CORBA network objects in the information architecture through vendor APIs.
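
    A minimal sketch of the common-interface idea described above, with a plain Python abstract base class standing in for the CORBA IDL interface that every machining device exports; the class and method names are illustrative assumptions, not the testbed's actual interface:

      from abc import ABC, abstractmethod

      class MachiningDevice(ABC):
          # Interface every cell device exports, regardless of vendor.
          @abstractmethod
          def load_program(self, program: str) -> None: ...

          @abstractmethod
          def start(self) -> None: ...

          @abstractmethod
          def status(self) -> str: ...

      class LegacyMill(MachiningDevice):
          # Wraps a vendor-specific legacy controller behind the common interface.
          def load_program(self, program: str) -> None:
              print(f"translating {program} to the legacy controller's dialect")
          def start(self) -> None:
              print("starting spindle via legacy serial protocol")
          def status(self) -> str:
              return "IDLE"

      def run_job(device: MachiningDevice, program: str) -> None:
          # Cell management code only sees the common interface.
          device.load_program(program)
          device.start()
          print(device.status())

      run_job(LegacyMill(), "bracket_rev2.nc")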

  15. Forensic Tools to Track and Connect Physical Samples to Related Data

    NASA Astrophysics Data System (ADS)

    Molineux, A.; Thompson, A. C.; Baumgardner, R. W.

    2016-12-01

    Identifiers, such as local sample numbers, are critical to successfully connecting physical samples and related data. However, identifiers must be globally unique. The International Geo Sample Number (IGSN) generated when registering the sample in the System for Earth Sample Registration (SESAR) provides a globally unique alphanumeric code associated with basic metadata, related samples and their current physical storage location. When registered samples are published, users can link the figured samples to the basic metadata held at SESAR. The use cases we discuss include plant specimens from a Permian core, Holocene corals and derived powders, and thin sections with SEM stubs. Much of this material is now published. The plant taxonomic study from the core is a digital pdf and samples can be directly linked from the captions to the SESAR record. The study of stable isotopes from the corals is not yet digitally available, but individual samples are accessible. Full data and media records for both studies are located in our database where higher quality images, field notes, and section diagrams may exist. Georeferences permit mapping in current and deep time plate configurations. Several aspects emerged during this study. First, ensure adequate and consistent details are registered with SESAR. Second, educate and encourage the researcher to obtain IGSNs. Third, publish the archive numbers, assigned prior to publication, alongside the IGSN. This provides access to further data through an Integrated Publishing Toolkit (IPT), aggregators, or online repository databases, thus placing the initial sample in a much richer context for future studies. Fourth, encourage software developers to customize community software to extract data from a database and use it to register samples in bulk. This would improve workflow and provide a path for registration of large legacy collections.

  16. Work Coordination Engine

    NASA Technical Reports Server (NTRS)

    Zendejas, Silvino; Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, Rachel; Allen, Christopher; Luong, Ivy; Chang, George; Sadaqathulla, Syed

    2009-01-01

    The Work Coordination Engine (WCE) is a Java application integrated into the Service Management Database (SMDB), which coordinates the dispatching and monitoring of a work order system. WCE de-queues work orders from SMDB and orchestrates the dispatching of work to a registered set of software worker applications distributed over a set of local, or remote, heterogeneous computing systems. WCE monitors the execution of work orders once dispatched, and accepts the results of the work order by storing to the SMDB persistent store. The software leverages the use of a relational database, Java Messaging System (JMS), and Web Services using Simple Object Access Protocol (SOAP) technologies to implement an efficient work-order dispatching mechanism capable of coordinating the work of multiple computer servers on various platforms working concurrently on different, or similar, types of data or algorithmic processing. Existing (legacy) applications can be wrapped with a proxy object so that no changes to the application are needed to make them available for integration into the work order system as "workers." WCE automatically reschedules work orders that fail to be executed by one server to a different server if available. From initiation to completion, the system manages the execution state of work orders and workers via a well-defined set of events, states, and actions. It allows for configurable work-order execution timeouts by work-order type. This innovation eliminates a current processing bottleneck by providing a highly scalable, distributed work-order system used to quickly generate products needed by the Deep Space Network (DSN) to support space flight operations. WCE is driven by asynchronous messages delivered via JMS indicating the availability of new work or workers. It runs completely unattended in support of the lights-out operations concept in the DSN.
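
    A minimal sketch of the dispatch-monitor-reschedule pattern described above; the in-memory queue, worker list, and retry limit are illustrative assumptions, whereas WCE itself persists state in SMDB and communicates via JMS and SOAP services:

      import queue
      import random

      MAX_ATTEMPTS = 3
      work_orders = queue.Queue()
      for wo in ("WO-1", "WO-2", "WO-3"):
          work_orders.put((wo, 0))

      workers = ("server-a", "server-b")

      def execute(worker, work_order):
          # Stand-in for invoking a registered (possibly proxy-wrapped legacy) worker.
          return random.random() > 0.3

      while not work_orders.empty():
          wo, attempts = work_orders.get()
          worker = random.choice(workers)
          if execute(worker, wo):
              print(f"{wo} completed on {worker}; result stored")
          elif attempts + 1 < MAX_ATTEMPTS:
              print(f"{wo} failed on {worker}; rescheduling on another server")
              work_orders.put((wo, attempts + 1))
          else:
              print(f"{wo} abandoned after {MAX_ATTEMPTS} attempts")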

  17. 31 CFR 357.22 - Transfers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND BILLS HELD IN LEGACY TREASURY DIRECT Legacy Treasury Direct Book-Entry Securities System (Legacy...-entry system, and TreasuryDirect ®. A security may also be transferred between accounts in Legacy... system. A transfer of a security from Legacy Treasury Direct to the commercial book-entry system is...

  18. Creating a career legacy map to help assure meaningful work in nursing.

    PubMed

    Hinds, Pamela S; Britton, Dorienda R; Coleman, Lael; Engh, Eileen; Humbel, Tina Kunze; Keller, Susan; Kelly, Katherine Patterson; Menard, Johanna; Lee, Marlene A; Roberts-Turner, Renee; Walczak, Dory

    2015-01-01

    When nurses declare a professional legacy (or what they intend to be better in health care because of their efforts), they are likely to maintain a focus on achieving their legacy and to experience meaning in the process. We depict the legacy and involved steps in creating a legacy map, which is a concrete guide forward to intended career outcomes. Informed by the "meaningful work" literature, we describe a legacy map, its function, the process to create one, and the application of a legacy map to guide careers. We also describe an administrative benefit of the legacy map: the map can be used by team leaders and members to secure needed resources and opportunities to support the desired legacy of team members. Legacy mapping can be a self-use career guidance tool for nurses and other health care professionals or a tool that links the career efforts of a team member with the career support efforts of a team leader. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Adaptive Suspicious Prevention for Defending DoS Attacks in SDN-Based Convergent Networks

    PubMed Central

    Dao, Nhu-Ngoc; Kim, Joongheon; Park, Minho; Cho, Sungrae

    2016-01-01

    The convergent communication network will play an important role as a single platform to unify heterogeneous networks and integrate emerging technologies and existing legacy networks. Although many feasible solutions have been proposed, they could not become convergent frameworks since they mainly focused on converting functions between various protocols and interfaces in edge networks, and handling functions for multiple services in core networks, e.g., the Multi-protocol Label Switching (MPLS) technique. Software-defined networking (SDN), on the other hand, is expected to be the ideal future for the convergent network since it can provide a controllable, dynamic, and cost-effective network. However, SDN has an original structural vulnerability behind a lot of advantages, which is the centralized control plane. As the brains of the network, a controller manages the whole network, which is attractive to attackers. In this context, we propose a novel solution called the adaptive suspicious prevention (ASP) mechanism to protect the controller from the Denial of Service (DoS) attacks that could incapacitate an SDN. The ASP is integrated with the OpenFlow protocol to detect and prevent DoS attacks effectively. Our comprehensive experimental results show that the ASP enhances the resilience of an SDN network against DoS attacks by up to 38%. PMID:27494411
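
    A minimal sketch of the general idea of shielding the controller from request floods with a per-source rate check; this is a generic threshold detector for illustration, not the ASP algorithm itself, and the window and limit values are assumptions:

      import time
      from collections import defaultdict, deque

      WINDOW_SECONDS = 1.0
      MAX_REQUESTS = 50

      recent = defaultdict(deque)   # source address -> recent packet-in timestamps
      blocked = set()

      def on_packet_in(src, now=None):
          # Called for each new-flow request that would otherwise reach the controller.
          now = time.time() if now is None else now
          if src in blocked:
              return "drop"
          q = recent[src]
          q.append(now)
          while q and now - q[0] > WINDOW_SECONDS:
              q.popleft()
          if len(q) > MAX_REQUESTS:
              blocked.add(src)      # e.g. push a drop rule towards the attacking edge
              return "block"
          return "forward"

      for i in range(200):          # a flooding source trips the limit
          on_packet_in("10.0.0.99", now=i * 0.001)
      print(on_packet_in("10.0.0.99", now=0.5), on_packet_in("10.0.0.5", now=0.5))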

  20. Adaptive Suspicious Prevention for Defending DoS Attacks in SDN-Based Convergent Networks.

    PubMed

    Dao, Nhu-Ngoc; Kim, Joongheon; Park, Minho; Cho, Sungrae

    2016-01-01

    The convergent communication network will play an important role as a single platform to unify heterogeneous networks and integrate emerging technologies and existing legacy networks. Although many feasible solutions have been proposed, they could not become convergent frameworks since they mainly focused on converting functions between various protocols and interfaces in edge networks, and handling functions for multiple services in core networks, e.g., the Multi-protocol Label Switching (MPLS) technique. Software-defined networking (SDN), on the other hand, is expected to be the ideal future for the convergent network since it can provide a controllable, dynamic, and cost-effective network. However, SDN has an original structural vulnerability behind a lot of advantages, which is the centralized control plane. As the brains of the network, a controller manages the whole network, which is attractive to attackers. In this context, we propose a novel solution called the adaptive suspicious prevention (ASP) mechanism to protect the controller from the Denial of Service (DoS) attacks that could incapacitate an SDN. The ASP is integrated with the OpenFlow protocol to detect and prevent DoS attacks effectively. Our comprehensive experimental results show that the ASP enhances the resilience of an SDN network against DoS attacks by up to 38%.

  1. Dead wood, living legacies: habitat for a host of fungi.

    Treesearch

    Sally Duncan; Jonathan Thompson

    2004-01-01

    The web of life that exists below ground and out of sight may be the final frontier for forest ecologists. Among the many unknowns in this realm are the thousands of species of so-called ectomycorrhizal fungi, those fungi that have mutual-need associations with both trees and mammals.The richness and diversity of ectomycorrhizal fungi species contribute...

  2. Below- and above-ground effects of deadwood and termites in plantation forests

    Treesearch

    Michael D. Ulyshen; Richard Shefferson; Scott Horn; Melanie K. Taylor; Bryana Bush; Cavell Brownie; Sebastian Seibold; Michael S. Strickland

    2017-01-01

    Deadwood is an important legacy structure in managed forests, providing continuity in shelter and resource availability for many organisms and acting as a vehicle by which nutrients can be passed from one stand to the next following a harvest. Despite existing at the interface between below- and above-ground systems, however, much remains unknown about the role woody...

  3. 78 FR 71710 - Notice of Application for Approval of Discontinuance or Modification of a Railroad Signal System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... installation of cab signals without wayside signaling between Control Point (CP) Kiski, Milepost (MP) LC 47.8, and CP Penn, MP LC 77.9, on the Conemaugh Line, Pittsburgh Division. CP Kiski, CP Harris, CP Beale, CP Sharp, and CP Etna will be upgraded from existing, legacy, relay-based signal systems to electronic...

  4. Exploring the Governance of Polish Public Higher Education: Balancing Restored Historical Legacies with Europeanization and Market Pressures

    ERIC Educational Resources Information Center

    Dobbins, Michael

    2015-01-01

    This article places developments in Polish public higher education (HE) in the broader context of the literature on HE governance and, in particular, marketization. The Polish case stands out due to the parallel existence of prestigious large universities with long histories of scientific advancement and the largest number of private HE…

  5. Deconstructing, Reconstructing, Preserving Paul E. Meehl's Legacy of Construct Validity

    ERIC Educational Resources Information Center

    Maher, Brendan A.; Gottesman, Irving I.

    2005-01-01

    The question of the status of cause-and-effect explanations of human behavior that posit physically existing causative factors and those that, on the other hand, posit hypothetical entities in the form of "useful fictions" has a long history. The influence of the works of Jeremy Bentham and Hans Vaihinger, as well as the later influence of Francis…

  6. Democracy Exported, History Expunged: John Dewey's Trip to Turkey and the Challenge of Building "Civilised" Nations for Democratic Life

    ERIC Educational Resources Information Center

    Cole, Jeremy

    2014-01-01

    In 1924, John Dewey travelled to Turkey to make recommendations on the Turkish educational system. According to many existing accounts, Dewey brought a sorely needed progressive educational perspective to a nation emerging from centuries of despair. On the whole, these accounts dismiss the Ottoman legacy and overlook how Dewey's historical…

  7. Educating for Success: The Legacy of an All-Black School in Southeast Kansas

    ERIC Educational Resources Information Center

    Patterson, Jean A.; Mickelson, Kathryn A.; Petersen, Jan L.; Gross, Diane S.

    2008-01-01

    The authors present findings from an oral history of the all-Black Douglass School, which existed in Parsons, Kansas from 1908-1958. The oral history of the school is significant for several reasons: (a) it adds to our understanding of segregated schools outside the South and northern urban centers, (b) the school was razed in 1962, and very…

  8. The SIRTF Legacy Observing Program

    NASA Astrophysics Data System (ADS)

    Greenhouse, M. A.; Leisawitz, D.; Gehrz, R. D.; Clemens, D. P.; Force, Sirtf Community Task

    1997-12-01

    Legacy Observations and General Observations (GO) are separate categories in which SIRTF observing time will be allocated through peer reviewed community proposals. The Legacy Program will embrace several projects, each headed by a Legacy Principal Investigator. Legacy Observations are distinguished from General Observations by the following three criteria: [1] the project is a large, coherent investigation whose scientific goals can not be met by a number of smaller, uncoordinated projects; [2] the data will be of both general and lasting importance to the broad astronomical community and of immediate utility in motivating and planning follow-on GO investigations with SIRTF; and [3] the data (unprocessed, fully processed, and at intermediate steps in processing) will be placed in a public data base immediately and with no proprietary period. The goals of the SIRTF Legacy program are: [1] enable community use of SIRTF for large coherent survey observations, [2] provide prompt community access to SIRTF survey data, and [3] enable GO program observations based on Legacy program results. A likely attribute (but not a requirement) for Legacy projects is that they may involve hundreds, and perhaps thousands, of hours of observing time. It is anticipated that as much as 6000 hours of telescope time will be allocated through the Legacy program. To meet Legacy program goal [3], allocation of as much as 70% of SIRTF's first year on orbit to Legacy projects may be necessary, and the observing phase of the Legacy program will be completed during the following year. A Legacy call for proposals will be issued 1 year prior to launch or sooner, and will be open to all scientists and science topics. In this poster, we display Legacy program definition and schedule items that will be of interest to those intending to propose under this unique opportunity.

  9. Recent Developments and Challenges Implementing New and Improved Stress Intensity Factor (K) Solutions in NASGRO for Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Cardinal, Joseph W.; McClung, R. Craig; Lee, Yi-Der; Guo, Yajun; Beek, Joachim M.

    2014-01-01

    Fatigue crack growth analysis software has been available to damage tolerance analysts for many years in either commercial products or via proprietary in-house codes. The NASGRO software has been publicly available since the mid-80s (known as NASA/FLAGRO up to 1999) and since 2000 has been sustained and further developed by a collaborative effort between Southwest Research Institute® (SwRI®), the NASA Johnson Space Center (JSC), and the members of the NASGRO Industrial Consortium. Since the stress intensity factor (K) is the foundation of fracture mechanics and damage tolerance analysis of aircraft structures, a significant focus of development efforts in the past fifteen years has been geared towards enhancing legacy K solutions and developing new and efficient numerical K solutions that can handle the complicated stress gradients computed by today’s analysts using detailed finite element models of fatigue critical locations. This paper provides an overview of K solutions that have been recently implemented or improved for the analysis of geometries such as two unequal through cracks at a hole and two unequal corner cracks at a hole, as well as state-of-the-art weight function models capable of computing K in the presence of univariant and/or bivariant stress gradients and complicated residual stress distributions. Some historical background is provided to review how common K solutions have evolved over the years, including selective examples from the literature and from new research. Challenges and progress in rectifying discrepancies between older legacy solutions and newer models are reviewed as well as approaches and challenges for verification and validation of K solutions. Finally, a summary of current challenges and future research and development needs is presented. A key theme throughout the presentation of this paper will be how members of the aerospace industry have collaborated with software developers to develop a practical analysis tool that is used world-wide to support new design as well as the ongoing sustainment and airworthiness of commercial and military aircraft.
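
    For context on the notation, a minimal sketch (standard fracture-mechanics forms, not NASGRO's specific solutions) of the two kinds of K solution discussed above: a closed-form geometry-factor solution and a weight-function solution for an arbitrary univariant stress gradient:

      % closed-form solution: remote stress sigma, crack length a, geometry factor F
      K = F\,\sigma\sqrt{\pi a}

      % weight-function solution: crack-face stress distribution sigma(x) and
      % geometry-dependent weight function m(x, a)
      K(a) = \int_{0}^{a} \sigma(x)\, m(x,a)\, \mathrm{d}x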

  10. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, and high level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  11. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

    Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation with finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
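
    A minimal sketch of the fine-grained, tolerance-aware test style argued for above, written with Python's standard unittest for brevity (pFUnit itself is a Fortran framework); the trapezoid routine and tolerances are assumed examples:

      import unittest
      import numpy as np

      def trapezoid(f, a, b, n):
          # Composite trapezoid rule: one small, independently testable step.
          x = np.linspace(a, b, n + 1)
          y = f(x)
          return (b - a) / n * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

      class TestTrapezoid(unittest.TestCase):
          def test_linear_is_exact(self):
              # Oracle: the rule is exact for linear integrands, up to round-off.
              self.assertAlmostEqual(trapezoid(lambda x: 2 * x + 1, 0.0, 1.0, 10), 2.0, places=12)

          def test_quadratic_error_bound(self):
              # Oracle: known analytic value with a tolerance that reflects the
              # method's O(h^2) truncation error, not an exact match.
              approx = trapezoid(lambda x: x ** 2, 0.0, 1.0, 100)
              self.assertLess(abs(approx - 1.0 / 3.0), 1e-4)

      if __name__ == "__main__":
          unittest.main()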

  12. A Cloud-based, Open-Source, Command-and-Control Software Paradigm for Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Melton, R.; Thomas, J.

    With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of software systems utilized to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been constructed with "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging, alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower costs to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled, command-and-control software architecture which provides several unique capabilities to move the current legacy SSA software paradigm to an open source model that effectively enables pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This characteristic enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of both ground and on-orbit assets simple and straightforward.

  13. Conversion of HSPF Legacy Model to a Platform-Independent, Open-Source Language

    NASA Astrophysics Data System (ADS)

    Heaphy, R. T.; Burke, M. P.; Love, J. T.

    2015-12-01

    Since its initial development over 30 years ago, the Hydrologic Simulation Program - FORTRAN (HSPF) model has been used worldwide to support water quality planning and management. In the United States, HSPF receives widespread endorsement as a regulatory tool at all levels of government and is a core component of the EPA's Better Assessment Science Integrating Point and Nonpoint Sources (BASINS) system, which was developed to support nationwide Total Maximum Daily Load (TMDL) analysis. However, the model's legacy code and data management systems have limitations in their ability to integrate with modern software and hardware and to leverage parallel computing, which have left voids in optimization, pre-, and post-processing tools. Advances in technology and our scientific understanding of environmental processes that have occurred over the last 30 years mandate that upgrades be made to HSPF to allow it to evolve and continue to be a premiere tool for water resource planners. This work aims to mitigate the challenges currently facing HSPF through two primary tasks: (1) convert code to a modern widely accepted, open-source, high-performance computing (hpc) code; and (2) convert model input and output files to modern widely accepted, open-source, data model, library, and binary file format. Python was chosen as the new language for the code conversion. It is an interpreted, object-oriented, hpc code with dynamic semantics that has become one of the most popular open-source languages. While Python code execution can be slow compared to compiled, statically typed programming languages, such as C and FORTRAN, the integration of Numba (a just-in-time specializing compiler) has allowed this challenge to be overcome. For the legacy model data management conversion, HDF5 was chosen to store the model input and output. The code conversion for HSPF's hydrologic and hydraulic modules has been completed. The converted code has been tested against HSPF's suite of "test" runs and shown good agreement and similar execution times while using the Numba compiler. Continued verification of the accuracy of the converted code against more complex legacy applications and improvement upon execution times by incorporating an intelligent network change detection tool is currently underway, and preliminary results will be presented.
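
    A minimal sketch of the conversion pattern described above: a simple storage-routing loop compiled just-in-time with Numba and its output written to HDF5 with h5py; the routine, parameter values, and dataset names are illustrative assumptions, not HSPF code:

      import numpy as np
      import h5py
      from numba import njit

      @njit
      def simple_storage(precip, et, k):
          # Daily storage routing: S[t] = S[t-1] + P[t] - ET[t] - k*S[t-1], floored at zero.
          storage = np.zeros(precip.size)
          s = 0.0
          for t in range(precip.size):
              s = max(s + precip[t] - et[t] - k * s, 0.0)
              storage[t] = s
          return storage

      precip = np.random.gamma(0.5, 5.0, 3650)   # ten years of synthetic daily rainfall
      et = np.full(3650, 1.5)
      result = simple_storage(precip, et, k=0.05)

      with h5py.File("hspf_sketch.h5", "w") as f:
          f.create_dataset("RESULTS/storage", data=result, compression="gzip")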

  14. Alternatives Analysis Amchitka Island Mud Pit Cap Repair, Amchitka, Alaska January 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darr, Paul S.

    2016-01-01

    The U.S. Department of Energy (DOE) Office of Legacy Management (LM) manages the Nevada Offsites program, which includes a series of reclaimed drilling mud impoundments on Amchitka Island, Alaska (Figure 1). Navarro Research and Engineering, Inc. is the Legacy Management Support contractor (the Contractor) for LM. The Contractor has procured Tetra Tech, Inc. to provide engineering support to the Amchitka mud pit reclamation project. The mud pit caps were damaged during a 7.9-magnitude earthquake that occurred in 2014. The goals of the current project are to investigate conditions at the mud pit impoundments, identify feasible alternatives for repair of the cover systems and the contents, and estimate relative costs of repair alternatives. This report presents descriptions of the sites and past investigations, existing conditions, summaries of various repair/mitigation alternatives, and direct, unburdened, order-of-magnitude (-15% to +50%) associated costs.

  15. Bereaved mothers' and fathers' perceptions of a legacy intervention for parents of infants in the NICU.

    PubMed

    Akard, T F; Duffy, M; Hord, A; Randall, A; Sanders, A; Adelstein, K; Anani, U E; Gilmer, M J

    2018-01-01

    Legacy-making, actions or behaviors aimed at being remembered, may be one strategy to enhance coping and improve grief outcomes for bereaved parents and siblings. While legacy interventions have been developed and tested in pediatric and adult populations, legacy activities specific to bereaved parents in the neonatal intensive care unit remain unexplored. This study explored bereaved parents' perceptions of a digital storytelling legacy-making intervention for parents after the death of an infant. Six bereaved mothers and fathers participated in a focus group interview three to 12 months after the death of an infant in the NICU. A semi-structured interview guide with open-ended questions was used to obtain parent self-reports. Qualitative content analysis identified emerging themes. Four major themes emerged regarding participants' perceptions of a legacy intervention: (a) parents' willingness to participate in a legacy intervention, (b) parents' suggestions for a feasible intervention, (c) parents' suggestions for an acceptable intervention, and (d) parents' perceived benefits of legacy-making. Participants reported that a legacy-making intervention via digital storytelling would be feasible, acceptable, and beneficial for NICU parents. Study results support the need and desire for legacy-making services to be developed and offered in the NICU.

  16. Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification

    NASA Technical Reports Server (NTRS)

    Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand

    2016-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the areas of navigation systems of small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V Nitride-based materials.

  17. The CARMEN software as a service infrastructure.

    PubMed

    Weeks, Michael; Jessop, Mark; Fletcher, Martyn; Hodge, Victoria; Jackson, Tom; Austin, Jim

    2013-01-28

    The CARMEN platform allows neuroscientists to share data, metadata, services and workflows, and to execute these services and workflows remotely via a Web portal. This paper describes how we implemented a service-based infrastructure into the CARMEN Virtual Laboratory. A Software as a Service framework was developed to allow generic new and legacy code to be deployed as services on a heterogeneous execution framework. Users can submit analysis code typically written in Matlab, Python, C/C++ and R as non-interactive standalone command-line applications and wrap them as services in a form suitable for deployment on the platform. The CARMEN Service Builder tool enables neuroscientists to quickly wrap their analysis software for deployment to the CARMEN platform, as a service without knowledge of the service framework or the CARMEN system. A metadata schema describes each service in terms of both system and user requirements. The search functionality allows services to be quickly discovered from the many services available. Within the platform, services may be combined into more complicated analyses using the workflow tool. CARMEN and the service infrastructure are targeted towards the neuroscience community; however, it is a generic platform, and can be targeted towards any discipline.
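
    A minimal sketch of wrapping a non-interactive, standalone command-line analysis as a callable service, which is the pattern the Service Builder automates; the executable name, flags, and output file here are hypothetical:

      import subprocess
      import tempfile
      from pathlib import Path

      def spike_detection_service(input_file: str, threshold: float) -> str:
          # Run a legacy standalone analysis binary and return the path of its output.
          out_dir = Path(tempfile.mkdtemp())
          out_file = out_dir / "spikes.csv"
          cmd = [
              "./detect_spikes",          # hypothetical legacy executable
              "--input", input_file,
              "--threshold", str(threshold),
              "--output", str(out_file),
          ]
          completed = subprocess.run(cmd, capture_output=True, text=True)
          if completed.returncode != 0:
              raise RuntimeError(f"analysis failed: {completed.stderr}")
          return str(out_file)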

  18. Rebuilding our legacy inventories: identifying the good and fixing the bad

    Treesearch

    Dave Gilbert; Keith Tudor; Sam Otukol; Karen Jahraus

    2000-01-01

    Several challenges faced the Province of British Columbia in the early 1990's as we undertook a major redesign of our inventory systems: a provincial legislated demand requiring the chief forester to develop and maintain an inventory of the land and forests in British Columbia; existing inventories of the 94.5 million ha of the province; the need to meet new...

  19. Domain Specific Language Support for Exascale. Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baden, Scott

    The project developed a domain-specific translator to enable legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. The output of the translator is a C program that runs as a data-driven program and uses an existing run time to overlap communication automatically.
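
    A minimal hand-written sketch of the communication/computation overlap that the generated code aims for, using mpi4py non-blocking calls; mpi4py and the halo-exchange pattern are assumptions for illustration, not the translator's actual output:

      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      size = comm.Get_size()
      left, right = (rank - 1) % size, (rank + 1) % size

      interior = np.random.rand(1000)
      send_halo = interior[:1].copy()
      recv_halo = np.empty(1)

      # Post communication first so the interior update proceeds while the
      # halo is in flight, hiding the communication delay.
      reqs = [comm.Isend(send_halo, dest=right), comm.Irecv(recv_halo, source=left)]
      interior[1:] = 0.5 * (interior[1:] + interior[:-1])   # work not needing the halo
      MPI.Request.Waitall(reqs)
      interior[0] = 0.5 * (interior[0] + recv_halo[0])       # work needing the halo

      if rank == 0:
          print("overlapped halo exchange complete")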

  20. The Value of Doctrine for a Developing Organization

    DTIC Science & Technology

    2009-12-01

    increasingly public topic since the events of September 11, 2001. Customs and Border Protection (CBP) is one of the Department of Homeland Security’s...CBP was created, the majority of the existing organization came from two legacy agencies, U.S. Customs and the Immigration and Naturalization Service...Collaboration, Merger, Trust, Customs And Border Protection, CBP

  1. Cognitive/emotional models for human behavior representation in 3D avatar simulations

    NASA Astrophysics Data System (ADS)

    Peterson, James K.

    2004-08-01

    Simplified models of human cognition and emotional response are presented which are based on models of auditory/visual polymodal fusion. At the core of these models is a computational model of Area 37 of the temporal cortex which is based on new isocortex models presented recently by Grossberg. These models are trained using carefully chosen auditory (musical sequences), visual (paintings) and higher level abstract (meta level) data obtained from studies of how optimization strategies are chosen in response to outside managerial inputs. The software modules developed are then used as inputs to character generation codes in standard 3D virtual world simulations. The auditory and visual training data also enable the development of simple music and painting composition generators which significantly enhance one's ability to validate the cognitive model. The cognitive models are handled as interacting software agents implemented as CORBA objects to allow the use of multiple language coding choices (C++, Java, Python, etc.) and efficient use of legacy code.

  2. The Need and Keys for a New Generation Network Adjustment Software

    NASA Astrophysics Data System (ADS)

    Colomina, I.; Blázquez, M.; Navarro, J. A.; Sastre, J.

    2012-07-01

    Orientation and calibration of photogrammetric and remote sensing instruments is a fundamental capacity of current mapping systems and a fundamental research topic. Neither digital remote sensing acquisition systems nor direct orientation gear, like INS and GNSS technologies, made block adjustment obsolete. On the contrary, the continuous flow of new primary data acquisition systems has challenged the capacity of the legacy block adjustment systems - in general network adjustment systems - in many aspects: extensibility, genericity, portability, large data sets capacity, metadata support and many others. In this article, we concentrate on the extensibility and genericity challenges that current and future network systems shall face. For this purpose we propose a number of software design strategies with emphasis on rigorous abstract modeling that help in achieving simplicity, genericity and extensibility together with the protection of intellectual property rights in a flexible manner. We illustrate our suggestions with the general design approach of GENA, the generic extensible network adjustment system of GeoNumerics.

  3. Development of a Multi-Disciplinary Computing Environment (MDICE)

    NASA Technical Reports Server (NTRS)

    Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.

    1999-01-01

    The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal, and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment to enable coupling between various flow analysis codes at several levels of fidelity. This has been referred to as the Visual Computing Environment (VCE), and is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations. The chosen initial focus is on aeroelastic aircraft applications. The developed software is referred to as MDICE-AE, an extensible system suitable for integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.

  4. The definitive analysis of the Bendandi's methodology performed with a specific software

    NASA Astrophysics Data System (ADS)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", which was supposed, in the past, to be able to forecast earthquakes and which the geophysicist from Faenza never explicitly passed down to posterity. The geoethical implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, arose in previous years from social alarms over predicted earthquakes that never happened but were widely spread by the media following some 'well informed' unconventional scientists. The analysis was conducted through an extensive literature search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed utilising specially developed software, called the "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify 'definitively' what the basis of Bendandi's calculations was, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  5. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) a general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  6. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) a general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  7. The Scientific Uplink and User Support System for SIRTF

    NASA Astrophysics Data System (ADS)

    Heinrichsen, I.; Chavez, J.; Hartley, B.; Mei, Y.; Potts, S.; Roby, T.; Turek, G.; Valjavec, E.; Wu, X.

    The Space Infrared Telescope Facility (SIRTF) is one of NASA's Great Observatory missions, scheduled for launch in 2001. As such, its ground segment design is driven by the requirement to provide strong support for the entire astronomical community, starting with the call for Legacy Proposals in early 2000. In this contribution, we present the astronomical user interface and the design of the server software that comprise the Scientific Uplink System for SIRTF. The software architecture is split into three major parts: a front-end Java application deployed to the astronomical community providing the capabilities to visualize and edit proposals and the associated lists of observations. This observer toolkit provides templates to define all parameters necessary to carry out the required observations. A specialized version of this software, based on the same overall architecture, is used internally at the SIRTF Science Center to prepare calibration and engineering observations. A Weblogic (TM) based middleware component brokers the transactions with the servers, astronomical image and catalog sources, as well as the SIRTF operational databases. Several server systems perform the necessary computations to obtain resource estimates and target visibilities, and to access the instrument models for signal-to-noise calculations. The same server software is used internally at a later stage to derive the detailed command sequences needed by the SIRTF instruments and spacecraft to execute a given observation.

  8. Terra Harvest software architecture

    NASA Astrophysics Data System (ADS)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios, such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. The development process for THOSE is discussed as well.

  9. Russian Experience in the Regulatory Supervision of the Uranium Legacy Sites - 12441

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiselev, M.F.; Romanov, V.V.; Shandala, N.K.

    2012-07-01

    Management of the uranium legacy is accompanied by environmental impact, the intensity of which depends on the amount of waste generated, the extent of waste localization, and environmental spreading. The question is: how hazardous is such impact on the environment and human health? The criterion for safety assurance is adequate regulation of the uranium legacy. Since the establishment of the uranium industry, a well-developed regulatory system has operated within the FMBA of Russia, covering, inter alia, the uranium legacy. This system includes an extensive laboratory network for independent control and supervision, scientific research, and regulatory practice. The current Russian normative and legal basis of regulation and its application practice has a number of problems relating to the uranium legacy, connected firstly with environmental remediation. To improve the regulatory system, the urgent tasks are: - To introduce the existing exposure situation into the national laws and standards in compliance with the ICRP system. - To develop criteria for site remediation and return, by stages, to uncontrolled uses. Similar criteria have been developed within the Russian-Norwegian cooperation for the purpose of remediation of the sites for temporary storage of SNF and RW. - To consider possibilities and methods of optimization for the remediation strategies under development. - To separate a special category - RW resulting from uranium ore mining and dressing. The current Russian RW classification is based on waste subdivision in terms of specific activities. Having in mind the new RW-specific law, we have the opportunity to separate a special category - RW originating from uranium mining and milling. Introduction of such a category can significantly simplify the management of waste from uranium mining and milling processes. Such an approach is implemented in many countries and approved by the IAEA. The category of 'RW originated from uranium mining and milling' is to be introduced in the legal acts and regulatory documents. The recent ICRP recommendations provide flexible approaches for solving such tasks. The FMBA of Russia recognizes the problems of radiation safety assurance related to the legacy of the former USSR in the uranium mining industry. Some of the regulatory problems are assumed to be solved within the EurAsEC inter-state target program 'Reclamation of the territories of the EurAsEC member states affected by the uranium mining and milling facilities'. Using the example of the uranium legacy sites in Kyrgyzstan and Tajikistan, which could result in trans-boundary disasters and require urgent reclamation, experience will be gained to be used in other states as well. Harmonization of the national legislation and regulatory documents on radiation safety assurance is envisaged. (authors)

  10. SPHEREx: Science Opportunities for the Astronomical Community

    NASA Astrophysics Data System (ADS)

    Cooray, Asantha; SPHEREx Science Team

    2018-01-01

    SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for Phase A study in August 2017, will perform an all-sky near-infrared spectral survey between 0.75 - 5.0 microns. The survey will reach 18.3 AB mag (5 sigma) in R=41 filters, with R=135 coverage between 4.2 - 5.0 microns. The key science topics of the SPHEREx team are: (a) primordial non-Gaussianity through 3-dimensional galaxy clustering; (b) extragalactic background light fluctuations; and (c) ices and biogenic molecules in the interstellar medium and towards protoplanetary environments. The large legacy dataset of SPHEREx will enable a large number of scientific studies and find interesting targets for follow-up observations with Hubble, JWST, and ALMA, among other facilities. The SPHEREx catalog will include 1.4 billion galaxies, with redshifts secured for more than 10 million and 120 million galaxies with fractional accuracies in error/(1+z) better than 0.3% and 3%, respectively. The spectral coverage and resolution provided by SPHEREx are adequate to determine redshifts for most WISE-detected sources with an accuracy better than 3%. The catalog will contain close to 1.5 million quasars, including 300 bright QSOs at z > 7 during the epoch of reionization, based on observational extrapolations. The catalog will be adequate to obtain redshifts for all 25,000 galaxy clusters expected to be detected in X-rays with e-Rosita. SPHEREx produces all-sky maps of the Galactic emission lines, including hydrocarbon emission around 3 microns. In this poster, we will show example science studies the broader astronomical community will be able to lead using the SPHEREx database. We will also outline existing plans within the SPHEREx team to develop software tools to enable easy access to the data and to conduct catalog searches, and ways in which the community can provide input to the SPHEREx Science Team on scientific studies and data/software requirements for those studies. The team is eager to develop the best software tools and to facilitate easy access on a timely schedule to allow a large number of scientific applications and target selection for JWST observations.

  11. SPHEREx: Science Opportunities for the Astronomical Community

    NASA Astrophysics Data System (ADS)

    Cooray, Asantha R.; SPHEREx Science Team

    2016-01-01

    SPHEREx, a mission in NASA's Small Explorer (SMEX) program that was selected for Phase A study in July 2015, will perform an all-sky near-infrared spectral survey between 0.75 - 4.8 microns, reaching 19th mag (5 sigma) in narrow R=40 filters. The key science topics of the SPHEREx team are: (a) primordial non-Gaussianity through 3-dimensional galaxy clustering; (b) extragalactic background light fluctuations; and (c) ices and biogenic molecules in the interstellar medium and towards protoplanetary environments. The large legacy dataset of SPHEREx will enable a large number of scientific studies and find interesting targets for follow-up observations with Hubble, JWST, and ALMA, among other facilities. The SPHEREx catalog will include 1.5 billion galaxies, with redshifts secured for more than 10 million and 120 million galaxies with fractional accuracies in error/(1+z) better than 0.3% and 3%, respectively. The spectral coverage and resolution provided by SPHEREx are adequate to determine redshifts for all WISE-detected sources with an accuracy better than 3%. The catalog will contain close to 1.5 million quasars, including several hundred bright QSOs seen during the epoch of reionization. The catalog will be adequate to obtain redshifts for all 25,000 galaxy clusters expected to be detected in X-rays with e-Rosita. SPHEREx could also produce all-sky maps of the Galactic emission lines, including hydrocarbon emission around 3 microns. In this poster, we will discuss the data release schedule and some example science studies the broader astronomical community will be able to lead using the SPHEREx database. We will also outline existing plans within the SPHEREx team to develop software tools to enable easy access to the data and to conduct catalog searches, and ways in which the community can provide input to the SPHEREx Science Team on scientific studies and data/software requirements for those studies. The team is eager to develop the best software tools and to facilitate easy access on a timely schedule to allow a large number of scientific applications and target selection for JWST observations.

  12. By Hand or Not By-Hand: A Case Study of Alternative Approaches to Parallelize CFD Applications

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Bailey, David (Technical Monitor)

    1997-01-01

    While parallel processing promises to speed up applications by several orders of magnitude, the performance achieved still depends upon several factors, including the multiprocessor architecture, system software, data distribution and alignment, as well as the methods used for partitioning the application and mapping its components onto the architecture. The existence of the Gordon Bell Prize given out at Supercomputing every year suggests that, while good performance can be attained for real applications on general-purpose multiprocessors, the large investment in man-power and time still has to be repeated for each application-machine combination. As applications and machine architectures become more complex, the cost and time delays for obtaining performance by hand will become prohibitive. Computer users today can turn to three possible avenues for help: parallel libraries, parallel languages and compilers, and interactive parallelization tools. The success of these methodologies, in turn, depends on the proper application of data dependency analysis, program structure recognition and transformation, and performance prediction, as well as the exploitation of user-supplied knowledge. NASA has been developing multidisciplinary applications on highly parallel architectures under the High Performance Computing and Communications Program. Over the past six years, transitions of the underlying hardware and system software have forced the scientists to spend a large effort to migrate and recode their applications. Various attempts to exploit software tools to automate the parallelization process have not produced favorable results. In this paper, we report our most recent experience with CAPTOOL, a package developed at Greenwich University. We have chosen CAPTOOL for three reasons: 1. CAPTOOL accepts a FORTRAN 77 program as input. This suggests its potential applicability to a large collection of legacy codes currently in use. 2. CAPTOOL employs domain decomposition to obtain parallelism. Although the fact that not all kinds of parallelism are handled may seem unappealing, many NASA applications in computational aerosciences as well as earth and space sciences are amenable to domain decomposition. 3. CAPTOOL generates code for a large variety of environments employed across NASA centers: MPI/PVM on networks of workstations to the IBM SP2 and Cray T3D.
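    As a concrete illustration of the domain decomposition approach mentioned in point 2 above, the sketch below shows a 1-D decomposition with ghost-cell (halo) exchange written with mpi4py; it is a generic toy example under assumed grid sizes, not code generated by CAPTOOL.

      # Illustrative sketch only (not CAPTOOL output): 1-D domain decomposition
      # with halo exchange, the parallelization pattern referred to above.
      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      n_global = 1000                       # global grid points (toy value)
      n_local = n_global // size            # assume evenly divisible for simplicity
      u = np.zeros(n_local + 2)             # local slab plus two ghost cells
      u[1:-1] = rank                        # toy initial data

      left = rank - 1 if rank > 0 else MPI.PROC_NULL
      right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

      for step in range(10):
          # exchange ghost cells with neighbouring subdomains
          comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
          comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[:1], source=left)
          # simple Jacobi-style update on the interior points
          u[1:-1] = 0.5 * (u[:-2] + u[2:])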

  13. Which In- and Out-of-School Factors Explain Variations in Learning across Different Socio Economic Groups? Findings from South Africa

    ERIC Educational Resources Information Center

    Smith, Michele C.

    2011-01-01

    Previous studies on the role of the school in influencing attainment in South African schools have concluded that the inequalities which are known to exist in these are still largely due to the legacy of the Apartheid system. More recently, policy focus has been on narrowing the gap between the attainment of different socio-economic groups by…

  14. The Principle of Mass in Relation to Transformation and the Contemporary Operational Environment

    DTIC Science & Technology

    2003-06-06

    specifically show how the legacy forces used mass. For the Korean War, the thesis will analyze the Battle for the Imjin River, April 1951. The Battle... critical components and has existed as a United States Army principle of war since 1921. The United States Army is currently undergoing vast changes...

  15. Quantitative evaluation of legacy phosphorus and its spatial distribution.

    PubMed

    Lou, Hezhen; Zhao, Changsen; Yang, Shengtian; Shi, Liuhua; Wang, Yue; Ren, Xiaoyu; Bai, Juan

    2018-04-01

    A phosphorus resource crisis threatens the security of global crop production, especially in developing countries like China and Brazil. Legacy phosphorus (legacy-P), which is left behind in agricultural soil by over-fertilization, can help address this issue as a new resource in the soil phosphorus pool. However, issues involved with calculating and defining the spatial distribution of legacy-P hinder its future utilization. To resolve these issues, this study applied remote sensing and ecohydrological modeling to precisely quantify legacy-P and define its spatial distribution in China's Sanjiang Plain from 2000 to 2014. The total legacy-P in the study area was calculated as 579,090 t with an annual average of 38,600 t; this comprises 51.83% of the phosphorus fertilizer applied annually. From 2000 to 2014, the annual amount of legacy-P increased by more than 3.42-fold, equivalent to a 2460-ton increase each year. The spatial distribution of legacy-P showed heterogeneity and agglomeration in this area, with peaks in cultivated land experiencing long-term agricultural development. This study supplies a new approach to finding legacy-P in soil as a precondition for future utilization. Once its spatial distribution is known, legacy-P can be better utilized in agriculture to help alleviate the phosphorus resource crisis. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. 31 CFR 357.26 - Direct Deposit.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND BILLS HELD IN LEGACY TREASURY DIRECT Legacy Treasury Direct Book-Entry Securities System (Legacy... security shall be by direct deposit unless it is deemed necessary by the Department to make payment by... account. Where the Legacy Treasury Direct ® securities account is in the name of individual(s) in their...

  17. Predicting the sky from 30 MHz to 800 GHz: the extended Global Sky Model

    NASA Astrophysics Data System (ADS)

    Liu, Adrian

    We propose to construct the extended Global Sky Model (eGSM), a software package and associated data products that are capable of generating maps of the sky at any frequency within a broad range (30 MHz to 800 GHz). The eGSM is constructed from archival data, and its outputs will include not only "best estimate" sky maps, but also accurate error bars and the ability to generate random realizations of missing modes in the input data. Such views of the sky are crucial in the practice of precision cosmology, where our ability to constrain cosmological parameters and detect new phenomena (such as B-mode signatures from primordial gravitational waves, or spectral distortions of the Cosmic Microwave Background; CMB) rests crucially on our ability to remove systematic foreground contamination. Doing so requires empirical measurements of the foreground sky brightness (such as that arising from Galactic synchrotron radiation, among other sources), which are typically performed only at select narrow wavelength ranges. We aim to transcend traditional wavelength limits by optimally combining existing data to provide a comprehensive view of the foreground sky at any frequency within the broad range of 30 MHz to 800 GHz. Previous efforts to interpolate between multi-frequency maps resulted in the Global Sky Model (GSM) of de Oliveira-Costa et al. (2008), a software package that outputs foreground maps at any frequency of the user's choosing between 10 MHz and 100 GHz. However, the GSM has a number of shortcomings. First and foremost, the GSM does not include the latest archival data from the Planck satellite. Multi-frequency models depend crucially on data from Planck, WMAP, and COBE to provide high-frequency "anchor" maps. Another crucial shortcoming is the lack of error bars in the output maps. Finally, the GSM is only able to predict temperature (i.e., total intensity) maps, and not polarization information. With the recent release of Planck's polarized data products, the time is ripe for the inclusion of polarization and a general update of the GSM. In its first two phases, our proposed eGSM project will incorporate new data and improve analysis methods to eliminate all of the aforementioned flaws. The eGSM will have broad implications for future cosmological probes, including surveys of the highly redshifted 21 cm line (such as the proposed Dark Ages Radio Explorer satellite mission) and CMB experiments (such as the Primordial Inflation Polarization Explorer and the Primordial Inflation Explorer) targeting primordial B-mode polarization or spectral distortions. Forecasting exercises for such future experiments must include polarized foregrounds below current detection limits. The third phase of the eGSM will result in a software package that provides random realizations of dim polarized foregrounds that are below the sensitivities of current instruments. This requires the quantification of non-Gaussian and non-isotropic statistics of existing foreground surveys, adding value to existing archival maps. eGSM data products will be publicly hosted on the Legacy Archive for Microwave Background Data Analysis (LAMBDA) archive, including a publicly released code that enables future foreground surveys (whether ground-based or space-based) to easily incorporate additional data into the existing archive, further refining our model and maximizing the impact of existing archives beyond the lifetime of this proposal.
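    Not the eGSM algorithm itself; the following is a minimal sketch of the kind of per-pixel spectral interpolation such a model performs, assuming just two anchor maps and a power-law brightness spectrum (the frequencies and pixel values are made up).

      # Illustrative sketch only: per-pixel power-law interpolation between two
      # "anchor" sky maps at known frequencies, the basic operation behind
      # predicting a map at an arbitrary intermediate frequency.
      import numpy as np

      def interpolate_map(map_lo, nu_lo, map_hi, nu_hi, nu_target):
          """Assume each pixel follows T(nu) ~ nu**beta between the two anchors."""
          beta = np.log(map_hi / map_lo) / np.log(nu_hi / nu_lo)   # per-pixel index
          return map_lo * (nu_target / nu_lo) ** beta

      # toy example: two 4-pixel maps at 0.408 GHz and 23 GHz, predict 1.4 GHz
      map_408 = np.array([30.0, 25.0, 40.0, 28.0])     # arbitrary brightness units
      map_23g = np.array([0.02, 0.018, 0.03, 0.021])
      print(interpolate_map(map_408, 0.408, map_23g, 23.0, 1.4))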

  18. Frederik Ruysch (1638–1731): Historical perspective and contemporary analysis of his teratological legacy

    PubMed Central

    Boer, Lucas; Radziun, Anna B.

    2016-01-01

    The Peter the Great Museum of Anthropology and Ethnography (Kunstkamera) in Saint Petersburg is the oldest museum in Russia. It keeps the remains of the anatomical collection of the world-famous 17th century Dutch anatomist Frederik Ruysch. This unique collection was bought and shipped in 1717 by Czar Peter the Great, and presently still comprises more than 900 specimens, a modest number of which concerns specimens with congenital anomalies. We searched for teratological clues in the existing collection and in all his descriptions and correspondence regarding specimens and cases he encountered during his career as doctor anatomiae and chief instructor of the surgeons and midwives in Amsterdam. A total of 63 teratological specimens and case descriptions were identified in this legacy, including some exceedingly rare anomalies. As it turns out, Ruysch was the first to describe several of the conditions we encountered, including intracranial teratoma, enchondromatosis, and Majewski syndrome. Although his comments pose an interesting view on how congenital anomalies were scientifically perceived in early 18th century Europe, Ruysch mostly refrained from explaining the causes of the conditions he encountered. Instead, he dedicated himself to careful descriptions of his specimens. Almost 300 years after his demise, Ruysch's legacy still impresses and inspires both scientists and laymen. © 2016 The Authors. American Journal of Medical Genetics Part A Published by Wiley Periodicals, Inc. PMID:27126916

  19. Rates, levels, and determinants of electronic health record system adoption: a study of hospitals in Riyadh, Saudi Arabia.

    PubMed

    Aldosari, Bakheet

    2014-05-01

    Outside a small number of OECD countries, little information exists regarding the rates, levels, and determinants of hospital electronic health record (EHR) system adoption. This study examines EHR system adoption in Riyadh, Saudi Arabia. Respondents from 22 hospitals were surveyed regarding the implementation, maintenance, and improvement phases of EHR system adoption. Thirty-seven items were graded on a three-point scale of preparedness/completion. Measured determinants included hospital size, level of care, ownership, and EHR system development team composition. Eleven of the hospitals had implemented fully functioning EHR systems, eight had systems in progress, and three had not adopted a system. Sixteen different systems were being used across the 19 adopting hospitals. Differential adoption levels were positively related to hospital size and negatively to the level of care (secondary versus tertiary). Hospital ownership (nonprofit versus private) and development team composition showed mixed effects depending on the particular adoption phase being considered. Adoption rates compare favourably with those reported from other countries and other districts in Saudi Arabia, but wide variations exist among hospitals in the levels of adoption of individual items. General weaknesses in the implementation phase concern the legacy of paper data systems, including document scanning and data conversion; in the maintenance phase concern updating/maintaining software; and in the improvement phase concern the communication and exchange of health information. This study is the first to investigate the level and determinants of EHR system adoption for public, other nonprofit, and private hospitals in Saudi Arabia. Wide interhospital variations in adoption bear implications for policy-making and funding intervention. Identified areas of weakness require action to increase the degree of adoption and usefulness of EHR systems. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. A proposal for an SDN-based SIEPON architecture

    NASA Astrophysics Data System (ADS)

    Khalili, Hamzeh; Sallent, Sebastià; Piney, José Ramón; Rincón, David

    2017-11-01

    Passive Optical Network (PON) elements such as Optical Line Terminal (OLT) and Optical Network Units (ONUs) are currently managed by inflexible legacy network management systems. Software-Defined Networking (SDN) is a new networking paradigm that improves the operation and management of networks. In this paper, we propose a novel architecture, based on the SDN concept, for Ethernet Passive Optical Networks (EPON) that includes the Service Interoperability standard (SIEPON). In our proposal, the OLT is partially virtualized and some of its functionalities are allocated to the core network management system, while the OLT itself is replaced by an OpenFlow (OF) switch. A new MultiPoint MAC Control (MPMC) sublayer extension based on the OpenFlow protocol is presented. This would allow the SDN controller to manage and enhance the resource utilization, flow monitoring, bandwidth assignment, quality-of-service (QoS) guarantees, and energy management of the optical network access, to name a few possibilities. The OpenFlow switch is extended with synchronous ports to retain the time-critical nature of the EPON network. OpenFlow messages are also extended with new functionalities to implement the concept of EPON Service Paths (ESPs). Our simulation-based results demonstrate the effectiveness of the new architecture, while retaining a similar (or improved) performance in terms of delay and throughput when compared to legacy PONs.

  1. Legacy sample disposition project. Volume 2: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gurley, R.N.; Shifty, K.L.

    1998-02-01

    This report describes the legacy sample disposition project at the Idaho National Engineering and Environmental Laboratory (INEEL), which assessed Site-wide facilities/areas to locate legacy samples and owner organizations and then characterized and dispositioned these samples. This project resulted from an Idaho Department of Environmental Quality inspection of selected areas of the INEEL in January 1996, which identified some samples at the Test Reactor Area and Idaho Chemical Processing Plant that had not been characterized and dispositioned according to Resource Conservation and Recovery Act (RCRA) requirements. The objective of the project was to manage legacy samples in accordance with all applicable environmental and safety requirements. A systems engineering approach was used throughout the project, which included collecting the legacy sample information and developing a system for amending and retrieving the information. All legacy samples were dispositioned by the end of 1997. Closure of the legacy sample issue was achieved through these actions.

  2. 31 CFR 357.20 - Securities account in Legacy Treasury Direct ®.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Securities System (Legacy Treasury Direct) § 357.20 Securities account in Legacy Treasury Direct ®. (a) Account. A securities account consists of: (1) An account master record, and (2) A securities portfolio... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Securities account in Legacy Treasury...

  3. 31 CFR 357.24 - Availability and disclosure of Legacy Treasury Direct ® records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... REGULATIONS GOVERNING BOOK-ENTRY TREASURY BONDS, NOTES AND BILLS HELD IN LEGACY TREASURY DIRECT Legacy... Direct account are held confidential. Consistent with the Privacy Act (5 U.S.C. 552a), information... identify the Legacy Treasury Direct account. (c) Conditions for release. A request for information will be...

  4. Collaborative Manufacturing Management in Networked Supply Chains

    NASA Astrophysics Data System (ADS)

    Pouly, Michel; Naciri, Souleiman; Berthold, Sébastien

    ERP systems provide information management and analysis to industrial companies and support their planning activities. They are currently based mostly on theoretical values (averages) of parameters rather than on actual shop floor data, which disturbs the planning algorithms. On the other hand, sharing data between manufacturers, suppliers and customers becomes very important to ensure reactivity towards market variability. This paper proposes software solutions to address these requirements and methods to automatically capture the necessary corresponding shop floor information. In order to share data produced by different legacy systems along the collaborative networked supply chain, we propose to use the Generic Product Model developed by Hitachi to extract, translate and store heterogeneous ERP data.

  5. Architectural Visualization of C/C++ Source Code for Program Comprehension

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panas, T; Epperly, T W; Quinlan, D

    2006-09-01

    Structural and behavioral visualization of large-scale legacy systems to aid program comprehension is still a major challenge. The challenge is even greater when applications are implemented in flexible and expressive languages such as C and C++. In this paper, we consider visualization of static and dynamic aspects of large-scale scientific C/C++ applications. For our investigation, we reuse and integrate specialized analysis and visualization tools. Furthermore, we present a novel layout algorithm that permits a compressive architectural view of a large-scale software system. Our layout is unique in that it allows traditional program visualizations, i.e., graph structures, to be seen in relation to the application's file structure.

  6. I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison

    NASA Technical Reports Server (NTRS)

    Somawardhana, Ruwan

    2011-01-01

    CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.

  7. Hierarchical Storage Management at the NASA Center for Computational Sciences: From UniTree to SAM-QFS

    NASA Technical Reports Server (NTRS)

    Salmon, Ellen; Tarshish, Adina; Palm, Nancy; Patel, Sanjay; Saletta, Marty; Vanderlan, Ed; Rouch, Mike; Burns, Lisa; Duffy, Daniel; Caine, Robert

    2004-01-01

    This paper presents the data management issues associated with a large center like the NCCS and how these issues are addressed. More specifically, the focus of this paper is on the recent transition from a legacy UniTree (Legato) system to a SAM-QFS (Sun) system. Therefore, this paper will describe the motivations, from both a hardware and software perspective, for migrating from one system to another. Coupled with the migration from UniTree into SAM-QFS, the complete mass storage environment was upgraded to provide high availability, redundancy, and enhanced performance. This paper will describe the resulting solution and lessons learned throughout the migration process.

  8. Features of the Upgraded Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) Software

    NASA Technical Reports Server (NTRS)

    Mason, Michelle L.; Rufer, Shann J.

    2016-01-01

    The Imaging for Hypersonic Experimental Aeroheating Testing (IHEAT) software is used at the NASA Langley Research Center to analyze global aeroheating data on wind tunnel models tested in the Langley Aerothermodynamics Laboratory. One-dimensional, semi-infinite heating data derived from IHEAT are used in the design of thermal protection systems for hypersonic vehicles that are exposed to severe aeroheating loads, such as reentry vehicles during descent and landing procedures. This software program originally was written in the PV-WAVE® programming language to analyze phosphor thermography data from the two-color, relative-intensity system developed at Langley. To increase the efficiency, functionality, and reliability of IHEAT, the program was migrated to MATLAB® syntax and compiled as a stand-alone executable file labeled version 4.0. New features of IHEAT 4.0 include the options to perform diagnostic checks of the accuracy of the acquired data during a wind tunnel test, to extract data along a specified multi-segment line following a feature such as a leading edge or a streamline, and to batch process all of the temporal frame data from a wind tunnel run. Results from IHEAT 4.0 were compared on a pixel level to the output images from the legacy software to validate the program. The absolute differences between the heat transfer data output from the two programs were on the order of 10^-5 to 10^-7. IHEAT 4.0 replaces the PV-WAVE® version as the production software for aeroheating experiments conducted in the hypersonic facilities at NASA Langley.
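    The pixel-level validation described above can be sketched as follows; this is a generic illustration with made-up array names, sizes, and tolerance, not the IHEAT code.

      # Illustrative sketch only: pixel-by-pixel comparison of two heating maps,
      # as used to validate a migrated code against legacy output.
      import numpy as np

      def compare_outputs(legacy_img, new_img, tol=1e-4):
          """Return max/mean absolute difference and whether the max stays below tol."""
          diff = np.abs(np.asarray(new_img) - np.asarray(legacy_img))
          return diff.max(), diff.mean(), bool(diff.max() < tol)

      legacy = np.random.rand(480, 640)                    # stand-in for legacy heating map
      new = legacy + 1e-6 * np.random.randn(480, 640)      # stand-in for new output
      print(compare_outputs(legacy, new))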

  9. IDEF5 Ontology Description Capture Method: Concept Paper

    NASA Technical Reports Server (NTRS)

    Menzel, Christopher P.; Mayer, Richard J.

    1990-01-01

    The results of research towards an ontology capture method referred to as IDEF5 are presented. Viewed simply as the study of what exists in a domain, ontology is an activity that can be understood to be at work across the full range of human inquiry, prompted by the persistent effort to understand the world in which it has found itself - and which it has helped to shape. In the context of information management, ontology is the task of extracting the structure of a given engineering, manufacturing, business, or logistical domain and storing it in a usable representational medium. A key to effective integration is a system ontology that can be accessed and modified across domains and which captures common features of the overall system relevant to the goals of the disparate domains. If the focus is on information integration, then the strongest motivation for ontology comes from the need to support data sharing and function interoperability. In the correct architecture, an enterprise ontology base would allow the construction of an integrated environment in which legacy systems appear to be open-architecture integrated resources. If the focus is on system/software development, then support for the rapid acquisition of reliable systems is perhaps the strongest motivation for ontology. Finally, ontological analysis was demonstrated to be an effective first step in the construction of robust knowledge-based systems.

  10. The GBIF Integrated Publishing Toolkit: Facilitating the Efficient Publishing of Biodiversity Data on the Internet

    PubMed Central

    Robertson, Tim; Döring, Markus; Guralnick, Robert; Bloom, David; Wieczorek, John; Braak, Kyle; Otegui, Javier; Russell, Laura; Desmet, Peter

    2014-01-01

    The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally-formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to 1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and 2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for more enhancements. PMID:25099149
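    To make the Darwin Core encoding step concrete, the sketch below writes invented occurrence records to a CSV whose column names are standard Darwin Core terms, the kind of core data file packaged into a Darwin Core Archive; the records and the output file name are illustrative assumptions, not IPT code.

      # Illustrative sketch only: occurrence records as a CSV whose columns are
      # Darwin Core terms, the core data file of a Darwin Core Archive.
      import csv

      # Darwin Core term names (standard); the records below are invented examples.
      fields = ["occurrenceID", "basisOfRecord", "scientificName",
                "eventDate", "decimalLatitude", "decimalLongitude"]
      records = [
          {"occurrenceID": "urn:example:occ:1", "basisOfRecord": "HumanObservation",
           "scientificName": "Puma concolor", "eventDate": "2013-05-04",
           "decimalLatitude": "38.5", "decimalLongitude": "-121.7"},
      ]

      with open("occurrence.txt", "w", newline="") as f:
          writer = csv.DictWriter(f, fieldnames=fields)
          writer.writeheader()
          writer.writerows(records)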

  11. Sustainable Software Decisions for Long-term Projects (Invited)

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Groman, R. C.; Chandler, C. L.; Gaylord, D.; Sun, M.

    2013-12-01

    Adopting new, emerging technologies can be difficult for established projects that are positioned to exist for years to come. In some cases the challenge lies in the pre-existing software architecture. In others, the challenge lies in the fluctuation of resources like people, time and funding. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the data management offices for the U.S. GLOBEC and U.S. JGOFS programs to publish data for researchers funded by the National Science Foundation (NSF). Since its inception, BCO-DMO has been supporting access and discovery of these data through web-accessible software systems, and the office has worked through many of the challenges of incorporating new technologies into its software systems. From migrating human readable, flat file metadata storage into a relational database, and now, into a content management system (Drupal) to incorporating controlled vocabularies, new technologies can radically affect the existing software architecture. However, through the use of science-driven use cases, effective resource management, and loosely coupled software components, BCO-DMO has been able to adapt its existing software architecture to adopt new technologies. One of the latest efforts at BCO-DMO revolves around applying metadata semantics for publishing linked data in support of data discovery. This effort primarily affects the metadata web interface software at http://bco-dmo.org and the geospatial interface software at http://mapservice.bco-dmo.org/. With guidance from science-driven use cases and consideration of our resources, implementation decisions are made using a strategy to loosely couple the existing software systems to the new technologies. The results of this process led to the use of REST web services and a combination of contributed and custom Drupal modules for publishing BCO-DMO's content using the Resource Description Framework (RDF) via an instance of the Virtuoso Open-Source triplestore.
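    The linked-data publishing step described above can be illustrated with a small RDF example; the namespace, dataset URI, and property values below are placeholders and do not reflect BCO-DMO's actual vocabulary or data.

      # Illustrative sketch only: emitting dataset metadata as RDF triples (Turtle),
      # the kind of "linked data" output a triplestore such as Virtuoso serves.
      # URIs and literals are placeholders, not BCO-DMO's schema.
      from rdflib import Graph, Literal, Namespace
      from rdflib.namespace import DCAT, DCTERMS, RDF

      EX = Namespace("http://example.org/dataset/")

      g = Graph()
      dataset = EX["chlorophyll-2012"]
      g.add((dataset, RDF.type, DCAT.Dataset))
      g.add((dataset, DCTERMS.title, Literal("Example chlorophyll measurements")))
      g.add((dataset, DCTERMS.creator, Literal("Example investigator")))

      print(g.serialize(format="turtle"))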

  12. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing its negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
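    The object-oriented process-model coupling mentioned above can be sketched generically as follows; the class and method names and the toy updates are invented for illustration and are not PFLOTRAN's implementation.

      # Illustrative sketch only (invented names, not PFLOTRAN code): object-oriented
      # process-model coupling, where each process model implements a common
      # interface and a driver advances the coupled system by operator splitting.
      from abc import ABC, abstractmethod

      class ProcessModel(ABC):
          @abstractmethod
          def solve(self, state: dict, dt: float) -> None:
              """Advance this process over one time step, updating shared state."""

      class FlowModel(ProcessModel):
          def solve(self, state, dt):
              state["pressure"] += dt * 0.1                 # toy update

      class TransportModel(ProcessModel):
          def solve(self, state, dt):
              state["concentration"] *= 1.0 - 0.01 * dt     # toy decay

      def run(models, state, dt, steps):
          for _ in range(steps):
              for m in models:                              # operator-split coupling
                  m.solve(state, dt)
          return state

      print(run([FlowModel(), TransportModel()],
                {"pressure": 1.0, "concentration": 1.0}, 0.5, 4))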

  13. A USB 2.0 computer interface for the UCO/Lick CCD cameras

    NASA Astrophysics Data System (ADS)

    Wei, Mingzhi; Stover, Richard J.

    2004-09-01

    The new UCO/Lick Observatory CCD camera uses a 200 MHz fiber optic cable to transmit image data and an RS232 serial line for low speed bidirectional command and control. Increasingly RS232 is a legacy interface supported on fewer computers. The fiber optic cable requires either a custom interface board that is plugged into the mainboard of the image acquisition computer to accept the fiber directly or an interface converter that translates the fiber data onto a widely used standard interface. We present here a simple USB 2.0 interface for the UCO/Lick camera. A single USB cable connects to the image acquisition computer and the camera's RS232 serial and fiber optic cables plug into the USB interface. Since most computers now support USB 2.0 the Lick interface makes it possible to use the camera on essentially any modern computer that has the supporting software. No hardware modifications or additions to the computer are needed. The necessary device driver software has been written for the Linux operating system which is now widely used at Lick Observatory. The complete data acquisition software for the Lick CCD camera is running on a variety of PC style computers as well as an HP laptop.

  14. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    PubMed

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.

  15. DoDLive | Department of Defense Blog

    Science.gov Websites

    Blog posts include "Surgeon and Spouse: How One Navy Wife Balances" (her surgical career and her marriage to a Navy chief) and "Legacy Mentors Play Key Role In TAPS Good Grief Camps" (on legacy mentors at the Memorial Day Good Grief Camps).

  16. A Changing Legacy for Dead Rock Stars

    NASA Astrophysics Data System (ADS)

    Farihi, Jay; Wilson, Tom; Gaensicke, Boris; Wilson, David

    2017-10-01

    The discovery of irregular transiting events toward the polluted white dwarf WD1145+017 has recently spurred new observational and theoretical efforts to place these evolved planetary systems into context. Yet despite over three dozen systems that were first discovered and later characterized by Spitzer, we do not yet know the frequency and size of incoming bodies that are tidally disrupted, nor the collective effect they have on pre-existing disks. A few models have come forth since the last call for proposals, predicting active changes in dust production-accretion rates, some of which are on decade-long timescales. We propose to test these models by re-observing all polluted white dwarfs with an infrared excess from dust, most of which were first observed a decade prior. This simple test to search for changes in dust emission can be done in only 15.1 hours. We expect these data to have legacy value for future models, and to inform on the most active targets for follow up with JWST.

  17. Legacy source of mercury in an urban stream-wetland ecosystem in central North Carolina, USA.

    PubMed

    Deonarine, Amrika; Hsu-Kim, Heileen; Zhang, Tong; Cai, Yong; Richardson, Curtis J

    2015-11-01

    In the United States, aquatic mercury contamination originates from point and non-point sources to watersheds. Here, we studied the contribution of mercury in urban runoff derived from historically contaminated soils and the subsequent production of methylmercury in a stream-wetland complex (Durham, North Carolina), the receiving water of this runoff. Our results demonstrated that the mercury originated from the leachate of grass-covered athletic fields. A fraction of mercury in this soil existed as phenylmercury, suggesting that mercurial anti-fungal compounds were historically applied to this soil. Further downstream in the anaerobic sediments of the stream-wetland complex, a fraction (up to 9%) of mercury was converted to methylmercury, the bioaccumulative form of the metal. Importantly, the concentrations of total mercury and methylmercury were reduced to background levels within the stream-wetland complex. Overall, this work provides an example of a legacy source of mercury that should be considered in urban watershed models and watershed management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Cross-platform normalization of microarray and RNA-seq data for machine learning applications

    PubMed Central

    Thompson, Jeffrey A.; Tan, Jie

    2016-01-01

    Large, publicly available gene expression datasets are often analyzed with the aid of machine learning algorithms. Although RNA-seq is increasingly the technology of choice, a wealth of expression data already exist in the form of microarray data. If machine learning models built from legacy data can be applied to RNA-seq data, larger, more diverse training datasets can be created and validation can be performed on newly generated data. We developed Training Distribution Matching (TDM), which transforms RNA-seq data for use with models constructed from legacy platforms. We evaluated TDM, as well as quantile normalization, nonparanormal transformation, and a simple log2 transformation, on both simulated and biological datasets of gene expression. Our evaluation included both supervised and unsupervised machine learning approaches. We found that TDM exhibited consistently strong performance across settings and that quantile normalization also performed well in many circumstances. We also provide a TDM package for the R programming language. PMID:26844019
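    For context on the transformations compared in this study, the sketch below implements two of the baseline methods named in the abstract, a simple log2 transform and quantile normalization, on a toy expression matrix; it is not the TDM method itself, for which the authors provide an R package.

      # Illustrative sketch only: two baseline transformations compared in the study,
      # a log2 transform and quantile normalization of a genes-by-samples matrix.
      # This is not the TDM method itself.
      import numpy as np

      def log2_transform(x, pseudocount=1.0):
          return np.log2(x + pseudocount)

      def quantile_normalize(x):
          """Force every sample (column) to share the same empirical distribution."""
          ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # per-column ranks
          mean_by_rank = np.sort(x, axis=0).mean(axis=1)      # reference distribution
          return mean_by_rank[ranks]

      expr = np.random.poisson(lam=20, size=(5, 3)).astype(float)  # toy counts
      print(log2_transform(expr))
      print(quantile_normalize(expr))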

  19. Bridging the gap: linking a legacy hospital information system with a filmless radiology picture archiving and communications system within a nonhomogeneous environment.

    PubMed

    Rubin, R K; Henri, C J; Cox, R D

    1999-05-01

    A health level 7 (HL7)-conformant data link to exchange information between the mainframe hospital information system (HIS) of our hospital and our home-grown picture archiving and communications system (PACS) is the result of a collaborative effort between the HIS department and the PACS development team. Based on the ability to link examination requisitions and image studies, applications have been generated to optimise workflow and to improve the reliability and distribution of radiology information. Now, images can be routed to individual radiologists and clinicians; worklists facilitate radiology reporting; applications exist to create, edit, and view reports and images via the internet; and automated quality control now limits the incidence of "lost" cases and errors in image routing. By following the HL7 standard to develop the gateway to the legacy system, the development of a radiology information system for booking, reading, reporting, and billing remains universal and does not preclude the option to integrate off-the-shelf commercial products.
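    An HL7 v2-style data link of the kind described above ultimately moves pipe-delimited segments between systems; the sketch below parses such a message into segments and fields. The sample message content is invented and the parser is far simpler than a production interface engine.

      # Illustrative sketch only: splitting a pipe-delimited HL7 v2-style message
      # into segments and fields, the basic step behind an HL7-conformant data link.
      # The sample message below is invented.
      def parse_hl7(message: str) -> dict:
          segments = {}
          for line in message.strip().split("\r"):
              fields = line.split("|")
              segments.setdefault(fields[0], []).append(fields)
          return segments

      sample = (
          "MSH|^~\\&|RIS|HOSP|PACS|HOSP|199905011200||ORM^O01|123|P|2.3\r"
          "PID|1||000123||DOE^JOHN\r"
          "OBR|1|A1001||71020^CHEST 2 VIEWS"
      )
      msg = parse_hl7(sample)
      print(msg["PID"][0][5])   # -> DOE^JOHN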

  20. The epigenetic landscape in the course of time: Conrad Hal Waddington's methodological impact on the life sciences.

    PubMed

    Baedke, Jan

    2013-12-01

    It seems that the reception of Conrad Hal Waddington's work never really gathered speed in mainstream biology. This paper, offering a transdisciplinary survey of approaches using his epigenetic landscape images, argues that (i) Waddington's legacy is much broader than is usually recognized--it is widespread across the life sciences (e.g. stem cell biology, developmental psychology and cultural anthropology). In addition, I will show that (ii) there exist as yet unrecognized heuristic roles, especially in model building and theory formation, which Waddington's images play within his work. These different methodological facets envisioned by Waddington are used as a natural framework to analyze and classify the manners of usage of epigenetic landscape images in post-Waddingtonian 'landscape approaches'. This evaluation of Waddington's pictorial legacy reveals that there are highly diverse lines of traditions in the life sciences, which are deeply rooted in Waddington's methodological work. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Delivery of laboratory data with World Wide Web technology.

    PubMed

    Hahn, A W; Leon, M A; Klein-Leon, S; Allen, G K; Boon, G D; Patrick, T B; Klimczak, J C

    1997-01-01

    We have developed an experimental World Wide Web (WWW) based system to deliver laboratory results to clinicians in our Veterinary Medical Teaching Hospital. Laboratory results are generated by the clinical pathology section of our Veterinary Medical Diagnostic Laboratory and stored in a legacy information system. This system does not interface directly to the hospital information system, and it cannot be accessed directly by clinicians. Our "meta" system first parses routine print reports and then instantiates the data into a modern, open-architecture relational database using a data model constructed with currently accepted international standards for data representation and communication. The system does not affect either of the existing legacy systems. Location-independent delivery of patient data is via a secure WWW based system which maximizes usability and allows "value-added" graphic representations. The data can be viewed with any web browser. Future extensibility and intra- and inter-institutional compatibility served as key design criteria. The system is in the process of being evaluated using accepted methods of assessment of information technologies.
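    The parse-and-store step described above can be sketched as follows, with an invented fixed-format report line and SQLite standing in for the open-architecture relational database; the field names and layout are illustrative assumptions, not the system's actual data model.

      # Illustrative sketch only: parse result lines from a legacy print report
      # (format invented here) and store them in a relational database.
      import sqlite3

      def parse_line(line):
          # invented layout: patient_id, test_code, value, units separated by whitespace
          patient_id, test_code, value, units = line.split()
          return patient_id, test_code, float(value), units

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE result (patient_id TEXT, test_code TEXT, value REAL, units TEXT)")

      report = ["P001 GLU 5.4 mmol/L", "P001 ALT 42 U/L"]
      conn.executemany("INSERT INTO result VALUES (?, ?, ?, ?)",
                       [parse_line(l) for l in report])

      for row in conn.execute("SELECT * FROM result WHERE patient_id = ?", ("P001",)):
          print(row)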

  2. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  3. Legacy effects in linked ecological-soil-geomorphic systems of drylands

    USGS Publications Warehouse

    Monger, Curtis; Sala, Osvaldo E.; Duniway, Michael C.; Goldfus, Haim; Meir, Isaac A.; Poch, Rosa M.; Throop, Heather L.; Vivoni, Enrique R.

    2015-01-01

    A legacy effect refers to the impacts that previous conditions have on current processes or properties. Legacies have been recognized by many disciplines, from physiology and ecology to anthropology and geology. Within the context of climatic change, ecological legacies in drylands (eg vegetative patterns) result from feedbacks between biotic, soil, and geomorphic processes that operate at multiple spatial and temporal scales. Legacy effects depend on (1) the magnitude of the original phenomenon, (2) the time since the occurrence of the phenomenon, and (3) the sensitivity of the ecological–soil–geomorphic system to change. Here we present a conceptual framework for legacy effects at short-term (days to months), medium-term (years to decades), and long-term (centuries to millennia) timescales, which reveals the ubiquity of such effects in drylands across research disciplines.

  4. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    NASA Technical Reports Server (NTRS)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the ICGEM's (International Center for Global Earth Models) web server, and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication-quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.
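    As a point of reference for the Bouguer anomalies mentioned above, the following sketch computes a simple Bouguer anomaly from standard textbook corrections (a free-air gradient of 0.3086 mGal/m and a Bouguer slab of 0.04193·ρ·h mGal). The station values are invented; this is not the analysis software described in the record.

```python
# A small, self-contained sketch of the simple Bouguer anomaly computation;
# constants are the usual textbook values and the sample inputs are invented.
def free_air_correction(height_m):
    """Free-air correction in mGal (0.3086 mGal per metre of elevation)."""
    return 0.3086 * height_m

def bouguer_plate_correction(height_m, density_g_cm3=2.67):
    """Bouguer slab correction in mGal: 2*pi*G*rho*h = 0.04193 * rho * h."""
    return 0.04193 * density_g_cm3 * height_m

def simple_bouguer_anomaly(g_obs_mgal, g_theoretical_mgal, height_m, density_g_cm3=2.67):
    """Simple Bouguer anomaly = observed - theoretical + free-air - Bouguer slab."""
    return (g_obs_mgal - g_theoretical_mgal
            + free_air_correction(height_m)
            - bouguer_plate_correction(height_m, density_g_cm3))

if __name__ == "__main__":
    # Hypothetical station at 500 m elevation; observed and normal gravity in mGal.
    print(round(simple_bouguer_anomaly(978765.0, 978800.0, 500.0), 2))
```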

  5. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized or formally studied, so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  6. Olympic Health Legacy; Essentials for Lasting Development of Host City.

    PubMed

    Lee, Young-Hee; Kim, Jung Moon

    2013-03-01

    The purpose of the Olympic Games should be to contribute to social development by leaving behind economic, cultural and environmental legacies for the hosting region. While tangible examples such as venues are often recognized as representative legacies of the Olympics, intangible aspects such as the environment, culture, policy and human resources have been gaining in importance. The Olympic Games, at its most fundamental level, is a sporting event. Sport is not only closely related to physical health, but is also instrumental in fostering mental health through inspiration. One of the most important sports legacies was the general change in the population's perception of sports and physical activities; due to such change, people were able to enjoy sports as part of healthy and active everyday life and benefit physically. However, compared to tangible legacies such as the facilities, social legacies such as the general health and their planning, execution and achievements are hard to monitor. Therefore, for the Olympics to leave behind socio-cultural legacies that contribute to the development of the hosting region, there must be a thorough business plan that takes into account region-specific purpose, and is divided into stages such as before, during and after the Games. Should the 2018 Winter Olympic Games hope to make a continuing contribution to its hosting region, it must leave behind 'Health Legacies' that will enhance the happiness of the hosting region's population. To this end, establishment of region-specific purpose and systematic promotion of business via detailed analysis of precedents are a must. This article aims to review the health legacy endeavors of past host cities and suggest appropriate forms of health legacy for the 2018 Pyeongchang Winter Olympic and Paralympic Games.

  7. Biological legacies: Direct early ecosystem recovery and food web reorganization after a volcanic eruption in Alaska

    USGS Publications Warehouse

    Walker, Lawrence R.; Sikes, Derek S.; DeGange, Anthony R.; Jewett, Stephen C.; Michaelson, Gary; Talbot, Sandra L.; Talbot, Stephen S.; Wang, Bronwen; Williams, Jeffrey C.

    2014-01-01

    Attempts to understand how communities assemble following a disturbance are challenged by the difficulty of determining the relative importance of stochastic and deterministic processes. Biological legacies, which result from organisms that survive a disturbance, can favour deterministic processes in community assembly and improve predictions of successional trajectories. Recently disturbed ecosystems are often so rapidly colonized by propagules that the role of biological legacies is obscured. We studied biological legacies on a remote volcanic island in Alaska following a devastating eruption where the role of colonization from adjacent communities was minimized. The role of biological legacies in the near shore environment was not clear, because although some kelp survived, they were presumably overwhelmed by the many vagile propagules in a marine environment. The legacy concept was most applicable to terrestrial invertebrates and plants that survived in remnants of buried soil that were exposed by post-eruption erosion. If the legacy concept is extended to include ex situ survival by transient organisms, then it was also applicable to the island's thousands of seabirds, because the seabirds survived the eruption by leaving the island and have begun to return and rebuild their nests as local conditions improve. Our multi-trophic examination of biological legacies in a successional context suggests that the relative importance of biological legacies varies with the degree of destruction, the availability of colonizing propagules, the spatial and temporal scales under consideration, and species interactions. Understanding the role of biological legacies in community assembly following disturbances can help elucidate the relative importance of colonists versus survivors, the role of priority effects among the colonists, convergence versus divergence of successional trajectories, the influence of spatial heterogeneity, and the role of island biogeographical concepts.

  8. Strategy in the Robotic Age: A Case for Autonomous Warfare

    DTIC Science & Technology

    2014-09-01

    6. Robots and Robotics The term robot is a loaded word. For many people it conjures a vision of fictional characters from movies like The... released in the early 1930s to review the experiences of WWI, it was censored, and a version modified to maintain the institutional legacies was... apprehensive, and doctrine was non-existent. Today, America is emerging from two wars and subsequently a war-weary public. The United States is a...

  9. SPHEREx: Science Opportunities for Solar System Astronomy

    NASA Astrophysics Data System (ADS)

    Lisse, Carey Michael; SPHEREx Science Team

    2018-01-01

    SPHEREx, a mission in NASA's Medium Explorer (MIDEX) program that was selected for Phase A study in August 2017, will perform an all-sky near-infrared spectral survey between 0.75 - 5.0 µm in R = 41 filters, and with R = 135 coverage from 4.2 - 5.0 µm, reaching L ~ 19 (5-sigma). SPHEREx has high potential for solar system science. The 96-band survey will cover the entire sky 4 times over the course of 2 years, including thousands of foreground solar system asteroids, comets, Trojans, and KBOs. By canvassing the entire solar system for 2 years, SPHEREx has the potential not only to achieve a relatively complete sensitivity-limited survey of the solar system's bodies, but also some capability to search for variation in these bodies over time. For example, the large legacy dataset of SPHEREx will update the WISE catalogue of asteroid sizes and albedos by providing a spectral survey of tens of thousands of bodies. It will provide spectral classification of hundreds of Trojan asteroids, allowing for direct comparison to the asteroid results. It will extend optical surveys of comet composition by dynamical type to hundreds of objects in the NIR, while determining water/dust/CO/CO2 activity vs distance. SPHEREx will also map in great temporal and spatial detail the zodiacal dust debris disk cloud that these bodies produce, providing an unprecedented level of information concerning the sources and sinks of this material. In this paper, we discuss the data release schedule and some example science studies the planetary astronomy community will be able to access using the SPHEREx database. We also outline existing plans within the SPHEREx team to develop software tools to enable easy access to the data and to conduct catalog searches, and ways in which the community can provide input to the SPHEREx Science Team on scientific studies and data/software requirements for those studies, enabling a large number of scientific studies while finding interesting targets for follow-up observations with Hubble, JWST, ALMA, etc.

  10. Plug-and -Play Model Architecture and Development Environment for Powertrain/Propulsion System - Final CRADA Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rousseau, Aymeric

    2013-02-01

    Several tools already exist to develop detailed plant models, including GT-Power, AMESim, CarSim, and SimScape. The objective of Autonomie is not to provide a language to develop detailed models; rather, Autonomie supports the assembly and use of models from design to simulation to analysis with complete plug-and-play capabilities. Autonomie provides a plug-and-play architecture to support this ideal use of modeling and simulation for math-based automotive control system design. Models in the standard format create building blocks, which are assembled at runtime into a simulation model of a vehicle, system, subsystem, or component to simulate. All parts of the graphical user interface (GUI) are designed to be flexible to support architectures, systems, components, and processes not yet envisioned. This allows the software to be molded to individual uses, so it can grow as requirements and technical knowledge expand. This flexibility also allows for implementation of legacy code, including models, controller code, processes, drive cycles, and post-processing equations. A library of useful and tested models and processes is included as part of the software package to support a full range of simulation and analysis tasks immediately. Autonomie also includes a configuration and database management front end to facilitate the storage, versioning, and maintenance of all required files, such as the models themselves, the model’s supporting files, test data, and reports. Over the duration of the CRADA, Argonne worked closely with GM to implement and demonstrate each one of their requirements. A use case was developed by GM for every requirement and demonstrated by Argonne. Each of the new features was verified by GM experts through a series of Gates. Once all the requirements were validated, they were presented to the directors as part of the GM Gate process.
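    The plug-and-play idea described above, in which component models are registered and assembled at runtime from a configuration, can be illustrated with a toy sketch. The registry, interfaces, and component names below are invented and bear no relation to Autonomie's actual implementation.

```python
# Toy illustration of runtime assembly of a simulation from interchangeable
# component models; all names and the crude models are invented.
COMPONENT_REGISTRY = {}

def register(kind):
    def wrap(cls):
        COMPONENT_REGISTRY[kind] = cls
        return cls
    return wrap

@register("engine.simple")
class SimpleEngine:
    def torque(self, throttle):          # N*m, crude placeholder model
        return 250.0 * throttle

@register("driver.constant")
class ConstantDriver:
    def command(self, t):                # constant 40% throttle demand
        return 0.4

def assemble(config):
    """Build a 'vehicle' dict from a configuration naming the plug-in models."""
    return {slot: COMPONENT_REGISTRY[kind]() for slot, kind in config.items()}

if __name__ == "__main__":
    vehicle = assemble({"driver": "driver.constant", "engine": "engine.simple"})
    for t in range(3):
        throttle = vehicle["driver"].command(t)
        print(t, vehicle["engine"].torque(throttle))
```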

  11. Wings In Orbit: Scientific and Engineering Legacies of the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Hale, N. Wayne (Editor); Lulla, Kamlesh (Editor); Lane, Helen W. (Editor); Chapline, Gail (Editor)

    2010-01-01

    This Space Shuttle book project reviews Wings In Orbit: scientific and engineering legacies of the Space Shuttle. The contents include: 1) Magnificent Flying Machine-A Cathedral to Technology; 2) The Historical Legacy; 3) The Shuttle and its Operations; 4) Engineering Innovations; 5) Major Scientific Discoveries; 6) Social, Cultural, and Educational Legacies; 7) Commercial Aerospace Industries and Spin-offs; and 8) The Shuttle continuum, Role of Human Spaceflight.

  12. Evaluating the sale of a nonprofit health system to a for-profit hospital management company: the Legacy Experience.

    PubMed Central

    King, J G; Avery, J E

    1999-01-01

    OBJECTIVE: To introduce and develop a decision model that can be used by the leadership of nonprofit healthcare organizations to assist them in evaluating whether selling to a for-profit organization is in their community's best interest. STUDY SETTING/DATA SOURCES: A case study of the planning process and decision model that Legacy Health System used to evaluate whether to sell to a for-profit hospital management company and use the proceeds of the sale to establish a community health foundation. Data sources included financial statements of benchmark organizations, internal company records, and numerous existing studies. STUDY DESIGN: The development of the multivariate model was based on insight gathered through a review of the current literature regarding the conversion of nonprofit healthcare organizations. DATA COLLECTION/EXTRACTION METHODS: The effect that conversion from nonprofit to for-profit status would have on each variable was estimated based on assumptions drawn from the current literature and on an analysis of Legacy and for-profit hospital company data. PRINCIPAL FINDINGS: The results of the decision model calculations indicate that the sale of Legacy to a for-profit firm and the subsequent creation of a community foundation would have a negative effect on the local community. CONCLUSIONS: The use of the decision model enabled senior management and trustees to systematically address the conversion question and to conclude that continuing to operate as a nonprofit organization would provide the most benefit to the local community. The model will prove useful to organizations that decide to sell to a for-profit organization as well as those that choose to continue nonprofit operations. For those that decide to sell, the model will assist in minimizing any potential negative effect that conversion may have on the community. The model will help those who choose not to sell to develop a better understanding of the organization's value to the community. PMID:10201854

  13. From Bridges and Rockets, Lessons for Software Systems

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    2004-01-01

    Although differences exist between building software systems and building physical structures such as bridges and rockets, enough similarities exist that software engineers can learn lessons from failures in traditional engineering disciplines. This paper draws lessons from two well-known failures: the collapse of the Tacoma Narrows Bridge in 1940 and the destruction of the space shuttle Challenger in 1986, and applies these lessons to software system development. The following specific applications are made: (1) the verification and validation of a software system should not be based on a single method, or a single style of methods; (2) the tendency to embrace the latest fad should be overcome; and (3) the introduction of software control into safety-critical systems should be done cautiously.

  14. Monte Carlo Methods in Materials Science Based on FLUKA and ROOT

    NASA Technical Reports Server (NTRS)

    Pinsky, Lawrence; Wilson, Thomas; Empl, Anton; Andersen, Victor

    2003-01-01

    A comprehensive understanding of mitigation measures for space radiation protection necessarily involves the relevant fields of nuclear physics and particle transport modeling. One method of modeling the interaction of radiation traversing matter is Monte Carlo analysis, a subject that has been evolving since the very advent of nuclear reactors and particle accelerators in experimental physics. Countermeasures for radiation protection from neutrons near nuclear reactors, for example, were an early application, and Monte Carlo methods were quickly adapted to this general field of investigation. The project discussed here is concerned with taking the latest tools and technology in Monte Carlo analysis and adapting them to space applications such as radiation shielding design for spacecraft, as well as investigating how next-generation Monte Carlos can complement the existing analytical methods currently used by NASA. We have chosen to employ the Monte Carlo program known as FLUKA (a legacy acronym based on the German for FLUctuating KAscade) to simulate all of the particle transport, and the CERN-developed graphical-interface object-oriented analysis software called ROOT. One aspect of space radiation analysis for which Monte Carlos are particularly suited is the study of secondary radiation produced as albedos in the vicinity of the structural geometry involved. This broad goal of simulating space radiation transport through the relevant materials employing the FLUKA code necessarily requires the addition of the capability to simulate all heavy-ion interactions from 10 MeV/A up to the highest conceivable energies. For all energies above 3 GeV/A the Dual Parton Model (DPM) is currently used, although the possible improvement of the DPMJET event generator for energies 3-30 GeV/A is being considered. One of the major tasks still facing us is the provision for heavy ion interactions below 3 GeV/A. The ROOT interface is being developed in conjunction with the CERN ALICE (A Large Ion Collisions Experiment) software team through an adaptation of their existing AliROOT (ALICE Using ROOT) architecture. In order to check our progress against actual data, we have chosen to simulate the ATIC14 (Advanced Thin Ionization Calorimeter) cosmic-ray astrophysics balloon payload as well as neutron fluences in the Mir spacecraft. This paper contains a summary of the status of this project, and a roadmap to its successful completion.

  15. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
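    The kind of evaluation that FTA and SFTA tools automate can be sketched with a minimal fault tree of AND/OR gates over basic events, assuming independent failure probabilities. The event names and probabilities below are invented; this does not represent any particular commercial tool.

```python
# Minimal fault-tree sketch: AND/OR gates over basic events with assumed
# independent failure probabilities.
from functools import reduce

def basic(p):
    return ("basic", p)

def AND(*children):
    return ("and", children)

def OR(*children):
    return ("or", children)

def probability(node):
    kind, payload = node
    if kind == "basic":
        return payload
    child_ps = [probability(c) for c in payload]
    if kind == "and":                       # AND gate: all children must fail
        return reduce(lambda a, b: a * b, child_ps, 1.0)
    # OR gate: at least one child fails (independence assumed)
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), child_ps, 1.0)

if __name__ == "__main__":
    # Hypothetical top event: software fault AND (sensor fault OR watchdog fault)
    tree = AND(basic(1e-3), OR(basic(5e-4), basic(2e-4)))
    print(f"Top-event probability: {probability(tree):.2e}")
```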

  16. GALFIT-CORSAIR: Implementing the Core-Sérsic Model Into GALFIT

    NASA Astrophysics Data System (ADS)

    Bonfini, Paolo

    2014-10-01

    We introduce GALFIT-CORSAIR: a publicly available, fully retro-compatible modification of the 2D fitting software GALFIT (v.3) which adds an implementation of the core-Sersic model. We demonstrate the software by fitting the images of NGC 5557 and NGC 5813, which have been previously identified as core-Sersic galaxies by their 1D radial light profiles. These two examples are representative of different dust obscuration conditions, and of bulge/disk decomposition. To perform the analysis, we obtained deep Hubble Legacy Archive (HLA) mosaics in the F555W filter (~V-band). We successfully reproduce the results of the previous 1D analysis, modulo the intrinsic differences between the 1D and the 2D fitting procedures. The code and the analysis procedure described here have been developed for the first coherent 2D analysis of a sample of core-Sersic galaxies, which will be presented in a forth-coming paper. As the 2D analysis provides better constraining on multi-component fitting, and is fully seeing-corrected, it will yield complementary constraints on the missing mass in depleted galaxy cores.

  17. Internet Distribution of Spacecraft Telemetry Data

    NASA Technical Reports Server (NTRS)

    Specht, Ted; Noble, David

    2006-01-01

    Remote Access Multi-mission Processing and Analysis Ground Environment (RAMPAGE) is a Java-language server computer program that enables near-real-time display of spacecraft telemetry data on any authorized client computer that has access to the Internet and is equipped with Web-browser software. In addition to providing a variety of displays of the latest available telemetry data, RAMPAGE can deliver notification of an alarm by electronic mail. Subscribers can then use RAMPAGE displays to determine the state of the spacecraft and formulate a response to the alarm, if necessary. A user can query spacecraft mission data in either binary or comma-separated-value format by use of a Web form or a Practical Extraction and Reporting Language (PERL) script to automate the query process. RAMPAGE runs on Linux and Solaris server computers in the Ground Data System (GDS) of NASA's Jet Propulsion Laboratory and includes components designed specifically to make it compatible with legacy GDS software. The client/server architecture of RAMPAGE and the use of the Java programming language make it possible to utilize a variety of competitive server and client computers, thereby also helping to minimize costs.
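    The record describes querying mission data in binary or comma-separated-value form over the web. The sketch below shows a generic client for that style of interface; the endpoint URL, query parameters, and channel names are hypothetical and are not the actual RAMPAGE interface.

```python
# Hypothetical client-side sketch of querying telemetry as CSV over HTTP.
# The URL scheme, parameters, and channel names are invented for illustration.
import csv
import io
import urllib.parse
import urllib.request

def fetch_channel_csv(base_url, channel, start, end):
    """Fetch one telemetry channel as CSV rows from a (hypothetical) HTTP endpoint."""
    query = urllib.parse.urlencode({"channel": channel, "start": start,
                                    "end": end, "format": "csv"})
    with urllib.request.urlopen(f"{base_url}?{query}") as resp:
        return list(csv.DictReader(io.StringIO(resp.read().decode("utf-8"))))

if __name__ == "__main__":
    # Offline demonstration with a canned response; against a live server one
    # would call fetch_channel_csv("https://host/telemetry", "BATT_V", t0, t1).
    canned = "time,channel,value\n2006-01-01T00:00:00,BATT_V,28.1\n2006-01-01T00:00:10,BATT_V,28.0\n"
    for row in csv.DictReader(io.StringIO(canned)):
        print(row["time"], row["value"])
```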

  18. Advancing Understanding of the Role of Belowground Processes in Terrestrial Carbon Sinks trhrough Ground-Penetrating Radar. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Day, Frank P.

    2015-02-06

    Coarse roots play a significant role in belowground carbon cycling and will likely play an increasingly crucial role in belowground carbon sequestration as atmospheric CO2 levels continue to rise, yet they are one of the most difficult ecosystem parameters to quantify. Despite promising results with ground-penetrating radar (GPR) as a nondestructive method of quantifying biomass of coarse roots, this application of GPR is in its infancy and neither the complete potential nor limitations of the technology have been fully evaluated. The primary goals and questions of this study fell into four groups: (1) GPR methods: Can GPR detect change in root biomass over time, differentiate live roots from dead roots, differentiate between coarse roots, fine roots bundled together, and a fine root mat, remain effective with varied soil moisture, and detect shadowed roots (roots hidden below larger roots); (2) CO2 enrichment study at Kennedy Space Center in Brevard County, Florida: Are there post-fire legacy effects of CO2 fertilization on plant carbon pools following the end of CO2 application? (3) Disney Wilderness Study: What is the overall coarse root biomass and potential for belowground carbon storage in a restored longleaf pine flatwoods system? Can GPR effectively quantify coarse roots in soils that are wetter than the previous sites and that have a high percentage of saw palmetto rhizomes present? (4) Can GPR accurately represent root architecture in a three-dimensional model? When the user is familiar with the equipment and software in a setting that minimizes unsuitable conditions, GPR is a relatively precise, non-destructive, useful tool for estimating coarse root biomass. However, there are a number of cautions and guidelines that should be followed to minimize inaccuracies or situations that are untenable for GPR use. GPR appears to be precise as it routinely predicts highly similar values for a given area across multiple scanning events; however, it appears to lack sufficient accuracy at small scales. Knowledge of soil conditions and their effects on GPR wave propagation and reception is paramount for the collection of useful data. Strong familiarity with the software and equipment is both important and necessary for GPR use in estimating coarse root biomass. GPR must be utilized at low soil moisture levels in order to accurately represent existing coarse root structures. Our results from Disney Wilderness Preserve highlight the need for a strong understanding of the limitations of GPR, specifically knowledge of root structures (saw palmetto rhizomes) or environmental factors (low moisture content) that may hinder its application within a given system. The 3D modeling of coarse roots with GPR appears quite promising, as it has become more accurate and precise as the software has advanced and become more robust, but there is still a need for more precision before it will likely be able to model anything more than simple root systems composed mostly of large diameter roots. Our results from Kennedy Space Center suggest that there are legacy effects from CO2 fertilization in the form of more root mass providing a greater capacity for aboveground plant regrowth following fire, even 7 years after treatment ended.

  19. A genomic overview of the population structure of Salmonella.

    PubMed

    Alikhan, Nabil-Fareed; Zhou, Zhemin; Sergeant, Martin J; Achtman, Mark

    2018-04-01

    For many decades, Salmonella enterica has been subdivided by serological properties into serovars or further subdivided for epidemiological tracing by a variety of diagnostic tests with higher resolution. Recently, it has been proposed that so-called eBurst groups (eBGs) based on the alleles of seven housekeeping genes (legacy multilocus sequence typing [MLST]) corresponded to natural populations and could replace serotyping. However, this approach lacks the resolution needed for epidemiological tracing, and the existence of natural populations had not been validated by independent criteria. Here, we describe EnteroBase, a web-based platform that assembles draft genomes from Illumina short reads in the public domain or that are uploaded by users. EnteroBase implements legacy MLST as well as ribosomal gene MLST (rMLST), core genome MLST (cgMLST), and whole genome MLST (wgMLST) and currently contains over 100,000 assembled genomes from Salmonella. It also provides graphical tools for visual interrogation of these genotypes and those based on core single nucleotide polymorphisms (SNPs). eBGs based on legacy MLST are largely consistent with eBGs based on rMLST, thus demonstrating that these correspond to natural populations. rMLST also facilitated the selection of representative genotypes for SNP analyses of the entire breadth of diversity within Salmonella. In contrast, cgMLST provides the resolution needed for epidemiological investigations. These observations show that genomic genotyping, with the assistance of EnteroBase, can be applied at all levels of diversity within the Salmonella genus.
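    The allele-based genotyping behind legacy MLST and cgMLST can be illustrated by counting allele differences between profiles and grouping isolates whose pairwise distance falls below a threshold. The profiles and threshold below are invented, and the grouping is a naive single-linkage sketch rather than EnteroBase's actual algorithm.

```python
# Sketch of allele-profile comparison in the spirit of (cg)MLST: count allele
# differences and group isolates by single linkage under a distance threshold.
def allele_distance(profile_a, profile_b):
    """Number of loci at which two allele profiles differ (missing loci skipped)."""
    shared = set(profile_a) & set(profile_b)
    return sum(1 for locus in shared if profile_a[locus] != profile_b[locus])

def single_linkage_groups(profiles, threshold):
    names = list(profiles)
    groups = {n: {n} for n in names}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if allele_distance(profiles[a], profiles[b]) <= threshold:
                merged = groups[a] | groups[b]
                for member in merged:
                    groups[member] = merged
    return {frozenset(g) for g in groups.values()}

if __name__ == "__main__":
    # Invented three-locus profiles for three isolates.
    toy = {"iso1": {"aroC": 1, "dnaN": 2, "hemD": 3},
           "iso2": {"aroC": 1, "dnaN": 2, "hemD": 4},
           "iso3": {"aroC": 9, "dnaN": 8, "hemD": 7}}
    print(single_linkage_groups(toy, threshold=1))
```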

  20. Frederik Ruysch (1638-1731): Historical perspective and contemporary analysis of his teratological legacy.

    PubMed

    Boer, Lucas; Radziun, Anna B; Oostra, Roelof-Jan

    2017-01-01

    The Peter the Great Museum of Anthropology and Ethnography (Kunstkamera) in Saint Petersburg is the oldest museum in Russia. It keeps the remains of the anatomical collection of the world-famous 17th century Dutch anatomist Frederik Ruysch. This unique collection was bought and shipped in 1717 by Czar Peter the Great, and presently still comprises more than 900 specimens, a modest number of which concerns specimens with congenital anomalies. We searched for teratological clues in the existing collection and in all his descriptions and correspondence regarding specimens and cases he encountered during his career as doctor anatomiae and chief instructor of the surgeons and midwives in Amsterdam. A total of 63 teratological specimens and case descriptions were identified in this legacy, including some exceedingly rare anomalies. As it turns out, Ruysch was the first to describe several of the conditions we encountered, including intracranial teratoma, enchondromatosis, and Majewski syndrome. Although his comments pose an interesting view on how congenital anomalies were scientifically perceived in early 18th century Europe, Ruysch mostly refrained from explaining the causes of the conditions he encountered. Instead, he dedicated himself to careful descriptions of his specimens. Almost 300 years after his demise, Ruysch's legacy still impresses and inspires both scientists and lay men. © 2016 The Authors. American Journal of Medical Genetics Part A Published by Wiley Periodicals, Inc.

  1. The colonial context of Filipino American immigrants' psychological experiences.

    PubMed

    David, E J R; Nadal, Kevin L

    2013-07-01

    Because of the long colonial history of Filipinos and the highly Americanized climate of postcolonial Philippines, many scholars from various disciplines have speculated that colonialism and its legacies may play major roles in Filipino emigration to the United States. However, there are no known empirical studies in psychology that specifically investigate whether colonialism and its effects have influenced the psychological experiences of Filipino American immigrants prior to their arrival in the United States. Further, there is no existing empirical study that specifically investigates the extent to which colonialism and its legacies continue to influence Filipino American immigrants' mental health. Thus, using interviews (N = 6) and surveys (N = 219) with Filipino American immigrants, two studies found that colonialism and its consequences are important factors to consider when conceptualizing the psychological experiences of Filipino American immigrants. Specifically, the findings suggest that (a) Filipino American immigrants experienced ethnic and cultural denigration in the Philippines prior to their U.S. arrival, (b) ethnic and cultural denigration in the Philippines and in the United States may lead to the development of colonial mentality (CM), and (c) that CM may have negative mental health consequences among Filipino American immigrants. The two studies' findings suggest that the Filipino American immigration experience cannot be completely captured by the voluntary immigrant narrative, as they provide empirical support to the notion that the Filipino American immigration experience needs to be understood in the context of colonialism and its most insidious psychological legacy: CM. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  2. Involuntary eye motion correction in retinal optical coherence tomography: Hardware or software solution?

    PubMed

    Baghaie, Ahmadreza; Yu, Zeyun; D'Souza, Roshan M

    2017-04-01

    In this paper, we review state-of-the-art techniques to correct eye motion artifacts in Optical Coherence Tomography (OCT) imaging. The methods for eye motion artifact reduction can be categorized into two major classes: (1) hardware-based techniques and (2) software-based techniques. In the first class, additional hardware is mounted onto the OCT scanner to gather information about the eye motion patterns during OCT data acquisition. This information is later processed and applied to the OCT data for creating an anatomically correct representation of the retina, either in an offline or online manner. In software-based techniques, the motion patterns are approximated either by comparing the acquired data to a reference image, or by considering some prior assumptions about the nature of the eye motion. Careful investigation of the most common methods in the field provides invaluable insight regarding future directions of the research in this area. The challenge in hardware-based techniques lies in the implementation aspects of particular devices. However, the results of these techniques are superior to those obtained from software-based techniques because they are capable of capturing secondary data related to eye motion during OCT acquisition. Software-based techniques, on the other hand, achieve moderate success, and their performance is highly dependent on the quality of the OCT data in terms of the amount of motion artifacts contained in them. However, they are still relevant to the field since they are the sole class of techniques with the ability to be applied to legacy data acquired using systems that do not have extra hardware to track eye motion. Copyright © 2017 Elsevier B.V. All rights reserved.
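    A common software-based strategy mentioned above is registering acquired data against a reference image. The sketch below estimates an integer axial shift of a synthetic A-scan by cross-correlation and undoes it; it assumes rigid, one-dimensional motion and invented data, and is only a simplified illustration of the registration idea, not any method from the review.

```python
# Simplified registration sketch: estimate and correct an integer axial shift
# between a synthetic A-scan and a reference via cross-correlation.
import numpy as np

def estimate_shift(reference, scan):
    """Integer shift of `scan` relative to `reference` via full cross-correlation."""
    corr = np.correlate(scan - scan.mean(), reference - reference.mean(), mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

def correct(reference, scan):
    """Undo the estimated shift (rigid 1D motion assumed)."""
    return np.roll(scan, -estimate_shift(reference, scan))

if __name__ == "__main__":
    ref = np.zeros(64)
    ref[20:25] = 1.0                               # synthetic A-scan with one bright layer
    moved = np.roll(ref, 3)                        # simulate a 3-pixel axial displacement
    print(estimate_shift(ref, moved))              # expected: 3
    print(np.allclose(correct(ref, moved), ref))   # expected: True
```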

  3. Analyzing and designing object-oriented missile simulations with concurrency

    NASA Astrophysics Data System (ADS)

    Randorf, Jeffrey Allen

    2000-11-01

    A software object model for the six degree-of-freedom missile modeling domain is presented. As a precursor, a domain analysis of the missile modeling domain was started, based on the Feature-Oriented Domain Analysis (FODA) technique described by the Software Engineering Institute (SEI). It was subsequently determined that the FODA methodology is functionally equivalent to the Object Modeling Technique. The analysis used legacy software documentation and code from the ENDOSIM, KDEC, and TFrames 6-DOF modeling tools, as well as other technical literature. The SEI Object Connection Architecture (OCA) was the template for designing the object model. Three variants of the OCA were considered---a reference structure, a recursive structure, and a reference structure with augmentation for flight vehicle modeling. The reference OCA design option was chosen for maintaining simplicity while not compromising the expressive power of the OMT model. The missile architecture was then analyzed for potential areas of concurrent computing. It was shown how protected objects could be used for data passing between OCA object managers, allowing concurrent access without changing the OCA reference design intent or structure. The implementation language was the 1995 release of Ada. It was shown how OCA software components can be expressed as Ada child packages. While acceleration of several low-level and higher-level operations is possible on appropriate hardware, there was a 33% degradation of 4th-order Runge-Kutta integrator performance for two simultaneous ordinary differential equations using Ada tasking on a single-processor machine. The Defense Department's High Level Architecture was introduced and explained in context with the OCA. It was shown that the HLA and OCA are not mutually exclusive architectures, but complementary. HLA was shown as an interoperability solution, with the OCA as an architectural vehicle for software reuse. Further directions for implementing a 6-DOF missile modeling environment are discussed.
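    For readers unfamiliar with the integrator whose performance is discussed above, the following is a plain sketch of a classical 4th-order Runge-Kutta step applied to two simultaneous ordinary differential equations (a harmonic oscillator). It is a generic illustration in Python, not the Ada 95 implementation from the study.

```python
# Classical 4th-order Runge-Kutta step for a system of two first-order ODEs.
def rk4_step(f, t, y, h):
    """Advance y' = f(t, y) by one step of size h; y is a list of state values."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * k for yi, k in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * k for yi, k in zip(y, k2)])
    k4 = f(t + h, [yi + h * k for yi, k in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def harmonic_oscillator(t, y):
    # y = [position, velocity]; x'' = -x written as two first-order equations
    return [y[1], -y[0]]

if __name__ == "__main__":
    t, y, h = 0.0, [1.0, 0.0], 0.01
    for _ in range(628):                 # integrate roughly one period (2*pi)
        y = rk4_step(harmonic_oscillator, t, y, h)
        t += h
    print(y)                             # expect approximately [1.0, 0.0]
```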

  4. Additive Manufacturing: Which DLA-Managed Legacy Parts are Potential AM Candidates

    DTIC Science & Technology

    2016-07-01

    Additive Manufacturing: Which DLA-Managed Legacy Parts Are Potential AM Candidates? Report DL501T1, July 2016. Thomas K. Parks. LMI © 2016. All rights reserved.

  5. A Study to Identify the Critical Success Factors for ERP Implementation in an Indian SME: A Case Based Approach

    NASA Astrophysics Data System (ADS)

    Upadhyay, Parijat; Dan, Pranab K.

    To achieve synergy across product lines, businesses are implementing a set of standard business applications and consistent data definitions across all business units. ERP packages are extremely useful in integrating a global company and provide a "common language" throughout the company. Companies are not only implementing a standardized application but are also moving to a common architecture and infrastructure. For many companies, a standardized software rollout is a good time to do some consolidation of their IT infrastructure across various locations. Companies are also finding that the ERP solutions help them get rid of their legacy systems, most of which may not be compliant with modern-day business requirements.

  6. Analysis of Ten Reverse Engineering Tools

    NASA Astrophysics Data System (ADS)

    Koskinen, Jussi; Lehmonen, Tero

    Reverse engineering tools can be used in satisfying the information needs of software maintainers. Especially in case of maintaining large-scale legacy systems tool support is essential. Reverse engineering tools provide various kinds of capabilities to provide the needed information to the tool user. In this paper we analyze the provided capabilities in terms of four aspects: provided data structures, visualization mechanisms, information request specification mechanisms, and navigation features. We provide a compact analysis of ten representative reverse engineering tools for supporting C, C++ or Java: Eclipse Java Development Tools, Wind River Workbench (for C and C++), Understand (for C++), Imagix 4D, Creole, Javadoc, Javasrc, Source Navigator, Doxygen, and HyperSoft. The results of the study supplement the earlier findings in this important area.

  7. SMURF: SubMillimeter User Reduction Facility

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Chapin, Edward L.; Berry, David S.; Gibb, Andy G.; Tilanus, Remo P. J.; Balfour, Jennifer; Tilanus, Vincent; Currie, Malcolm J.

    2013-10-01

    SMURF reduces submillimeter single-dish continuum and heterodyne data. It is mainly targeted at data produced by the James Clerk Maxwell Telescope, but data from other telescopes have been reduced using the package. SMURF is released as part of the bundle that comprises Starlink (ascl:1110.012) and most of the packages that use it. The two key commands are MAKEMAP for the creation of maps from submillimeter continuum data and MAKECUBE for the creation of data cubes from heterodyne array instruments. The software can also convert data from legacy JCMT file formats to the modern form to allow it to be processed by MAKECUBE. SMURF is a core component of the ORAC-DR (ascl:1310.001) data reduction pipeline for JCMT.

  8. Investigation into the development of computer aided design software for space based sensors

    NASA Technical Reports Server (NTRS)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software to be used to perform radiometric sensor design. The software package will be referred to as SCAD and is directed toward the preliminary phase of the design of space-based sensor systems. The approach being followed is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis will be directed toward the development of a shell containing menus, smart defaults, and interfaces, which can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following the development of the shell, the development of SCAD is planned as chiefly selection and integration of appropriate building blocks. The phase one development activities have included: the selection of hardware which will be used with SCAD; the determination of the scope of SCAD; the preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and then establishing a strategy for binding the software modules into an easy-to-use tool kit.

  9. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Gibbs, Norman

    1988-01-01

    The goals of the Software Engineering Institute's Education Program are as follows: to increase the number of highly qualified software engineers--new software engineers and existing practitioners; and to be the leading center of expertise for software engineering education and training. A discussion of these goals is presented in vugraph form.

  10. Nitrate retention capacity of milldam-impacted legacy sediments and relict A horizon soils

    NASA Astrophysics Data System (ADS)

    Weitzman, Julie N.; Kaye, Jason P.

    2017-05-01

    While eutrophication is often attributed to contemporary nutrient pollution, there is growing evidence that past practices, like the accumulation of legacy sediment behind historic milldams, are also important. Given their prevalence, there is a critical need to understand how N flows through, and is retained in, legacy sediments to improve predictions and management of N transport from uplands to streams in the context of climatic variability and land-use change. Our goal was to determine how nitrate (NO3-) is cycled through the soil of a legacy-sediment-strewn stream before and after soil drying. We extracted 10.16 cm radius intact soil columns that extended 30 cm into each of the three significant soil horizons at Big Spring Run (BSR) in Lancaster, Pennsylvania: surface legacy sediment characterized by a newly developing mineral A horizon soil, mid-layer legacy sediment consisting of mineral B horizon soil, and a dark, organic-rich, buried relict A horizon soil. Columns were first preincubated at field capacity and then isotopically labeled nitrate (15NO3-) was added and allowed to drain to estimate retention. The columns were then air-dried and subsequently rewet with N-free water and allowed to drain to quantify the drought-induced loss of 15NO3- from the different horizons. We found the highest initial 15N retention in the mid-layer legacy sediment (17 ± 4 %) and buried relict A soil (14 ± 3 %) horizons, with significantly lower retention in the surface legacy sediment (6 ± 1 %) horizon. As expected, rewetting dry soil resulted in 15N losses in all horizons, with the greatest losses in the buried relict A horizon soil, followed by the mid-layer legacy sediment and surface legacy sediment horizons. The 15N remaining in the soil following the post-drought leaching was highest in the mid-layer legacy sediment, intermediate in the surface legacy sediment, and lowest in the buried relict A horizon soil. Fluctuations in the water table at BSR, which affect saturation of the buried relict A horizon soil, could lead to large losses of NO3- from the soil, while vertical flow through the legacy-sediment-rich soil profile that originates in the surface has the potential to retain more NO3-. Restoration that seeks to reconnect the groundwater and surface water, which will decrease the number of drying-rewetting events imposed on the relict A horizon soils, could initially lead to increased losses of NO3- to nearby stream waters.

  11. Sport and exercise medicine and the Olympic health legacy

    PubMed Central

    2012-01-01

    London 2012 is the first Olympic and Paralympic Games to explicitly try and develop socioeconomic legacies for which success indicators are specified - the highest profile of which was to deliver a health legacy by getting two million more people more active by 2012. This editorial highlights how specialists in Sport and Exercise Medicine can contribute towards increasing physical activity participation in the UK, as well as how the National Centre for Sport and Exercise Medicine might be a useful vehicle for delivering an Olympic health legacy. Key challenges are also discussed such as acquisition of funding to support new physical activity initiatives, appropriate allocation of resources, and how to assess the impact of legacy initiatives. PMID:22813079

  12. Sport and exercise medicine and the Olympic health legacy.

    PubMed

    Tew, Garry A; Copeland, Robert J; Till, Simon H

    2012-07-19

    London 2012 is the first Olympic and Paralympic Games to explicitly try and develop socioeconomic legacies for which success indicators are specified - the highest profile of which was to deliver a health legacy by getting two million more people more active by 2012. This editorial highlights how specialists in Sport and Exercise Medicine can contribute towards increasing physical activity participation in the UK, as well as how the National Centre for Sport and Exercise Medicine might be a useful vehicle for delivering an Olympic health legacy. Key challenges are also discussed such as acquisition of funding to support new physical activity initiatives, appropriate allocation of resources, and how to assess the impact of legacy initiatives.

  13. Prognostics and Health Management in Nuclear Power Plants: A Review of Technologies and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie B.; Ramuhalli, Pradeep; Bond, Leonard J.

    This report reviews the current state of the art of prognostics and health management (PHM) for nuclear power systems and related technology currently applied in the field or under development in other technological application areas, as well as key research needs and technical gaps for increased use of PHM in nuclear power systems. The historical approach to monitoring and maintenance in nuclear power plants (NPPs), including the Maintenance Rule for active components and Aging Management Plans for passive components, is reviewed. An outline is given for the technical and economic challenges that make PHM attractive for both legacy plants through Light Water Reactor Sustainability (LWRS) and new plant designs. There is a general introduction to PHM systems for monitoring, fault detection and diagnostics, and prognostics in other, non-nuclear fields. The state of the art for health monitoring in nuclear power systems is reviewed. A discussion of related technologies that support the application of PHM systems in NPPs, including digital instrumentation and control systems, wired and wireless sensor technology, and PHM software architectures, is provided. Appropriate codes and standards for PHM are discussed, along with a description of the ongoing work in developing additional necessary standards. Finally, an outline of key research needs and opportunities that must be addressed in order to support the application of PHM in legacy and new NPPs is presented.

  14. Web-Based Geographic Information System Tool for Accessing Hanford Site Environmental Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triplett, Mark B.; Seiple, Timothy E.; Watson, David J.

    Data volume, complexity, and access issues pose severe challenges for analysts, regulators and stakeholders attempting to efficiently use legacy data to support decision making at the U.S. Department of Energy’s (DOE) Hanford Site. DOE has partnered with the Pacific Northwest National Laboratory (PNNL) on the PHOENIX (PNNL-Hanford Online Environmental Information System) project, which seeks to address data access, transparency, and integration challenges at Hanford to provide effective decision support. PHOENIX is a family of spatially-enabled web applications providing quick access to decades of valuable scientific data and insight through intuitive query, visualization, and analysis tools. PHOENIX realizes broad, public accessibility by relying only on ubiquitous web browsers, eliminating the need for specialized software. It accommodates a wide range of users with intuitive user interfaces that require little or no training to quickly obtain and visualize data. Currently, PHOENIX is actively hosting three applications focused on groundwater monitoring, groundwater clean-up performance reporting, and in-tank monitoring. PHOENIX-based applications are being used to streamline investigative and analytical processes at Hanford, saving time and money. But more importantly, by integrating previously isolated datasets and developing relevant visualization and analysis tools, PHOENIX applications are enabling DOE to discover new correlations hidden in legacy data, allowing them to more effectively address complex issues at Hanford.

  15. Development of a Unix/VME data acquisition system

    NASA Astrophysics Data System (ADS)

    Miller, M. C.; Ahern, S.; Clark, S. M.

    1992-01-01

    The current status of a Unix-based VME data acquisition development project is described. It is planned to use existing Fortran data collection software to drive the existing CAMAC electronics via a VME CAMAC branch driver card and associated Daresbury Unix driving software. The first usable Unix driver has been written and produces single-action CAMAC cycles from test software. The data acquisition code has been implemented in test mode under Unix with few problems and effort is now being directed toward finalizing calls to the CAMAC-driving software and ultimate evaluation of the complete system.

  16. Observatory software for the Maunakea Spectroscopic Explorer

    NASA Astrophysics Data System (ADS)

    Vermeulen, Tom; Isani, Sidik; Withington, Kanoa; Ho, Kevin; Szeto, Kei; Murowinski, Rick

    2016-07-01

    The Canada-France-Hawaii Telescope is currently in the conceptual design phase to redevelop its facility into the new Maunakea Spectroscopic Explorer (MSE). MSE is designed to be the largest non-ELT optical/NIR astronomical telescope, and will be a fully dedicated facility for multi-object spectroscopy over a broad range of spectral resolutions. This paper outlines the software and control architecture envisioned for the new facility. The architecture will be designed around much of the existing software infrastructure currently used at CFHT as well as the latest proven open-source software. CFHT plans to minimize risk and development time by leveraging existing technology.

  17. Test Operations Procedure (TOP) 02-1-100 Anthropomorphic Test Device Operation and Setup

    DTIC Science & Technology

    2016-02-09

    Using the Data Acquisition for Anthropomorphic Test Devices (D4D) in vehicle vulnerability testing. The D4D is an onboard data acquisition system (DAS) that is intended for use with the Hybrid II/III ATDs. The D4D was developed to augment the existing DAS, the legacy Versatile Information Systems Integrated On...

  18. How black holes saved relativity

    NASA Astrophysics Data System (ADS)

    Prescod-Weinstein, Chanda

    2016-02-01

    While there have been many popular-science books on the historical and scientific legacy of Albert Einstein's general theory of relativity, a gap exists in the literature for a definitive, accessible history of the theory's most famous offshoot: black holes. In Black Hole, the science writer Marcia Bartusiak aims for a discursive middle ground, writing solely about black holes at a level suitable for both high-school students and more mature readers while also giving some broader scientific context for black-hole research.

  19. MATERNAL PERCEPTIONS OF PARENTING FOLLOWING AN EVIDENCE-BASED PARENTING PROGRAM: A QUALITATIVE STUDY OF LEGACY FOR CHILDRENTM.

    PubMed

    Hartwig, Sophie A; Robinson, Lara R; Comeau, Dawn L; Claussen, Angelika H; Perou, Ruth

    2017-07-01

    This article presents the findings of a qualitative study of maternal perceptions of parenting following participation in Legacy for Children TM (Legacy), an evidence-based parenting program for low-income mothers of young children and infants. To further examine previous findings and better understand participant experiences, we analyzed semistructured focus-group discussions with predominantly Hispanic and Black, non-Hispanic Legacy mothers at two sites (n = 166) using thematic analysis and grounded theory techniques. The qualitative study presented here investigated how mothers view their parenting following participation in Legacy, allowing participants to describe their experience with the program in their own words, thus capturing an "insider" perspective. Mothers at both sites communicated knowledge and use of positive parenting practices targeted by the goals of Legacy; some site-specific differences emerged related to these parenting practices. These findings align with the interpretation of quantitative results from the randomized controlled trials and further demonstrate the significance of the Legacy program in promoting positive parenting for mothers living in poverty. This study emphasizes the importance of understanding real-world context regarding program efficacy and the benefit of using qualitative research to understand participant experiences. © 2017 Michigan Association for Infant Mental Health.

  20. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.

  1. Automated Software Development Workstation (ASDW)

    NASA Technical Reports Server (NTRS)

    Fridge, Ernie

    1990-01-01

    Software development is a serious bottleneck in the construction of complex automated systems. Increased reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment composed of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and providing solutions for handling very large libraries of reusable components.
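    The parts-composition idea described above can be sketched with a toy catalog of reusable parts, a specification that wires them together, and a trivial generator that emits a runnable script. The catalog entries and specification format below are invented and are not the ASDW's actual parts description language or code generator.

```python
# Toy parts-composition sketch: a catalog of reusable "parts", a specification
# that assembles them, and a generator that emits source code for the result.
PARTS_CATALOG = {
    "read_numbers": "def read_numbers(path):\n    return [float(x) for x in open(path)]\n",
    "mean":         "def mean(xs):\n    return sum(xs) / len(xs)\n",
}

def generate(spec):
    """Emit source code for an application assembled from cataloged parts."""
    lines = [PARTS_CATALOG[part] for part in spec["parts"]]
    lines.append(f"print({spec['entry']})\n")
    return "\n".join(lines)

if __name__ == "__main__":
    spec = {"parts": ["read_numbers", "mean"],
            "entry": "mean(read_numbers('values.txt'))"}
    print(generate(spec))   # the generated text is itself a runnable Python script
```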

  2. 31 CFR 357.28 - Transaction requests.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... BONDS, NOTES AND BILLS HELD IN LEGACY TREASURY DIRECT Legacy Treasury Direct Book-Entry Securities System (Legacy Treasury Direct) § 357.28 Transaction requests. (a) General. Unless otherwise authorized... owner of the security. In addition to any required certification, a transaction request form executed by...

  3. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  4. Olympic Health Legacy; Essentials for Lasting Development of Host City

    PubMed Central

    Lee, Young-Hee; Kim, Jung Moon

    2013-01-01

    The purpose of the Olympic Games should be to contribute to social development by leaving behind economic, cultural and environmental legacies for the hosting region. While tangible examples such as venues are often recognized as representative legacies of the Olympics, intangible aspects such as the environment, culture, policy and human resources have been gaining in importance. The Olympic Games, at its most fundamental level, is a sporting event. Sport is not only closely related to physical health, but is also instrumental in fostering mental health through inspiration. One of the most important sports legacies was the general change in the population’s perception of sports and physical activities; due to such change, people were able to enjoy sports as part of healthy and active everyday life and benefit physically. However, compared to tangible legacies such as the facilities, social legacies such as the general health and their planning, execution and achievements are hard to monitor. Therefore, for the Olympics to leave behind socio-cultural legacies that contribute to the development of the hosting region, there must be a thorough business plan that takes into account region-specific purpose, and is divided into stages such as before, during and after the Games. Should the 2018 Winter Olympic Games hope to make a continuing contribution to its hosting region, it must leave behind ‘Health Legacies’ that will enhance the happiness of the hosting region’s population. To this end, establishment of region-specific purpose and systematic promotion of business via detailed analysis of precedents are a must. This article aims to review the health legacy endeavors of past host cities and suggest appropriate forms of health legacy for the 2018 Pyeongchang Winter Olympic and Paralympic Games. PMID:26064832

  5. Northeastern Area Forest Legacy Program Yearbook 2006

    Treesearch

    Northeastern Area; State and Private Forestry

    2007-01-01

    The Forest Legacy Program protects important forestlands by purchasing conservation easements or fee titles from willing landowners whose lands are threatened with conversion to nonforest uses. Forest Legacy properties offer recreational opportunities, watershed protection, timber resources, wildlife habitat, and protection from development. NA-IN-02-07

  6. A Novel Coupling Pattern in Computational Science and Engineering Software

    EPA Science Inventory

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges...

  7. A Novel Coupling Pattern in Computational Science and Engineering Software

    EPA Science Inventory

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges f...

  8. Specifications for Thesaurus Software.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1991-01-01

    Presents specifications for software that is designed to support manual development and maintenance of information retrieval thesauri. Evaluation of existing software and design of custom software is discussed, requirements for integration with larger systems and for the user interface are described, and relationships among terms are discussed.…

  9. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.
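
    The abstract above describes the canonical sensor-controller-actuator loop. As a minimal illustration of that loop (my own sketch, not drawn from the paper; the function names and PI gains are assumptions), a generic proportional-integral controller wired between a sensor reading and an actuator command might look like this:

```python
# Minimal sketch of a sensor-controller-actuator loop with a generic
# proportional-integral (PI) control law. All names and tuning values are
# illustrative assumptions, not taken from the paper.

def pi_controller(setpoint, measurement, integral, kp=2.0, ki=0.5, dt=1.0):
    """Return a control output and the updated integrator state."""
    error = setpoint - measurement
    integral = integral + error * dt
    output = kp * error + ki * integral
    return output, integral

def control_loop(read_sensor, drive_actuator, setpoint, steps=100):
    integral = 0.0
    for _ in range(steps):
        measurement = read_sensor()        # sensor: measure a physical property
        output, integral = pi_controller(setpoint, measurement, integral)
        drive_actuator(output)             # actuator: apply the control output
```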

  10. A prototype forensic toolkit for industrial-control-systems incident response

    NASA Astrophysics Data System (ADS)

    Carr, Nickolas B.; Rowe, Neil C.

    2015-05-01

    Industrial control systems (ICSs) are an important part of critical infrastructure in cyberspace. They are especially vulnerable to cyber-attacks because of their legacy hardware and software and the difficulty of changing it. We first survey the history of intrusions into ICSs, the more serious of which involved a continuing adversary presence on an ICS network. We discuss some common vulnerabilities and the categories of possible attacks, noting the frequent use of software written a long time ago. We propose a framework for designing ICS incident response under the constraints that no new software must be required and that interventions cannot impede the continuous processing that is the norm for such systems. We then discuss a prototype toolkit we built using the Windows Management Instrumentation Command-Line tool for host-based analysis and the Bro intrusion-detection software for network-based analysis. Particularly useful techniques we used were learning the historical range of parameters of numeric quantities so as to recognize anomalies, learning the usual addresses of connections to a node, observing Internet addresses (usually rare), observing anomalous network protocols such as unencrypted data transfers, observing unusual scheduled tasks, and comparing key files through registry entries and hash values to find malicious modifications. We tested our methods on actual data from ICSs including publicly-available data, voluntarily-submitted data, and researcher-provided "advanced persistent threat" data. We found instances of interesting behavior in our experiments. Intrusions were generally easy to see because of the repetitive nature of most processing on ICSs, but operators need to be motivated to look.
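
    Two of the host- and network-based techniques listed above lend themselves to a compact illustration: learning the historical range of numeric parameters and the usual peer addresses of a node, then flagging departures from those baselines. The sketch below is a simplification of my own, not the authors' toolkit; class and method names are hypothetical.

```python
# Illustrative sketch of baseline learning and anomaly flagging for an ICS host.
# The slack factor and data model are assumptions made for the example.

from collections import defaultdict

class IcsBaseline:
    def __init__(self):
        self.ranges = {}                  # parameter -> (min, max) seen historically
        self.peers = defaultdict(set)     # node -> set of usual peer addresses

    def learn_value(self, parameter, value):
        lo, hi = self.ranges.get(parameter, (value, value))
        self.ranges[parameter] = (min(lo, value), max(hi, value))

    def learn_peer(self, node, address):
        self.peers[node].add(address)

    def is_anomalous_value(self, parameter, value, slack=0.1):
        if parameter not in self.ranges:
            return True                   # never-seen parameter is suspicious
        lo, hi = self.ranges[parameter]
        span = (hi - lo) or 1.0
        return value < lo - slack * span or value > hi + slack * span

    def is_unusual_peer(self, node, address):
        return address not in self.peers[node]
```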

  11. MaROS: Information Management Service

    NASA Technical Reports Server (NTRS)

    Allard, Daniel A.; Gladden, Roy E.; Wright, Jesse J.; Hy, Franklin H.; Rabideau, Gregg R.; Wallick, Michael N.

    2011-01-01

    This software is provided by the Mars Relay Operations Service (MaROS) task to a variety of Mars projects for the purpose of coordinating communications sessions between landed spacecraft assets and orbiting spacecraft assets at Mars. The Information Management Service centralizes a set of functions previously distributed across multiple spacecraft operations teams, and as such, greatly improves visibility into the end-to-end strategic coordination process. Most of the process revolves around the scheduling of communications sessions between the spacecraft during periods of time when a landed asset on Mars is geometrically visible by an orbiting spacecraft. These relay sessions are used to transfer data both to and from the landed asset via the orbiting asset on behalf of Earth-based spacecraft operators. This software component is an application process running as a Java virtual machine. The component provides all service interfaces via a Representational State Transfer (REST) protocol over https to external clients. There are two general interaction modes with the service: upload and download of data. For data upload, the service must execute logic specific to the upload data type and trigger any applicable calculations including pass delivery latencies and overflight conflicts. For data download, the software must retrieve and correlate requested information and deliver to the requesting client. The provision of this service enables several key advancements over legacy processes and systems. For one, this service represents the first time that end-to-end relay information is correlated into a single shared repository. The software also provides the first multimission latency calculator; previous latency calculations had been performed on a mission-by-mission basis.
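
    Since the service exposes only a REST interface over HTTPS with upload and download interaction modes, a client interaction can be sketched generically. The base URL, endpoint paths, and payload shapes below are placeholders of my own, not the actual MaROS API.

```python
# Hedged sketch of the two interaction modes described (upload and download)
# against a REST service over HTTPS. Endpoints and payloads are hypothetical.

import requests

BASE_URL = "https://example.invalid/maros/api"   # placeholder, not the real service

def upload(resource, payload, token):
    # Upload mode: the service validates the data type and runs its own
    # server-side calculations (e.g., delivery latencies, overflight conflicts).
    r = requests.post(f"{BASE_URL}/{resource}",
                      json=payload,
                      headers={"Authorization": f"Bearer {token}"},
                      timeout=30)
    r.raise_for_status()
    return r.json()

def download(resource, params, token):
    # Download mode: the service correlates the requested information and returns it.
    r = requests.get(f"{BASE_URL}/{resource}",
                     params=params,
                     headers={"Authorization": f"Bearer {token}"},
                     timeout=30)
    r.raise_for_status()
    return r.json()
```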

  12. Developing Legacy: Health Planning in the Host City of Porto Alegre for the 2014 Football World Cup.

    PubMed

    Witt, Regina Rigatto; Kotlhar, Mauro Kruter; Mesquita, Marilise Oliveira; Lima, Maria Alice Dias da Silva; Marin, Sandra Mara; Day, Carolina Baltar; Bandeira, Andrea Goncalves; Hutton, Alison

    2015-12-01

    To describe the process adopted to identify, classify, and evaluate the legacy of health care planning in the host city of Porto Alegre for the Football World Cup 2014. There is an emerging interest in the need to demonstrate a sustainable health legacy from mass gathering investments. Leaving a public health legacy for future host cities and countries is now an important part of planning for these events. The Ministry of Sports initiated and coordinated the development of projects in the host cities to identify actions, projects, and constructions to be developed to prepare for the World Cup. In Porto Alegre, a common structure was developed by the coordinating team to guide legacy identification, classification, and evaluation. This structure was based on international documentary analysis (including official reports, policy documents, and web-based resources) and direct communication with recognized experts in the field. Sixteen total legacies were identified for health surveillance (5) and health services (11). They were classified according to the strategic area, organizations involved, dimension, typology, planned or unplanned, tangible or intangible, territorial coverage, and situation prior to the World Cup. Possible impacts were then assessed as positive, negative, and potentiating, and mitigating actions were indicated. The project allowed the identification, classification, and development of health legacy, including risk analysis, surveillance, mitigation measures, and provision of emergency medical care. Although the project intended the development of indicators to measure the identified legacies, evaluation was not possible at the time of publication due to time constraints.

  13. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that their products meet NASA requirements for reliability measurement. There is a great need to bring the new software reliability models of the last decade into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability models to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability models, which could then be incorporated in a tool such as SMERFS'3. Such a tool, with better models, would greatly add value in assessing GSFC projects.
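
    As a concrete example of the kind of software reliability growth model such a tool fits, the sketch below estimates the parameters of the classical Goel-Okumoto NHPP model, whose mean value function is m(t) = a(1 - exp(-b t)), from illustrative cumulative failure counts. The data and the choice of model are assumptions made for illustration; this is not the SMERFS'3 implementation.

```python
# Worked sketch: fit the Goel-Okumoto mean value function to cumulative failure
# counts. a estimates the expected total number of faults, b the per-fault
# detection rate; the counts below are invented for the example.

import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 11)
cumulative_failures = np.array([4, 9, 13, 16, 19, 21, 22, 24, 25, 25])

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cumulative_failures, p0=(30.0, 0.1))
remaining = a_hat - cumulative_failures[-1]      # estimated faults still latent
print(f"a={a_hat:.1f}, b={b_hat:.3f}, estimated remaining faults={remaining:.1f}")
```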

  14. A web Accessible Framework for Discovery, Visualization and Dissemination of Polar Data

    NASA Astrophysics Data System (ADS)

    Kirsch, P. J.; Breen, P.; Barnes, T. D.

    2007-12-01

    A web accessible information framework, currently under development within the Physical Sciences Division of the British Antarctic Survey, is described. The datasets accessed are generally heterogeneous in nature, from fields including space physics, meteorology, atmospheric chemistry, ice physics, and oceanography. Many of these are returned in near real time over a 24/7 limited-bandwidth link from remote Antarctic stations and ships. The requirement is to provide various user groups - each with disparate interests and demands - a system incorporating a browsable and searchable catalogue, bespoke data summary visualization, metadata access facilities, and download utilities. The system allows timely access to raw and processed datasets through an easily navigable discovery interface. Once discovered, a summary of the dataset can be visualized in a manner prescribed by the particular projects and user communities, or the dataset may be downloaded, subject to accessibility restrictions that may exist. In addition, access to related ancillary information, including software, documentation, related URLs and information concerning non-electronic media (of particular relevance to some legacy datasets), is made directly available, having automatically been associated with a dataset during the discovery phase. Major components of the framework include the relational database containing the catalogue; the organizational structure of the systems holding the data, which enables automatic updates of the system catalogue and real-time access to data; the user interface design; and the administrative and data management scripts allowing straightforward incorporation of utilities, datasets and system maintenance.

  15. 48 CFR 27.409 - Solicitation provisions and contract clauses

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... section); (ii) For the acquisition of existing data, commercial computer software, or other existing data... United States (see paragraph (i)(1) of this section); (v) For architect-engineer services or construction... software, use the clause with its Alternate III. Any greater or lesser rights regarding the use...

  16. 48 CFR 27.409 - Solicitation provisions and contract clauses

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... section); (ii) For the acquisition of existing data, commercial computer software, or other existing data... United States (see paragraph (i)(1) of this section); (v) For architect-engineer services or construction... software, use the clause with its Alternate III. Any greater or lesser rights regarding the use...

  17. 48 CFR 27.409 - Solicitation provisions and contract clauses

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... section); (ii) For the acquisition of existing data, commercial computer software, or other existing data... United States (see paragraph (i)(1) of this section); (v) For architect-engineer services or construction... software, use the clause with its Alternate III. Any greater or lesser rights regarding the use...

  18. 48 CFR 27.409 - Solicitation provisions and contract clauses

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... section); (ii) For the acquisition of existing data, commercial computer software, or other existing data... United States (see paragraph (i)(1) of this section); (v) For architect-engineer services or construction... software, use the clause with its Alternate III. Any greater or lesser rights regarding the use...

  19. An Approach to Peabody's Gifts and Legacies.

    ERIC Educational Resources Information Center

    Parker, Franklin

    1994-01-01

    One in a collection of papers on self-made millionaire and educational philanthropist George Peabody offers a listing of his gifts and legacies, which fall into the categories of libraries, lyceums, athenaeum, art, music; science; model housing; education; exploration; patriotic causes; historical societies; hospitals; churches; legacies; and…

  20. 31 CFR 357.25 - Security interests.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... BONDS, NOTES AND BILLS HELD IN LEGACY TREASURY DIRECT Legacy Treasury Direct Book-Entry Securities System (Legacy Treasury Direct) § 357.25 Security interests. (a) General. The Department will not... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Security interests. 357.25 Section...

  1. Mendel’s legacy lives through management of sugarcane pests

    USDA-ARS?s Scientific Manuscript database

    Entomology and classical Mendelian genetics have had a long association and Mendel’s legacy continues to live through sugarcane pests. In this paper, we discuss examples of that legacy as applied to conventional and molecular approaches to breeding for insect resistance. We also discuss the applicat...

  2. Software and the future of programming languages.

    PubMed

    Aho, Alfred V

    2004-02-27

    Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.

  3. The Legacy Project: M. James Bensen

    ERIC Educational Resources Information Center

    Moye, Johnny J.

    2015-01-01

    Many vocational education, technology education, and now technology and engineering education leaders have made their mark on the profession. Their legacy is something that members of the profession enjoy and have the responsibility to continue to build upon. This is the sixth in a series of articles entitled "The Legacy Project" that…

  4. The Legacy Project: Lee H. Smalley, DTE

    ERIC Educational Resources Information Center

    Moye, Johnny J.

    2015-01-01

    Many vocational education, technology education, and now technology and engineering education leaders have made their mark on the profession. Their legacy is something that members of the profession enjoy and have a responsibility to continue and build upon. This is the seventh in a series of articles entitled "The Legacy Project," which…

  5. The Legacy Project--Ralph Bohn

    ERIC Educational Resources Information Center

    Moye, Johnny J.

    2013-01-01

    Many vocational education, technology education, and now technology and engineering education leaders have made their mark on their profession. Their legacy is something that members of the profession enjoy and have a responsibility to continue and build upon. This is the third in a series of articles entitled The Legacy Project, which focuses on…

  6. The Legacy Project: Donald P. Lauda

    ERIC Educational Resources Information Center

    Moye, Johnny J.

    2016-01-01

    Many vocational education, technology education, and now technology and engineering education leaders have made their mark on the teaching profession. Their legacy is something that members of the profession enjoy and have a responsibility to continue and build upon. The Legacy Project focuses on the lives and actions of leaders who have forged…

  7. Biological field stations: research legacies and sites for serendipity

    Treesearch

    William K. Michener; Keith L. Bildstein; Arthur McKee; Robert R. Parmenter; William W. Hargrove; Deedra McClearn; Mark Stromberg

    2009-01-01

    Biological field stations are distributed throughout North America, capturing much of the ecological variability present at the continental scale and encompassing many unique habitats. In addition to their role in supporting research and education, field stations offer legacies of data, specimens, and accumulated knowledge. Such legacies often provide the only...

  8. 77 FR 60714 - Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-04

    ...-0008; OMB Number 1014-0009] Information Collection Activities: Legacy Data Verification Process (LDVP); Submitted for Office of Management and Budget (OMB) Review; Comment Request ACTION: 30-Day notice. SUMMARY... the Notice to Lessees (NTL) on the Legacy Data Verification Process (LDVP). This notice also provides...

  9. Precipitation legacy effects on dryland ecosystem carbon fluxes: direction, magnitude and biogeochemical carryovers

    USDA-ARS?s Scientific Manuscript database

    The precipitation legacy effect, defined as the impact of historical precipitation (PPT) on extant ecosystem dynamics, has been recognized as an important driver in shaping the temporal variability of dryland aboveground net primary production (ANPP) and soil respiration. How the PPT legacy influenc...

  10. 31 CFR 357.25 - Security interests.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 31 Money and Finance:Treasury 2 2012-07-01 2012-07-01 false Security interests. 357.25 Section 357... Legacy Treasury Direct Book-Entry Securities System (Legacy Treasury Direct) § 357.25 Security interests... security interest of any kind, including a pledge, in a security in Legacy Treasury Direct ® except as...

  11. 31 CFR 357.25 - Security interests.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 31 Money and Finance:Treasury 2 2011-07-01 2011-07-01 false Security interests. 357.25 Section 357... Legacy Treasury Direct Book-Entry Securities System (Legacy Treasury Direct) § 357.25 Security interests... security interest of any kind, including a pledge, in a security in Legacy Treasury Direct ® except as...

  12. 31 CFR 357.25 - Security interests.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 31 Money and Finance:Treasury 2 2013-07-01 2013-07-01 false Security interests. 357.25 Section 357... Legacy Treasury Direct Book-Entry Securities System (Legacy Treasury Direct) § 357.25 Security interests... security interest of any kind, including a pledge, in a security in Legacy Treasury Direct ® except as...

  13. Legacy STORET Level 5 | STORET Legacy Data Center | US ...

    EPA Pesticide Factsheets

    2007-05-16

    The U.S. Environmental Protection Agency (EPA) maintains two data management systems containing water quality information for the nation's waters: the Legacy Data Center (LDC), and STORET. The LDC is a static, archived database and STORET is an operational system actively being populated with water quality data.

  14. 31 CFR 357.27 - Reinvestment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND BILLS HELD IN LEGACY TREASURY DIRECT Legacy Treasury Direct Book-Entry Securities System (Legacy... of a security may be reinvested at maturity in a new security in the same form of registration, provided a new security is then being offered by the Department and provision for reinvestment is made in...

  15. Philosophical Remarks on Nelson Mandela's Education Legacy

    ERIC Educational Resources Information Center

    Waghid, Yusef

    2014-01-01

    In this article, I reflect on Nelson Mandela's (Madiba, the clan name of Mandela) education legacy. I argue that Madiba's education legacy is constituted by three interrelated aspects: firstly, an education for non-violence guided by deliberation, compassion and reconciliation; secondly, education as responsibility towards the Other; and thirdly,…

  16. Past as Prologue: Educational Psychology's Legacy and Progeny

    ERIC Educational Resources Information Center

    Alexander, Patricia A.

    2018-01-01

    On the occasion of the 125th anniversary of the American Psychological Association, the legacies and progenies of the discipline of educational psychology are explored. To capture those legacies, transformational and influential contributions by educational psychologists to schools and society are described as key themes. Those themes entail: the…

  17. The 5L Instructional Design For Exploring Legacies through Biography

    ERIC Educational Resources Information Center

    Boulware, Beverly J.; Monroe, Eula E.; Wilcox, Bradley Ray

    2013-01-01

    People who have impacted generations have left legacies we can explore today through biographies. The 5L instructional design introduced in this article includes five components: Listen, Learn, Locate, Link, and Legacy. In the "Listen" section, teachers use storytelling and read-alouds to introduce individuals who shaped history. During…

  18. Shortcomings of existing systems for registration and legal protection of software products and possible ways to overcome them

    NASA Astrophysics Data System (ADS)

    Liapidevskiy, A. V.; Petrov, A. S.; Zhmud, V. A.; Sherubneva, I. G.

    2018-05-01

    The paper reveals the shortcomings of the existing system for registration and legal protection of software products; the system has many disadvantages, and explanatory examples are given. Possible ways of overcoming these shortcomings are discussed, along with prospects for the use of new digital technologies. The paper also provides information about modern software components for protecting the intellectual property rights of State corporations.

  19. Drought-induced legacy effects in wood growth across the Eastern and Midwestern U.S. are mediated by site climate, tree age, and drought sensitivity

    NASA Astrophysics Data System (ADS)

    Kannenberg, S.; Maxwell, J. T.; Pederson, N.; D'Orangeville, L.; Phillips, R.

    2017-12-01

    While it is widely known that drought reduces carbon (C) uptake in temperate forests, tree growth can also remain stagnant post-drought despite favorable climatic conditions. Although such "legacy effects" are well established, the degree to which these effects depend on species identity or variability in site conditions is poorly quantified. We sought to uncover how site, species, climate, and tree age interact to affect the presence and magnitude of legacy effects in temperate trees following drought. To do this, we assembled dendrochronological records of 18 common species across 94 sites in Eastern and Midwestern U.S. forests and quantified drought-induced changes in wood growth in the year of the drought (hereafter "drought sensitivity") and the years after the drought (i.e., legacy effects). We predicted that species particularly prone to hydraulic damage (e.g., oaks) would have the least drought sensitivity yet experience larger legacy effects, and that this effect would be exacerbated at arid sites. Across all species and sites, wood growth was reduced by 14% in the year of the drought and by 7% post-drought. Surprisingly, legacy effects were smaller for oak species and larger across species known to be more drought sensitive (e.g., tulip poplar, maple, birch). As a result, we observed a positive relationship between a species' drought sensitivity and that species' legacy effect. These legacy effects were similar in size across a range of drought severities. Surprisingly, legacy effects were smaller in more arid sites - contrary to previous investigations in dryland ecosystems - perhaps indicating the role of adaptation in mediating a tree's recovery from drought. In addition, many species actually decreased the size of their legacy effects as they aged, despite no change in drought responses. Our results run contrary to our predictions: species with the greatest drought sensitivity had the least ability to recover, and younger mesic forests - not arid systems - were the most prone to drought-induced legacy effects. Given that these reductions in wood growth are common across a wide range of sites and species, our results provide insight into the consequences of drought for tree recovery, and knowledge of the ecosystems in which these effects predominate.
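
    For readers unfamiliar with the two quantities contrasted above, the following back-of-the-envelope sketch (my simplification, not the authors' dendrochronological method) computes a drought-year "sensitivity" and a post-drought "legacy effect" as growth departures from an expected ring-width level; all numbers are illustrative.

```python
# Illustrative arithmetic: growth departure in the drought year (sensitivity)
# and mean departure over the following years (legacy effect), both relative
# to an expected (e.g., climate-predicted) growth level. Values are invented.

def growth_departure(observed, expected):
    return (observed - expected) / expected

expected_rwi = 1.00                     # expected ring-width index
drought_year_rwi = 0.86                 # observed growth in the drought year
post_drought_rwi = [0.92, 0.95, 0.97]   # observed growth in the three years after

sensitivity = growth_departure(drought_year_rwi, expected_rwi)                 # about -14%
legacy = sum(growth_departure(r, expected_rwi) for r in post_drought_rwi) / 3  # about -5%
print(f"drought sensitivity {sensitivity:.0%}, mean legacy effect {legacy:.0%}")
```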

  20. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    A new test procedure evaluates the quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
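
    The core comparison in such a test procedure can be illustrated with a toy acceptance check: a tool's predicted retrofit savings for a test case is compared against a reference result and judged within a tolerance band. The case name, values, and the percent-difference criterion below are stand-ins of my own, not the published BESTEST-EX acceptance ranges.

```python
# Toy acceptance check: compare a tool's predicted savings to a reference
# result within a simple percent-difference band. All values are illustrative.

def check_case(case_id, predicted_savings_kwh, reference_savings_kwh, tolerance=0.15):
    """Return True if the prediction falls within +/- tolerance of the reference."""
    diff = (predicted_savings_kwh - reference_savings_kwh) / reference_savings_kwh
    ok = abs(diff) <= tolerance
    print(f"{case_id}: predicted={predicted_savings_kwh:.0f} kWh, "
          f"reference={reference_savings_kwh:.0f} kWh, diff={diff:+.1%}, pass={ok}")
    return ok

check_case("wall-insulation", predicted_savings_kwh=2450, reference_savings_kwh=2700)
```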

  1. Managing configuration software of ground software applications with glueware

    NASA Technical Reports Server (NTRS)

    Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.

    2003-01-01

    This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.

  2. On Quality and Measures in Software Engineering

    ERIC Educational Resources Information Center

    Bucur, Ion I.

    2006-01-01

    Complexity measures are mainly used to estimate vital information about reliability and maintainability of software systems from regular analysis of the source code. Such measures also provide constant feedback during a software project to assist the control of the development procedure. There exist several models to classify a software product's…

  3. Discriminating Natural Variation from Legacies of Disturbance in Semi-Arid Forests, Southwestern USA

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Lynch, A. M.; Falk, D. A.; Yool, S. R.; Guertin, D. P.

    2014-12-01

    Characterizing differences in existing vegetation driven by natural variation versus disturbance legacies could become a critical component of applied forest management practice with important implications for monitoring ecologic succession and eco-hydrological interactions within the critical zone. Here we characterize variations in aerial LiDAR-derived forest structure at the individual-tree scale in Arizona and New Mexico. Differences in structure result from both topographic and climatological variations and from natural and human-related disturbances. We chose a priori undisturbed and disturbed sites that included preservation, development, logging and wildfire as exemplars. We compare two topographic indices, the topographic position index (TPI) and topographic wetness index (TWI), to two local indicators of spatial association (LISA): the Getis-Ord Gi and Anselin's Moran I. We found TPI and TWI correlate well with positive z-scores (tall trees in tall neighborhoods) in undisturbed areas and that disturbed areas are clearly defined by negative z-scores, in some cases better than what is visible from traditional orthophotography and existing GIS maps. These LISA methods also serve as a robust technique for creating like-clustered stands, i.e., common stands used in forest inventory monitoring. This research provides a significant advancement in the ability to (1) quantify variation in forest structure across topographically complex landscapes, (2) identify and map previously unrecorded disturbance locations, and (3) quantify the different impacts of disturbance within the perimeter of a stand or event at ecologically relevant scales.
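
    The notion of "tall trees in tall neighborhoods" scoring positive can be illustrated with a simplified local statistic: a per-cell z-score comparing the mean canopy height in a moving window to the global mean and standard deviation. This is a stand-in of my own for the Getis-Ord and local Moran statistics used in the study; the window size and data below are invented for illustration.

```python
# Simplified local z-score over a canopy-height grid: positive where tall trees
# sit in tall neighborhoods, negative over low (e.g., disturbed) patches.

import numpy as np

def local_height_zscore(canopy_height, window=5):
    """canopy_height: 2-D array of LiDAR-derived heights; returns a z-score per cell."""
    pad = window // 2
    padded = np.pad(canopy_height, pad, mode="edge")
    local_mean = np.empty_like(canopy_height, dtype=float)
    for i in range(canopy_height.shape[0]):
        for j in range(canopy_height.shape[1]):
            local_mean[i, j] = padded[i:i + window, j:j + window].mean()
    return (local_mean - canopy_height.mean()) / canopy_height.std()

heights = np.random.default_rng(0).gamma(shape=4.0, scale=3.0, size=(50, 50))
z = local_height_zscore(heights)
```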

  4. Inventory of Forts in Indonesia

    NASA Astrophysics Data System (ADS)

    Rinandi, N.; Suryaningsih, F.

    2015-08-01

    The great Indonesian archipelago, with its rich and varied nature, the products and commodities of tropical agriculture, and its fertile soil, was through the centuries a region of interest for countries all over the world. For several reasons, some of these countries came to Indonesia to establish a presence and tried to monopolize trade. These countries, such as the Portuguese, the Spanish, the Dutch and the British, built fortified trading stations, which later became forts all over Indonesia to defend their interests. The Indonesian archipelago possesses a great number of fortification works as legacies of native rulers and of the European trading companies that later became colonial powers from the 16th to the 19th centuries. These legacies include specific structures built as a defence system before and during World War II. These fortresses are nowadays valuable subjects, because they might be considered shared heritage between these countries and Indonesia. It is important to develop a vision for preserving this particular heritage, because it forms an interesting part of Indonesian history and its cultural treasures. The Government of the Republic of Indonesia has a national program to compile comprehensive documentation of the existing condition of these various types of forts as cultural heritage. The result of the three-year project was a comprehensive database of 442 forts in Indonesia, which will be very valuable for legal protection, preservation, and adaptive re-use in the future.

  5. Uranium mining in Portugal: a review of the environmental legacies of the largest mines and environmental and human health impacts.

    PubMed

    Pereira, R; Barbosa, S; Carvalho, F P

    2014-04-01

    The history of uranium mining in Portugal over almost a century has followed international demand peaks for both radium and uranium, which in turn were driven by medical, military, and civil applications. Following the price drop in the 1980s, mining activity declined and ceased in 2001. The current challenge is to deal with the environmental legacies left by old uranium mines, mainly located in the Viseu and Guarda districts. In 2001, based on several radiological surveys carried out, the Portuguese government assumed the remediation costs of abandoned mine areas for environmental safety and public health protection. Detailed environmental and public health risk assessments were performed under the scope of studies requested by the government and of funded research projects. It was found that the existing risks, due to radiological and chemical exposures to metals and radionuclides, were particularly high at the old milling facilities and mines where in situ and heap leaching of low-grade ore occurred. The different studies, involving both humans and non-human species from different trophic levels, demonstrated the existence of effects at different levels of biological organization (molecular, cellular, tissues, individuals, and populations) and on ecosystem services. To mitigate the risks, the environmental rehabilitation works at the Urgeiriça mine complex are almost complete, while at the Cunha Baixa mine they are presently in progress. These works and the environmental improvements achieved and expected are described herein.

  6. SONS: The JCMT legacy survey of debris discs in the submillimetre

    NASA Astrophysics Data System (ADS)

    Holland, Wayne S.; Matthews, Brenda C.; Kennedy, Grant M.; Greaves, Jane S.; Wyatt, Mark C.; Booth, Mark; Bastien, Pierre; Bryden, Geoff; Butner, Harold; Chen, Christine H.; Chrysostomou, Antonio; Davies, Claire L.; Dent, William R. F.; Di Francesco, James; Duchêne, Gaspard; Gibb, Andy G.; Friberg, Per; Ivison, Rob J.; Jenness, Tim; Kavelaars, JJ; Lawler, Samantha; Lestrade, Jean-François; Marshall, Jonathan P.; Moro-Martin, Amaya; Panić, Olja; Phillips, Neil; Serjeant, Stephen; Schieven, Gerald H.; Sibthorpe, Bruce; Vican, Laura; Ward-Thompson, Derek; van der Werf, Paul; White, Glenn J.; Wilner, David; Zuckerman, Ben

    2017-09-01

    Debris discs are evidence of the ongoing destructive collisions between planetesimals, and their presence around stars also suggests that planets exist in these systems. In this paper, we present submillimetre images of the thermal emission from debris discs that formed the SCUBA-2 Observations of Nearby Stars (SONS) survey, one of seven legacy surveys undertaken on the James Clerk Maxwell Telescope between 2012 and 2015. The overall results of the survey are presented in the form of 850 μm (and 450 μm, where possible) images and fluxes for the observed fields. Excess thermal emission, over that expected from the stellar photosphere, is detected around 49 stars out of the 100 observed fields. The discs are characterized in terms of their flux density, size (radial distribution of the dust) and derived dust properties from their spectral energy distributions. The results show discs over a range of sizes, typically 1-10 times the diameter of the Edgeworth-Kuiper Belt in our Solar system. The mass of a disc, for particles up to a few millimetres in size, is uniquely obtainable with submillimetre observations and this quantity is presented as a function of the host stars' age, showing a tentative decline in mass with age. Having doubled the number of imaged discs at submillimetre wavelengths from ground-based, single-dish telescope observations, one of the key legacy products from the SONS survey is to provide a comprehensive target list to observe at high angular resolution using submillimetre/millimetre interferometers (e.g. Atacama Large Millimeter Array, Smithsonian Millimeter Array).

  7. Putting the Power of Configuration in the Hands of the Users

    NASA Technical Reports Server (NTRS)

    Al-Shihabi, Mary-Jo; Brown, Mark; Rigolini, Marianne

    2011-01-01

    The goal was to reduce the overall cost of human space flight while maintaining the most demanding standards for safety and mission success. In support of this goal, a project team was chartered to replace 18 legacy Space Shuttle nonconformance processes and systems with one fully integrated system. Problem Reporting and Corrective Action (PRACA) processes provide a closed-loop system for the identification, disposition, resolution, closure, and reporting of all Space Shuttle hardware/software problems. PRACA processes are integrated throughout the Space Shuttle organizational processes and are critical to assuring a safe and successful program. The primary project objectives were to develop a fully integrated system that provides an automated workflow with electronic signatures, to support multiple NASA programs and contracts with a single "system" architecture, and to define standard processes, implement best practices, and minimize process variations.

  8. Forget about data, deliver results

    NASA Astrophysics Data System (ADS)

    Walter, Roland

    2015-12-01

    High-energy astrophysics space missions have pioneered and demonstrated the power of legacy data sets for generating new discoveries, especially when analysed in ways original researchers could not have anticipated. The only way to ensure that the data of present observatories can be effectively used in the future is to allow users to perform on-the-fly data analysis that straightforwardly produces scientific results for any sky position, time interval and energy interval, without requiring mission-specific software or detailed instrumental knowledge. Providing a straightforward interface to complex data and data analysis makes the data and the process of generating science results available to the public and to higher education, and promotes the visibility to society of the investment in science. This is a fundamental step toward transmitting the values of science and evolving towards a knowledge society.

  9. Usability Considerations in Developing a Graphic Interface for Intra Office Communications

    NASA Astrophysics Data System (ADS)

    Yammiyavar, Pradeep; Jain, Piyush

    This paper outlines the basis for incorporating functional features in new GUI-based software under development to address comprehensive communication and interaction needs within an office environment. Benchmarking of features in existing communication software products such as Microsoft Outlook, IBM Lotusnotes, Office Communicator, Mozilla Thunderbird, etc. was done by asking a set of questions related to the usage of this existing software. Usability issues were identified through a user survey involving 30 subjects of varied profiles (domain, designation, age, etc.) in a corporate office. It is posited that existing software products that have been developed for a universal market may be highly underutilized or have redundant features, especially for use as an intra-office (within the same office) communication medium. At the same time, they may not cater to some very contextual requirements of intra-office communications. Based on the findings of the survey of feature preferences and usability of existing products, a simple 'person to person' communication medium for intra-office situations was visualized with a new interactive GUI. Usability issues that need to be considered for a new intra-office product are brought out.

  10. The Use and Abuse of Socrates in Present Day Teaching.

    ERIC Educational Resources Information Center

    Rud, Anthony G., Jr.

    The philosophical and educational significance of the present study lies in its attempt to clarify the value and importance of the Socratic legacy for education, in addition to underscoring the difficulty of understanding that legacy. Perspectives on Socrates and his legacy, from Friedrich Nietzsche to contemporary critics, such as Bruce Kimball…

  11. Northeastern Area Forest Legacy Program Yearbook 2008

    Treesearch

    US Forest Service, Northeastern Area, State and Private Forestry

    2009-01-01

    The purpose of the Forest Legacy Program (FLP) is to protect environmentally important forest areas that are threatened by conversion to nonforest uses. The Forest Legacy Program is a partnership between participating States and the Forest Service, U.S. Department of Agriculture. These two entities work together to identify important forest lands and...

  12. Writing through the Labyrinth of Fears: The Legacy of Walter Dean Myers

    ERIC Educational Resources Information Center

    Tatum, Alfred W.

    2015-01-01

    This commentary discusses the legacy of Walter Dean Myers in relationship to advancing writing as an intellectual tool of protection for black male teens. Multiple implications are provided for teachers who want to engage black male teens to write fearlessly to extend the legacy of Walter Dean Myers.

  13. Designing a Leadership Legacy (L2) Framework

    ERIC Educational Resources Information Center

    Fierke, Kerry K.

    2015-01-01

    What does it mean to leave a "leadership legacy" in the organizations and communities in which we are involved? This mixed-methods research project will explore the stories of successful individuals who have left a leadership legacy. Specifically in this article, the preliminary research will share various components of a model to create…

  14. 21 CFR 801.57 - Discontinuation of legacy FDA identification numbers assigned to devices.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Discontinuation of legacy FDA identification... Device Identification § 801.57 Discontinuation of legacy FDA identification numbers assigned to devices... been assigned an FDA labeler code to facilitate use of NHRIC or NDC numbers may continue to use that...

  15. Leadership Legacies

    ERIC Educational Resources Information Center

    Grady, Marilyn L.

    2006-01-01

    Recent obituaries and testimonials to Coretta Scott King and Wendy Wasserstein are reminders of the leadership legacies of these women. About Coretta Scott King (1927-2006), Burch in "The Miami Herald" (February 1, 2006) stated "Coretta Scott King built a legacy from pain and progress, first as the wife who stood tall next to a man bent on…

  16. Effects of climate legacies on above- and belowground community assembly.

    PubMed

    Delgado-Baquerizo, Manuel; Eldridge, David J; Travers, Samantha K; Val, James; Oliver, Ian; Bissett, Andrew

    2018-05-11

    The role of climatic legacies in regulating community assembly of above- and belowground species in terrestrial ecosystems remains largely unexplored and poorly understood. Here, we report on two separate regional and continental empirical studies, including >500 locations, aiming to identify the relative importance of climatic legacies (climatic anomaly over the last 20,000 years) compared to current climates in predicting the relative abundance of ecological clusters formed by species strongly co-occurring within two independent above- and belowground networks. Climatic legacies explained a significant portion of the variation in the current community assembly of terrestrial ecosystems (up to 15.4%) that could not be accounted for by current climate, soil properties, and management. Changes in the relative abundance of ecological clusters linked to climatic legacies (e.g., past temperature) showed the potential to indirectly alter other clusters, suggesting cascading effects. Our work illustrates the role of climatic legacies in regulating ecosystem community assembly and provides further insights into possible winner and loser community assemblies under global change scenarios. © 2018 John Wiley & Sons Ltd.

  17. Changing disturbance regimes, ecological memory, and forest resilience

    USGS Publications Warehouse

    Johnstone, Jill F.; Allen, Craig D.; Franklin, Jerry F.; Frelich, Lee E.; Harvey, Brian J.; Higuera, Philip E.; Mack, Michelle C.; Meentemeyer, Ross K.; Metz, Margaret R.; Perry, George LW; Schoennagel, Tania; Turner, Monica G.

    2016-01-01

    Ecological memory is central to how ecosystems respond to disturbance and is maintained by two types of legacies – information and material. Species life-history traits represent an adaptive response to disturbance and are an information legacy; in contrast, the abiotic and biotic structures (such as seeds or nutrients) produced by single disturbance events are material legacies. Disturbance characteristics that support or maintain these legacies enhance ecological resilience and maintain a “safe operating space” for ecosystem recovery. However, legacies can be lost or diminished as disturbance regimes and environmental conditions change, generating a “resilience debt” that manifests only after the system is disturbed. Strong effects of ecological memory on post-disturbance dynamics imply that contingencies (effects that cannot be predicted with certainty) of individual disturbances, interactions among disturbances, and climate variability combine to affect ecosystem resilience. We illustrate these concepts and introduce a novel ecosystem resilience framework with examples of forest disturbances, primarily from North America. Identifying legacies that support resilience in a particular ecosystem can help scientists and resource managers anticipate when disturbances may trigger abrupt shifts in forest ecosystems, and when forests are likely to be resilient.

  18. The Dangers of Failure Masking in Fault-Tolerant Software: Aspects of a Recent In-Flight Upset Event

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2007-01-01

    On 1 August 2005, a Boeing Company 777-200 aircraft, operating on an international passenger flight from Australia to Malaysia, was involved in a significant upset event while flying on autopilot. The Australian Transport Safety Bureau's investigation into the event discovered that an anomaly existed in the component software hierarchy that allowed inputs from a known faulty accelerometer to be processed by the air data inertial reference unit (ADIRU) and used by the primary flight computer, autopilot and other aircraft systems. This anomaly had existed in original ADIRU software, and had not been detected in the testing and certification process for the unit. This paper describes the software aspects of the incident in detail, and suggests possible implications concerning complex, safety-critical, fault-tolerant software.

  19. An online database for plant image analysis software tools.

    PubMed

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-10-09

    Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is best suited for their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software solution in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users find solutions, and to provide developers a way to exchange and communicate about their work.

  20. The HEP Software and Computing Knowledge Base

    NASA Astrophysics Data System (ADS)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  1. How will I be remembered? Conserving the environment for the sake of one's legacy.

    PubMed

    Zaval, Lisa; Markowitz, Ezra M; Weber, Elke U

    2015-02-01

    Long time horizons and social distance are viewed as key psychological barriers to proenvironmental action, particularly regarding climate change. We suggest that these challenges can be turned into opportunities by making salient long-term goals and motives, thus shifting preferences between the present self and future others. We tested whether individuals' motivation to leave a positive legacy can be leveraged to increase engagement with climate change and other environmental problems. In a pilot study, we found that individual differences in legacy motivation were positively associated with proenvironmental behaviors and intentions. In a subsequent experiment, we demonstrated that priming legacy motives increased donations to an environmental charity, proenvironmental intentions, and climate-change beliefs. Domain-general legacy motives represent a previously understudied and powerful mechanism for promoting proenvironmental behavior. © The Author(s) 2015.

  2. Broadcasting GPS integrity information using Loran-C

    NASA Astrophysics Data System (ADS)

    Lo, Sherman Chih

    The United States Federal Aviation Administration (FAA) will adopt the Global Positioning System (GPS) as its primary navigation system for aviation, as stated by the Federal Radionavigation Plans (FRP) of 1996 and 1999. The FRP also proposes the reduction or termination of some existing radionavigation systems in favor of GPS and satellite navigation. It may be beneficial to retain some of these existing terrestrial navigation systems if they can provide increased safety and redundancy to the GPS-based architecture. One manner in which this can be done is by using or creating a data link on these existing radionavigation systems. These systems thus can provide both navigation and an additional broadcast of GPS integrity information. This thesis examines the use of terrestrial data links to provide Wide Area Augmentation System (WAAS) based GPS integrity information for aviation. The thesis focuses on using Loran-C to broadcast WAAS data. Analysis and experimental results demonstrating the capabilities of these designs are also discussed. Using Loran for this purpose requires increasing its data capacity. Many Loran modulation schemes are developed and analyzed. The data rates developed significantly increased the Loran data capacity. However, retaining compatibility with Loran legacy users resulted in data rates below the WAAS data rate of 250 bps. As a result, this thesis also examines means of reducing the data requirements for WAAS information. While higher data rates offer improved performance and compatibility with WAAS, this thesis demonstrates that higher rates incur greater interference. Therefore, this work develops and considers 108 bps and 167 bps Loran GPS integrity channel (LOGIC) designs. The performance of the two designs illustrates some of the advantages and disadvantages of using a higher data rate. Analysis demonstrated means of maintaining integrity with these low-data-rate systems and determined the theoretical capabilities of the systems. The system was tested empirically by developing software that generated the LOGIC message and applied these messages to a GPS user. The resulting 108 bps and 167 bps systems demonstrated the capability to provide lateral navigation/vertical navigation (LNAV/VNAV) and approach with vertical guidance (APV), respectively.
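
    A quick back-of-the-envelope calculation (my own arithmetic, not taken from the thesis) shows why the channel rate matters: assuming the nominal 250-bit WAAS message frame, lower Loran data rates stretch the time needed to deliver each frame, which motivates trimming the message content rather than simply slowing the broadcast.

```python
# Delivery time per nominal 250-bit WAAS frame at the WAAS rate and the two
# LOGIC rates discussed above. Purely illustrative arithmetic.

frame_bits = 250                      # nominal WAAS message length in bits
for rate_bps in (250, 167, 108):
    seconds_per_frame = frame_bits / rate_bps
    print(f"{rate_bps:>3} bps -> {seconds_per_frame:.1f} s per 250-bit frame")
```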

  3. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    PubMed

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving the audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
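
    The mapping of annunciators to vital-sign data described above can be sketched in outline: a small rule table associates each monitored signal with limits and a sound file, and a check decides when to annunciate. The thresholds, signal names, and the play_wave stand-in below are hypothetical, not the PT-SAFE implementation.

```python
# Sketch of vital-sign-to-annunciator mapping with invented thresholds.

def play_wave(path):
    print(f"annunciating {path}")      # stand-in for real audio playback

ALARM_MAP = {
    "SpO2":      {"low": 90, "high": None, "wave": "spo2_low.wav"},
    "HeartRate": {"low": 40, "high": 140,  "wave": "hr_out_of_range.wav"},
}

def annunciate(vital, value):
    rule = ALARM_MAP.get(vital)
    if rule is None:
        return
    too_low = rule["low"] is not None and value < rule["low"]
    too_high = rule["high"] is not None and value > rule["high"]
    if too_low or too_high:
        play_wave(rule["wave"])

annunciate("SpO2", 87)        # below the low limit, triggers the annunciator
annunciate("HeartRate", 72)   # within range, no alarm
```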

  4. 49 CFR Appendix C to Part 236 - Safety Assurance Criteria and Processes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... system (all its elements including hardware and software) must be designed to assure safe operation with... unsafe errors in the software due to human error in the software specification, design, or coding phases... (hardware or software, or both) are used in combination to ensure safety. If a common mode failure exists...

  5. The optimal community detection of software based on complex networks

    NASA Astrophysics Data System (ADS)

    Huang, Guoyan; Zhang, Peng; Zhang, Bing; Yin, Tengteng; Ren, Jiadong

    2016-02-01

    The community structure is important for software in terms of understanding the design patterns and controlling the development and maintenance process. In order to detect the optimal community structure in a software network, a method called Optimal Partition Software Network (OPSN) is proposed, based on the dependency relationships among software functions. First, by analyzing the information of multiple execution traces of one software system, we construct the Software Execution Dependency Network (SEDN). Second, based on the relationships among the function nodes in the network, we define Fault Accumulation (FA) to measure the importance of each function node and sort the nodes by this measure. Third, we select the top K (K=1,2,…) nodes as the cores of the primal communities (each community initially containing only its core node). By comparing the dependency relationships between each remaining node and the K communities, we put the node into the existing community with which it has the closest relationship. Finally, we calculate the modularity for different initial K to obtain the optimal division. Experiments verify that OPSN is efficient at detecting the optimal community structure in various software systems.
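    The seed-and-attach procedure plus the modularity-based choice of K can be sketched in a few lines. In this simplified version the Fault Accumulation ranking is replaced by a plain degree ranking and modularity is the standard Newman formula, so it only approximates OPSN:

    from collections import defaultdict

    def detect_communities(edges, max_k=5):
        """Simplified OPSN-style search: seed K communities with the highest-degree
        nodes (a stand-in for the paper's Fault Accumulation ranking), attach every
        remaining node to the seed community it shares the most edges with, and
        keep the K whose partition maximizes Newman modularity."""
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        nodes = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
        m = float(len(edges))

        def modularity(comm):
            q = 0.0
            for u in adj:
                for v in adj:
                    if comm[u] == comm[v]:
                        q += (1.0 if v in adj[u] else 0.0) - len(adj[u]) * len(adj[v]) / (2 * m)
            return q / (2 * m)

        best_q, best_partition = float("-inf"), None
        for k in range(1, min(max_k, len(nodes)) + 1):
            members = {c: {seed} for c, seed in enumerate(nodes[:k])}
            comm = {seed: c for c, seed in enumerate(nodes[:k])}
            for node in nodes[k:]:
                c = max(members, key=lambda cid: len(adj[node] & members[cid]))
                members[c].add(node)
                comm[node] = c
            q = modularity(comm)
            if q > best_q:
                best_q, best_partition = q, members
        return best_q, best_partition

    # Toy call graph: two triangles joined by a single edge.
    edges = [("a", "b"), ("b", "c"), ("a", "c"), ("d", "e"), ("e", "f"), ("d", "f"), ("c", "d")]
    print(detect_communities(edges))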

  6. InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Gurrola, E. M.; Rosen, P. A.; Sacco, G.; Zebker, H. A.; Simons, M.; Sandwell, D. T.

    2010-12-01

    The InSAR Scientific Computing Environment (ISCE) is a software development effort in its second year within the NASA Advanced Information Systems and Technology program. The ISCE will provide a new computing environment for geodetic image processing for InSAR sensors that will enable scientists to reduce measurements directly from radar satellites and aircraft to new geophysical products without first requiring them to develop detailed expertise in radar processing methods. The environment can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. The NRC Decadal Survey-recommended DESDynI mission will deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystem. The InSAR Scientific Computing Environment is planned to become a key element in processing DESDynI data into higher-level data products, and it is expected to enable a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches do. At the core of ISCE are both legacy processing software from the JPL/Caltech ROI_PAC repeat-pass interferometry package and a new InSAR processing package containing more efficient and more accurate processing algorithms, developed at Stanford for this project based on experience gained in developing processors for missions such as SRTM and UAVSAR. Around the core InSAR processing programs we are building object-oriented wrappers to enable their incorporation into a more modern, flexible, extensible software package that is informed by modern programming methods, including rigorous componentization of processing codes, abstraction and generalization of data models, and a robust, intuitive user interface with graduated exposure to the levels of sophistication, allowing novices to apply it readily for common tasks and experienced users to mine data with great facility and flexibility. The environment is designed to easily allow user contributions, enabling an open source community to extend the framework into the indefinite future. In this paper we briefly describe both the legacy and the new core processing algorithms and their integration into the new computing environment. We describe the ISCE component and application architecture and the features that permit the desired flexibility, extensibility, and ease of use. We summarize the state of progress of the environment and the plans for its completion and its future introduction into the radar processing community.

  7. Comparison of the Infiniti vision and the series 20,000 Legacy systems.

    PubMed

    Fernández de Castro, Luis E; Solomon, Kerry D; Hu, Daniel J; Vroman, David T; Sandoval, Helga P

    2008-01-01

    To compare the efficiency of the Infiniti vision system and the Series 20,000 Legacy system phacoemulsification units during routine cataract extraction. Thirty-nine eyes of 39 patients were randomized to have their cataract removed using either the Infiniti or the Legacy system, both using the Neosonix handpiece. System settings were standardized. Ultrasound time, amount of balanced salt solution (BSS) used intraoperatively, and postoperative visual acuity at postoperative days 1, 7 and 30 were evaluated. Preoperatively, best corrected visual acuity was significantly worse in the Infiniti group compared to the Legacy group (0.38 +/- 0.23 and 0.21 +/- 0.16, respectively; p = 0.012). The mean phacoemulsification time was 39.6 +/- 22.9 s (range 6.0-102.0) for the Legacy group and 18.3 +/-19.1 s (range 1.0-80.0) for the Infiniti group (p = 0.001). The mean amounts of intraoperative BSS used were 117 +/- 37.7 ml (range 70-195) in the Legacy group and 85.3 +/- 38.9 ml (range 40-200) in the Infiniti group (p = 0.005). No differences in postoperative visual acuity were found. The ability to use higher flow rates and vacuum settings with the Infiniti vision system allowed for cataract removal with less phacoemulsification time than when using the Legacy system. Copyright 2008 S. Karger AG, Basel.

  8. A Dynamic Approach to Make CDS/ISIS Databases Interoperable over the Internet Using the OAI Protocol

    ERIC Educational Resources Information Center

    Jayakanth, F.; Maly, K.; Zubair, M.; Aswath, L.

    2006-01-01

    Purpose: A dynamic approach to making legacy databases, like CDS/ISIS, interoperable with OAI-compliant digital libraries (DLs). Design/methodology/approach: There are many bibliographic databases that are being maintained using legacy database systems. CDS/ISIS is one such legacy database system. It was designed and developed specifically for…

  9. The Legacy Project--William E. Dugger, Jr., DTE

    ERIC Educational Resources Information Center

    Moye, Johnny J.; Dugger, William E., Jr.

    2016-01-01

    This is the ninth in a series of articles entitled "The Legacy Project." The Legacy Project focuses on the lives and actions of leaders who have forged the educator profession into what it is today. Members of the profession owe a debt of gratitude to these leaders. One simple way to demonstrate that gratitude is to recognize these…

  10. Legacy Projects: Helping Young People Respond Productively to the Challenges of a Changing World

    ERIC Educational Resources Information Center

    Beghetto, Ronald A.

    2017-01-01

    How might educators help young people respond to current and future challenges of a changing world? In this article, I describe how educators can design Legacy Projects to provide young people with opportunities to make positive and lasting differences in their lives, schools, communities, and beyond. The connection between legacy projects and the…

  11. The influence of legacy P on lake water quality in a Midwestern agricultural watershed

    USDA-ARS?s Scientific Manuscript database

    Decades of fertilizer and manure application have led to a buildup of phosphorus (P) in agricultural soils and stream and lake sediments, commonly referred to as legacy P. Legacy P can provide a long-term source of P to surface waters where it causes eutrophication. Using a suite of numerical model...

  12. The Legacy Project: Franklin Owens

    ERIC Educational Resources Information Center

    Moye, Johnny J.; Owens, Franklin

    2017-01-01

    This is the ninth in a series of articles entitled "The Legacy Project." The Legacy Project focuses on the lives and actions of leaders who have forged our profession into what it is today. Members of the profession owe a debt of gratitude to these leaders. One simple way to demonstrate that gratitude is to recognize these leaders and…

  13. Social Justice and Human Rights in Education Policy Discourse: Assessing Nelson Mandela's Legacy

    ERIC Educational Resources Information Center

    Gebremedhin, Abrehet; Joshi, Devin

    2016-01-01

    Twenty years after South Africa's democratisation, Nelson Mandela's passing has prompted scholars to examine his legacy in various domains. Here we take a look at his legacy in education discourse. Tracing Mandela's thoughts and pronouncements on education we find two major emphases: a view of education as a practical means to economic…

  14. JMS Proxy and C/C++ Client SDK

    NASA Technical Reports Server (NTRS)

    Wolgast, Paul; Pechkam, Paul

    2007-01-01

    JMS Proxy and C/C++ Client SDK (JMS signifies "Java messaging service" and "SDK" signifies "software development kit") is a software package for developing interfaces that enable legacy programs (here denoted "clients") written in the C and C++ languages to communicate with each other via a JMS broker. This package consists of two main components: the JMS proxy server component and the client C library SDK component. The JMS proxy server component implements a native Java process that receives and responds to requests from clients. This component can run on any computer that supports Java and a JMS client. The client C library SDK component is used to develop a JMS client program running in each affected C or C++ environment, without need for running a Java virtual machine in the affected computer. A C client program developed by use of this SDK has most of the quality-of-service characteristics of standard Java-based client programs, including the following: Durable subscriptions; Asynchronous message receipt; Such standard JMS message qualities as "TimeToLive," "Message Properties," and "DeliveryMode" (as the quoted terms are defined in previously published JMS documentation); and Automatic reconnection of a JMS proxy to a restarted JMS broker.

  15. Web Program for Development of GUIs for Cluster Computers

    NASA Technical Reports Server (NTRS)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  16. RELAP-7 Code Assessment Plan and Requirement Traceability Matrix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Junsoo; Choi, Yong-joon; Smith, Curtis L.

    2016-10-01

    RELAP-7, a safety analysis code for nuclear reactor systems, is under development at Idaho National Laboratory (INL). Overall, the code development is directed towards leveraging the advancements in computer science technology, numerical solution methods, and physical models over the last decades. Recently, INL has also been putting effort into establishing a code assessment plan, which aims to ensure an improved final product quality through the RELAP-7 development process. The ultimate goal of this plan is to propose a suitable way to systematically assess the wide range of software requirements for RELAP-7, including the software design, user interface, and technical requirements. To this end, we first survey the literature (i.e., international/domestic reports, research articles) addressing the desirable features generally required for advanced nuclear system safety analysis codes. In addition, the V&V (verification and validation) efforts as well as the legacy issues of several recently developed codes (e.g., RELAP5-3D, TRACE V5.0) are investigated. Lastly, this paper outlines the Requirement Traceability Matrix (RTM) for RELAP-7, which can be used to systematically evaluate and identify the code development process and its present capability.
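    A requirement traceability matrix of the kind outlined above is, at its simplest, a mapping from each requirement to the evidence that verifies it. The sketch below is purely illustrative; the identifiers, categories, and verification items are invented, not taken from the RELAP-7 plan:

    # Hypothetical RTM entries; IDs, categories, and verification items are illustrative only.
    rtm = [
        {"req_id": "SW-DES-001", "category": "software design",
         "statement": "Physics models shall be separated from numerical solvers.",
         "verified_by": ["design review DR-12", "module inventory audit"]},
        {"req_id": "TECH-004", "category": "technical",
         "statement": "The code shall reproduce a reference natural-circulation benchmark.",
         "verified_by": ["validation case VC-07"]},
    ]

    def unverified(matrix):
        """Flag requirements with no linked verification evidence."""
        return [row["req_id"] for row in matrix if not row["verified_by"]]

    print(unverified(rtm))  # -> [] for this toy matrix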

  17. 2014 Runtime Systems Summit. Runtime Systems Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarkar, Vivek; Budimlic, Zoran; Kulkani, Milind

    2016-09-19

    This report summarizes runtime system challenges for exascale computing that follow from the fundamental challenges for exascale systems that have been well studied in past reports, e.g., [6, 33, 34, 32, 24]. Some of the key exascale challenges that pertain to runtime systems include parallelism, energy efficiency, memory hierarchies, data movement, heterogeneous processors and memories, resilience, performance variability, dynamic resource allocation, performance portability, and interoperability with legacy code. In addition to summarizing these challenges, the report also outlines different approaches to addressing these significant challenges that have been pursued by research projects in the DOE-sponsored X-Stack and OS/R programs. Since there is often confusion as to what exactly the term “runtime system” refers to in the software stack, we include a section on taxonomy to clarify the terminology used by participants in these research projects. In addition, we include a section on deployment opportunities for vendors and government labs to build on the research results from these projects. Finally, this report is also intended to provide a framework for discussing future research and development investments for exascale runtime systems, and for clarifying the role of runtime systems in exascale software.

  18. NASA Operational Simulator for Small Satellites (NOS3)

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2015-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations training, verification and validation (V&V), test procedure development, and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium, and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the areas of navigation systems of small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V Nitride-based materials.

  19. NEWFIRM Software--System Integration Using OPC

    NASA Astrophysics Data System (ADS)

    Daly, P. N.

    2004-07-01

    The NOAO Extremely Wide-Field Infra-Red Mosaic (NEWFIRM) camera is being built to satisfy the survey science requirements on the KPNO Mayall and CTIO Blanco 4m telescopes in an era of 8m+ aperture telescopes. Rather than re-invent the wheel, the software system to control the instrument has taken existing software packages and re-used what is appropriate. The result is an end-to-end observation control system using technology components from DRAMA, ORAC, observing tools, GWC, existing in-house motor controllers and new developments like the MONSOON pixel server.

  20. Managing Records for the Long Term - 12363

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montgomery, John V.; Gueretta, Jeanie

    The U.S. Department of Energy (DOE) is responsible for managing vast amounts of information documenting historical and current operations. This information is critical to the operations of the DOE Office of Legacy Management. Managing legacy records and information is challenging in terms of accessibility and changing technology. The Office of Legacy Management is meeting these challenges by making records and information management an organizational priority. The Office of Legacy Management mission is to manage DOE post-closure responsibilities at former Cold War weapons sites to ensure the future protection of human health and the environment. These responsibilities include environmental stewardship and long-term preservation and management of operational and environmental cleanup records associated with each site. A primary organizational goal for the Office of Legacy Management is to 'Preserve, Protect, and Share Records and Information'. Managing records for long-term preservation is an important responsibility. Adequate and dedicated resources and management support are required to perform this responsibility successfully. Records tell the story of an organization and may be required to defend an organization in court, provide historical information, identify lessons learned, or provide valuable information for researchers. Loss of records or the inability to retrieve records because of poor records management processes can have serious consequences and even lead to an organisation's downfall. Organizations must invest time and resources to establish a good records management program because of its significance to the organization as a whole. The Office of Legacy Management will continue to research and apply innovative ways of doing business to ensure that the organization stays at the forefront of effective records and information management. DOE is committed to preserving records that document our nation's Cold War legacy, and the Office of Legacy Management will keep records management as a high priority. (authors)

  1. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR) Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. This requires extensive software to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both commanding the payload and displaying data created by the payload. The verification of the software was an extensive effort. The challenges of specifying a suitable test matrix for reconfigurable systems that offer numerous configurations are highlighted. Since flight system testing requires methodical, controlled testing that limits risk, a nearly identical ground system to the on-orbit flight system was required to develop the software and write verification procedures before installation and testing on the flight system. The development of the SCaN Testbed was an accelerated effort to meet launch constraints, and this paper discusses tradeoffs made to balance needed software functionality while still maintaining the schedule. Future upgrades are discussed that optimize the avionics and allow experimenters to utilize the SCaN Testbed's full potential.

  2. CometQuest: A Rosetta Adventure

    NASA Technical Reports Server (NTRS)

    Leon, Nancy J.; Fisher, Diane K.; Novati, Alexander; Chmielewski, Artur B.; Fitzpatrick, Austin J.; Angrum, Andrea

    2012-01-01


  3. Tiled WMS/KML Server V2

    NASA Technical Reports Server (NTRS)

    Plesea, Lucian

    2012-01-01

    This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. PNG and JPG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.
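    The pattern described above (WMS GetMap requests for existing tiles, plus regionated KML that points back at those same requests) can be illustrated with a small client-side sketch. The endpoint URL and layer name are placeholders, and the KML shown is a minimal GroundOverlay/Region wrapper rather than the module's actual output:

    from urllib.parse import urlencode

    ENDPOINT = "https://example.org/wms"   # placeholder server
    LAYER = "global_mosaic"                # placeholder layer name

    def getmap_url(bbox, size=512, fmt="image/jpeg"):
        """Build a WMS 1.1.1 GetMap request for one tile covering bbox = (west, south, east, north)."""
        params = {
            "service": "WMS", "version": "1.1.1", "request": "GetMap",
            "layers": LAYER, "styles": "", "srs": "EPSG:4326",
            "bbox": ",".join(str(v) for v in bbox),
            "width": size, "height": size, "format": fmt,
        }
        return ENDPOINT + "?" + urlencode(params)

    def ground_overlay(bbox):
        """Wrap the tile request in a minimal KML GroundOverlay with a Region,
        mirroring how regionated KML can point back at an existing tiled WMS.
        (A real KML writer would XML-escape the '&' characters in the URL.)"""
        w, s, e, n = bbox
        box = f"<north>{n}</north><south>{s}</south><east>{e}</east><west>{w}</west>"
        return (f"<GroundOverlay>"
                f"<Region><LatLonAltBox>{box}</LatLonAltBox></Region>"
                f"<Icon><href>{getmap_url(bbox)}</href></Icon>"
                f"<LatLonBox>{box}</LatLonBox>"
                f"</GroundOverlay>")

    print(ground_overlay((-180, -90, 0, 90)))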

  4. Using health statistics: a Nightingale legacy.

    PubMed

    Schloman, B F

    2001-01-01

    No more forceful example of the value of using health statistics to understand and improve health conditions exists than that displayed by Florence Nightingale. The recent book by Dossey (1999), Florence Nightingale: Mystic, Visionary, Healer, relates the dramatic tale of Nightingale's use of statistics to understand the causes of deaths in the Crimean War and of her advocacy to standardize the collection of medical data within the army and in civilian hospitals. For her, the use of health statistics was a major tool to improve health and influence public opinion.

  5. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  6. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful, and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets, and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are their subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We demonstrate the practice of Model Based Systems Engineering applied to integrating space communication networks and summarize its results and impact. We highlight the insights gained by applying Model Based Systems Engineering and provide recommendations for its application and improvement.

  7. Improving throughput for temporal target nomination using existing infrastructure

    NASA Astrophysics Data System (ADS)

    Raeth, Peter G.

    2007-04-01

    Earlier, we reported on predictive anomaly detection (PAD) for nominating targets within data streams generated by persistent sensing and surveillance. This technique is purely temporal and does not directly depend on the physics attendant on the sensed environment. Since PAD adapts to evolving data streams, there are no determinacy assumptions. We showed PAD to be general across sensor types, demonstrating it using synthetic chaotic data and in audio, visual, and infrared applications. Defense-oriented demonstrations included explosions, muzzle flashes, and missile and aircraft detection. Experiments were ground-based and air-to-air. As new sensors come on line, PAD offers immediate data filtering and target nomination. Its results can be taken individually, pixel by pixel, for spectral analysis and material detection/identification. They can also be grouped for shape analysis, target identification, and track development. PAD analyses reduce data volume by around 95%, depending on target number and size, while still retaining all target indicators. While PAD's code is simple when compared to physics codes, PAD tends to build a huge model. A PAD model for 512 x 640 frames may contain 19,660,800 Gaussian basis functions. (PAD models grow linearly with the number of pixels and the frequency content, in the FFT sense, of the sensed scenario's background data). PAD's complexity in terms of computational and data intensity is an example of what one sees in new algorithms now in the R&D pipeline, especially as DoD seeks capability that runs fully automatic, with little to no human interaction. Work is needed to improve algorithms' throughput while employing existing infrastructure, yet allowing for growth in the types of hardware employed. In this present paper, we discuss a generic cluster interface for legacy codes that can be partitioned at the data level. The discussion's foundation is the growth of PAD models to accommodate a particular scenario and the need to reduce false alarms while preserving all targets. The discussion closes with a view of future software and hardware opportunities.
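    The model-size figure quoted above follows from the per-pixel structure of the PAD model. The sketch below reproduces that arithmetic, assuming (as the quoted total implies) roughly 60 Gaussian basis functions per pixel of a 512 x 640 frame, and restates the claimed data-volume reduction:

    # Back-of-envelope reproduction of the model-size figure quoted above.
    rows, cols = 512, 640
    pixels = rows * cols                   # 327,680 pixels per frame
    basis_per_pixel = 60                   # implied by the quoted total
    total_basis = pixels * basis_per_pixel
    print(f"{total_basis:,} Gaussian basis functions")   # 19,660,800

    # A ~95% reduction keeps roughly 1/20 of the raw data volume as target indicators.
    print(f"retained fraction ~ {1 - 0.95:.0%} of the raw data volume")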

  8. Copyright Protection for Computer Software: Is There a Need for More Protection?

    ERIC Educational Resources Information Center

    Ku, Linlin

    Because the computer industry's expansion has been much faster than has the development of laws protecting computer software and since the practice of software piracy seems to be alive and well, the issue of whether existing laws can provide effective protection for software needs further discussion. Three bodies of law have been used to protect…

  9. Mobile User Objective System (MUOS)

    DTIC Science & Technology

    2015-12-01

    the current UHF Follow-On (UFO) constellation. MUOS includes the satellite constellation, a ground control and network management system, and a new ... MUOS CAI. Each MUOS satellite carries a legacy payload similar to that flown on UFO-11. These legacy payloads will continue to support legacy ... Antecedent Information: The antecedent system to MUOS was the Ultra High Frequency (UHF) Follow-on (UFO) satellite communications program. Comparisons

  10. Land-use legacies from dry farming in the Park Valley area of Box Elder County

    USDA-ARS?s Scientific Manuscript database

    Last fall in this newsletter, we reported on the initiation of a study on the land-use legacies of dry farming in the Park Valley area. Land-use legacies are the long lasting impacts of historic land uses; such as, cultivation for dry farming. The Park Valley area and Box Elder County experienced ...

  11. Disturbance legacies of historic tie-drives persistently alter geomorphology and large wood characteristics in headwater streams, southeast Wyoming

    Treesearch

    Claire M. Ruffing; Melinda D. Daniels; Kathleen A. Dwire

    2015-01-01

    Instream wood is recognized as an integral component of stream morphology in forested areas. However, few studies have evaluated the legacy effects of historic wood removal activities and associated impacts on channel morphology, contemporary wood loading, and recruitment. This study investigates the role of historic tie-driving, a widespread channel disturbance legacy...

  12. Jack Wescott and Donald F. Smith. The Legacy Project

    ERIC Educational Resources Information Center

    Moye, Johnny J.; Wescott, Jack W.; Smith, Donald F.

    2017-01-01

    This is the tenth in a series of articles entitled "The Legacy Project." The Legacy Project focuses on the lives and actions of leaders who have forged our profession into what it is today. Members of the profession owe a debt of gratitude to these leaders. One simple way to demonstrate that gratitude is to recognize these leaders and…

  13. The Impact of Oregon's Pension Legacy Costs on New Teacher Turnover and Quality. CEDR Working Paper. WP #2016-5

    ERIC Educational Resources Information Center

    Cahill, Kevin E.; Dyke, Andrew; Tapogna, John

    2016-01-01

    Pension legacy costs can restrict the amount of resources available for current public education, potentially making it more difficult to attract and retain high-quality teachers. Oregon provides a useful case study in pension legacy costs because many school districts in the state are now reallocating General Fund expenditures to cover sizeable…

  14. 31 CFR 306.23 - Securities eligible to be held in the Legacy Treasury Direct® Book-entry Securities System.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Legacy Treasury Direct® Book-entry Securities System. 306.23 Section 306.23 Money and Finance: Treasury... Securities eligible to be held in the Legacy Treasury Direct® Book-entry Securities System. (a) Eligible... book-entry securities system. (b) Conversion of Registered Security to book-entry form to be held in...

  15. Managing a project's legacy: implications for organizations and project management

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.; Hecht, Michael H.; Majchrzak, Ann

    2003-01-01

    Organizations that rely on projects to implement their products must find effective mechanisms for propagating lessons learned on one project throughout the organization. A broad view of what constitutes a project's 'legacy' is presented that includes not just the design products and leftover parts, but new processes, relationships, technology, skills, planning data, and performance metrics. Based on research evaluating knowledge reuse in innovative contexts, this paper presents an approach to project legacy management that focuses on collecting and using legacy knowledge to promote organizational learning and effective reuse, while addressing factors of post-project responsibility, information obsolescence, and the importance of ancillary contextual information.

  16. Aging, Climate Change, and Legacy Thinking

    PubMed Central

    Fried, Linda; Moody, Rick

    2012-01-01

    Climate change is a complex, long-term public health challenge. Older people are especially susceptible to certain climate change impacts, such as heat waves. We suggest that older people may be a resource for addressing climate change because of their concern for legacy—for leaving behind values, attitudes, and an intact world to their children and grandchildren. We review the theoretical basis for “legacy thinking” among older people. We offer suggestions for research on this phenomenon, and for action to strengthen the sense of legacy. At a time when older populations are growing, understanding and promoting legacy thinking may offer an important strategy for addressing climate change. PMID:22698047

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilpatrick, Laura E.; Cotter, Ed

    The U.S. Department of Energy (DOE) Office of Legacy Management is responsible for administering the DOE Uranium Leasing Program (ULP) and its 31 uranium lease tracts located in the Uravan Mineral Belt of southwestern Colorado (see Figure 1). In addition to administering the ULP for the last six decades, DOE has also undertaken the significant task of reclaiming a large number of abandoned uranium (legacy) mine sites and associated features located throughout the Uravan Mineral Belt. In 1995, DOE initiated a 3-year reconnaissance program to locate and delineate (through extensive on-the-ground mapping) the legacy mine sites and associated features contained within the historically defined boundaries of its uranium lease tracts. During that same time frame, DOE recognized the lack of regulations pertaining to the reclamation of legacy mine sites and contacted the U.S. Bureau of Land Management (BLM) concerning the reclamation of legacy mine sites. In November 1995, the BLM Colorado State Office formally issued the United States Department of the Interior, Colorado Bureau of Land Management, Closure/Reclamation Guidelines, Abandoned Uranium Mine Sites as a supplement to its Solid Minerals Reclamation Handbook (H-3042-1). Over the next five-and-one-half years, DOE reclaimed the 161 legacy mine sites that had been identified on DOE withdrawn lands. By the late 1990s, the various BLM field offices in southwestern Colorado began to recognize DOE's experience and expertise in reclaiming legacy mine sites. During the ensuing 8 years, BLM funded DOE (through a series of task orders) to perform reclamation activities at 182 BLM mine sites. To date, DOE has reclaimed 372 separate and distinct legacy mine sites. During this process, DOE has learned many lessons and is willing to share those lessons with others in the reclamation industry because there are still many legacy mine sites not yet reclaimed. DOE currently administers 31 lease tracts (11,017 ha) that collectively contain over 220 legacy (abandoned) uranium mine sites. This contrasts with the millions of hectares administered by the BLM, the U.S. Forest Service, and other federal, tribal, and state agencies that contain thousands of such sites. DOE believes that the processes it has used provide a practical and cost-effective approach to abandoned uranium mine-site reclamation. Although the Federal Acquisition Regulations preclude DOE from competing with private industry, DOE is available to assist other governmental and tribal agencies in their reclamation efforts. (authors)

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korinko, P.; Howard, S.; Maxwell, D.

    During final preparations for the start of the PDCF Inner Can (IC) qualification effort, welding was performed on an automated weld system known as the PICN. During the initial weld, using a pedigree canister and plug, a weld defect was observed. The defect resulted in a hole in the sidewall of the canister, and it was observed that the plug sidewall had not been consumed. This was a new type of failure not seen during development and production of legacy Bagless Transfer Cans (FB-Line/Hanford). Therefore, a team was assembled to determine the root cause and to determine if the process could be improved. After several brainstorming sessions (MS and T, R and D Engineering, PDC Project), an evaluation matrix was established to direct this effort. The matrix identified numerous activities that could be taken and then prioritized those activities. This effort was limited by both time and resources (the number of canisters and plugs available for testing was limited). A discovery process was initiated to evaluate the Vendor's IC fabrication process relative to legacy processes. There were no significant findings; however, some information regarding forging/anneal processes could not be obtained. Evaluations were conducted to compare mechanical properties of the PDC canisters relative to the legacy canisters. Some differences were identified, but mechanical properties were determined to be consistent with legacy materials. A number of process changes were also evaluated. A heat treatment procedure was established that could reduce the magnetic characteristics to levels similar to the legacy materials. An in-situ arc annealing process was developed that resulted in improved weld characteristics for test articles. Several tack weld configurations were also addressed; it was found that increasing the number of tack welds (and changing the sequence) resulted in decreased can-to-plug gaps and a more stable weld for test articles. Incorporating all of the process improvements for the actual can welding process, however, did not result in an improved weld geometry. Several possibilities for the lack of positive response exist, some of which are that (1) an insufficient number of test articles were welded under prototypic conditions, (2) the process was not optimized so that significant improvements were observable over the 'noise', and (3) the in-situ arc anneal closed the gap down too much so the can was unable to exhaust pressure ahead of the weld. Several operational and mechanical improvements were identified. The weld clamps were changed to a design consistent with those used in the legacy operations. A helium puff operation was eliminated; it is believed that this operation was the cause of the original weld defect. Also, timing of plug mast movement was found to correspond with weld irregularities. The timing of the movement was changed to occur during weld head travel between tacks. In the end, a sequence of three tack welds followed by a pulse weld at the same current and travel speed as was used for the legacy processes was suggested for use during the IC qualification effort. Relative to legacy welds, the PDC IC weld demonstrates greater fluctuation in the region of the weld located between tack welds. However, canister weld response (canister to canister) is consistent and, with the aid of the optical mapping system (for targeting the cut position), is considered adequate.
DR measurements and METs show the PDC IC welds to have sufficient ligament length to ensure adequate canister pressure/impact capacity and to ensure adequate stub function. The PDC welding process has not been optimized as a result of this effort. Differences remain between the legacy BTC welds and the PDC IC weld, but these differences are not sufficient to prevent resumption of the current PDC IC qualification effort. During the PDC IC qualification effort, a total of 17 cans will be welded and a variety of tests/inspections will be performed. The extensive data collected during that qualification effort should be of a sufficient population to determine if additional weld process optimization is necessary prior to production release.

  19. Spatial 3D infrastructure: display-independent software framework, high-speed rendering electronics, and several new displays

    NASA Astrophysics Data System (ADS)

    Chun, Won-Suk; Napoli, Joshua; Cossairt, Oliver S.; Dorval, Rick K.; Hall, Deirdre M.; Purtell, Thomas J., II; Schooler, James F.; Banker, Yigal; Favalora, Gregg E.

    2005-03-01

    We present a software and hardware foundation to enable the rapid adoption of 3-D displays. Different 3-D displays - such as multiplanar, multiview, and electroholographic displays - naturally require different rendering methods. The adoption of these displays in the marketplace will be accelerated by a common software framework. The authors designed the SpatialGL API, a new rendering framework that unifies these display methods under one interface. SpatialGL enables complementary visualization assets to coexist through a uniform infrastructure. Also, SpatialGL supports legacy interfaces such as the OpenGL API. The authors' first implementation of SpatialGL uses multiview and multislice rendering algorithms to exploit the performance of modern graphics processing units (GPUs) to enable real-time visualization of 3-D graphics from medical imaging, oil & gas exploration, and homeland security. At the time of writing, SpatialGL runs on COTS workstations (both Windows and Linux) and on Actuality's high-performance embedded computational engine that couples an NVIDIA GeForce 6800 Ultra GPU, an AMD Athlon 64 processor, and a proprietary, high-speed, programmable volumetric frame buffer that interfaces to a 1024 x 768 x 3 digital projector. Progress is illustrated using an off-the-shelf multiview display, Actuality's multiplanar Perspecta Spatial 3D System, and an experimental multiview display. The experimental display is a quasi-holographic view-sequential system that generates aerial imagery measuring 30 mm x 25 mm x 25 mm, providing 198 horizontal views.
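    The general design idea (a single display-independent rendering interface with display-specific backends) can be sketched as follows. This is not the SpatialGL API; all class and method names are invented for illustration:

    from abc import ABC, abstractmethod

    class SpatialRenderer(ABC):
        """Illustrative display-independent rendering interface: one way to express
        the 'one interface, many display methods' idea described above."""
        @abstractmethod
        def submit(self, scene):
            """Accept a display-agnostic scene description."""
        @abstractmethod
        def present(self):
            """Produce output in whatever form this display type needs."""

    class MultiplanarRenderer(SpatialRenderer):
        def submit(self, scene):
            self.slices = [f"slice {z} of {scene}" for z in range(3)]   # render per depth plane
        def present(self):
            print("sending", len(self.slices), "slices to a volumetric frame buffer")

    class MultiviewRenderer(SpatialRenderer):
        def submit(self, scene):
            self.views = [f"view {v} of {scene}" for v in range(8)]     # render per viewpoint
        def present(self):
            print("interleaving", len(self.views), "views for a multiview panel")

    for backend in (MultiplanarRenderer(), MultiviewRenderer()):
        backend.submit("demo scene")
        backend.present()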

  20. Using OSG Computing Resources with (iLC)Dirac

    NASA Astrophysics Data System (ADS)

    Sailer, A.; Petric, M.; CLICdp Collaboration

    2017-10-01

    CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments at the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were developed. Not only is the usage of these types of computing elements now completely transparent for all DIRAC instances, which makes DIRAC a flexible solution for OSG-based virtual organisations, but it also allows LCG Grid Sites to move to the HTCondor-CE software without shutting DIRAC-based VOs out of their site. In these proceedings we detail how we interfaced the DIRAC system to the HTCondor-CE and Globus computing elements, explain the encountered obstacles and the solutions developed, and describe how the linear collider community uses resources in the OSG.
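    The wrapper-class approach can be illustrated generically. The sketch below is not DIRAC's actual ComputingElement code; the class name and submission commands are placeholders showing only how one entry point can hide which middleware receives the job:

    class ComputingElementWrapper:
        """Sketch of the wrapper-class idea only: a single submit_job entry point
        hiding which grid middleware actually receives the job. Names and commands
        are hypothetical, not DIRAC's real ComputingElement classes."""
        def __init__(self, name, build_submit_cmd):
            self.name = name
            self.build_submit_cmd = build_submit_cmd

        def submit_job(self, job_description):
            cmd = self.build_submit_cmd(job_description)
            print(f"[{self.name}] would run: {' '.join(cmd)}")  # real code would execute the command
            return {"ce": self.name, "status": "Submitted"}

    backends = [
        ComputingElementWrapper("HTCondor-CE", lambda jd: ["htcondor-ce-submit", jd]),  # placeholder command
        ComputingElementWrapper("Globus-GRAM", lambda jd: ["gram-submit", jd]),         # placeholder command
    ]
    for ce in backends:
        ce.submit_job("pilot.jdl")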

  1. A new architecture for enterprise information systems.

    PubMed

    Covvey, H D; Stumpf, J J

    1999-01-01

    Irresistible economic and technical forces are forcing healthcare institutions to develop regionalized services such as consolidated or virtual laboratories. Technical realities, such as the lack of an enabling enterprise-level information technology (IT) integration infrastructure, the existence of legacy systems, and non-existent or embryonic enterprise-level IT services organizations, are delaying or frustrating the achievement of the desired configuration of shared services. On attempting to address this matter, we discover that the state of the art in integration technology is not wholly adequate, and itself becomes a barrier to the full realization of shared healthcare services. In this paper we report new work from the field of Co-operative Information Systems that proposes a new architecture of systems that are intrinsically cooperation-enabled, and we extend this architecture to both the regional and national scales.

  2. Making interdisciplinary solid Earth modeling and analysis tools accessible in a diverse undergraduate and graduate classroom

    NASA Astrophysics Data System (ADS)

    Becker, T. W.

    2011-12-01

    I present results from ongoing, NSF-CAREER funded educational and research efforts that center around making numerical tools in seismology and geodynamics more accessible to a broader audience. The goal is not only to train students in quantitative, interdisciplinary research, but also to make methods more easily accessible to practitioners across disciplines. I describe the two main efforts that were funded, the Solid Earth Research and Teaching Environment (SEATREE, geosys.usc.edu/projects/seatree/), and a new Numerical Methods class. SEATREE is a modular and user-friendly software framework to facilitate using solid Earth research tools in the undergraduate and graduate classroom and for interdisciplinary, scientific collaboration. We use only open-source software, and most programming is done in the Python computer language. We strive to make use of modern software design and development concepts while remaining compatible with traditional scientific coding and existing, legacy software. Our goals are to provide a fully contained, yet transparent package that lets users operate in an easy, graphically supported "black box" mode, while also allowing them to look under the hood, for example to conduct numerous forward models to explore parameter space. SEATREE currently has several implemented modules, including on global mantle flow, 2D phase velocity tomography, and 2D mantle convection, and was used at the University of Southern California, Los Angeles, and at a 2010 CIDER summer school tutorial. SEATREE was developed in collaboration with engineering and computer science undergraduate students, some of whom have gone on to work in Earth Science projects. In the long run, we envision SEATREE contributing to new ways of sharing scientific research and making (numerical) experiments truly reproducible again. The other project is a set of lecture notes and Matlab exercises on Numerical Methods in solid Earth, focusing on finite difference and element methods. The class has been taught several times at USC to a broad audience of Earth science students with very diverse levels of exposure to math and physics. It is our goal to bring everyone up to speed and empower students, and we have seen structural geology students with very little exposure to math go on to construct their own numerical models of pTt-paths in a core-complex setting. This exemplifies the goal of teaching students to both be able to put together simple numerical models from scratch, and, perhaps more importantly, to truly understand the basic concepts, capabilities, and pitfalls of the more powerful community codes that are being increasingly used. SEATREE and the Numerical Methods class material are freely available at geodynamics.usc.edu/~becker.
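    The kind of exercise such a Numerical Methods class builds toward can be illustrated with explicit finite differences for 1-D heat diffusion. This is a standard textbook example rather than material from the class notes, and the parameters are chosen only to satisfy the explicit stability limit:

    import numpy as np

    # Explicit 1-D heat diffusion, du/dt = kappa * d2u/dx2: forward time, centered space.
    nx, kappa, L = 101, 1e-6, 1.0          # nodes, diffusivity (m^2/s), domain length (m)
    dx = L / (nx - 1)
    dt = 0.4 * dx**2 / kappa               # below the explicit stability limit dt <= dx^2 / (2 * kappa)

    u = np.zeros(nx)
    u[nx // 2] = 1.0                       # initial temperature spike in the middle

    for _ in range(500):
        u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])  # interior update
        u[0] = u[-1] = 0.0                 # fixed-temperature boundaries

    print("peak temperature after diffusion:", u.max())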

  3. Digital Actuator Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ken Thomas; Ted Quinn; Jerry Mauck

    There are significant developments underway in new types of actuators for power plant active components. Many of these make use of digital technology to provide a wide array of benefits in performance of the actuators and in reduced burden to maintain them. These new product offerings have gained considerable acceptance in use in process plants. In addition, they have been used in conventional power generation very successfully. This technology has been proven to deliver the benefits promised and substantiate the claims of improved performance. The nuclear industry has been reluctant to incorporate digital actuator technology into nuclear plant designs due to a number of concerns. These could be summarized as cost, regulatory uncertainty, and a certain comfort factor with legacy analog technology. The replacement opportunity for these types of components represents a decision point for whether to invest in more modern technology that would provide superior operational and maintenance benefits. Yet, the application of digital technology has been problematic for the nuclear industry, due to qualification and regulatory issues. With some notable exceptions, the result has been a continuing reluctance to undertake the risks and uncertainties of implementing digital actuator technology when replacement opportunities present themselves. Rather, utilities would typically prefer to accept the performance limitations of the legacy analog actuator technologies to avoid impacts to project costs and schedules. The purpose of this report is to demonstrate that the benefits of digital actuator technology can be significant in terms of plant performance and that it is worthwhile to address the barriers currently holding back the widespread development and use of this technology. It addresses two important objectives in pursuit of the beneficial use of digital actuator technology for nuclear power plants: 1. To demonstrate the benefits of digital actuator technology over legacy analog technology in both quantitative and qualitative ways. 2. To recognize and address the added difficulty of digital technology qualification, especially in regard to software common cause failure (SCCF), that is introduced by the use of digital actuator technology.

  4. The Legacy Project: A Case Study of Civic Capacity Building and Transformative Educational Leadership in a Community-Based Academic Enrichment Program

    ERIC Educational Resources Information Center

    Didlick-Davis, Celeste R.

    2016-01-01

    This study examines how a grassroots educational enrichment program in a small urban economically depressed area builds and uses civic capacity. Using qualitative data collected through a case study of the Legacy Academic Enrichment program in Middletown, Ohio, I identify factors that make Legacy sustainable and successful in a community that has…

  5. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    NASA Technical Reports Server (NTRS)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling, and through simulation predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop large-scale, detailed simulations for the analysis and design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling, the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at an appropriate level of fidelity, require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase the codes' reusability. Aviation Safety's modeling and simulation use in characterizing fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment for programmers to easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed for evaluation of this technology's use over the next two to three years.

  6. A Comparison of Van Allen Belt Radiation Environment Modeling Programs: AE8/AP8 Legacy, AE9/AP9, and SPENVIS

    NASA Technical Reports Server (NTRS)

    Reed, Evan; Pellish, Jonathan

    2016-01-01

    In the space surrounding Earth there exists an active radiation environment consisting mostly of electrons and protons that have been trapped by Earth's magnetic field. This radiation, also known as the Van Allen Belts, has the potential to damage man-made satellites in orbit; thus, proper precautions must be taken to shield NASA assets from this phenomenon. Data on the Van Allen Belts have been collected continuously by a multitude of space-based instruments since the beginning of space exploration. Subsequently, using theory to fill in the gaps in the collected data, computer models have been developed that take in the orbital information of a hypothetical mission and output the expected particle fluence and flux for that orbit. However, as new versions of the modeling system are released, users are left wondering how the new version differs from the old. Therefore, we performed a comparison of three different editions of the modeling system: AE8/AP8 (legacy), which is included in the AE9/AP9 graphical user interface as an option for one's calculations; AE9/AP9; and the Space Environment Information System (SPENVIS), an online form of AE8/AP8 developed by NASA and the European Space Agency that changed the code to allow the program to extrapolate data to predict fluence and flux at higher energies. Although this evaluation is still ongoing, it is predicted that the AE8/AP8 (legacy) and SPENVIS versions will have identical outputs with the exception of the extended energy levels from SPENVIS, while AE9/AP9 will provide different fluences than AE8/AP8 based on additional magnetic field descriptions and on-orbit data.

  7. The legacy of fossil fuels.

    PubMed

    Armaroli, Nicola; Balzani, Vincenzo

    2011-03-01

    Currently, over 80% of the energy used by mankind comes from fossil fuels. Harnessing coal, oil and gas, the energy resources contained in the store of our spaceship, Earth, has prompted a dramatic expansion in energy use and a substantial improvement in the quality of life of billions of individuals in some regions of the world. Powering our civilization with fossil fuels has been very convenient, but now we know that it entails severe consequences. We treat fossil fuels as a resource that anyone anywhere can extract and use in any fashion, and Earth's atmosphere, soil and oceans as a dump for their waste products, including more than 30 Gt/y of carbon dioxide. At present, the environmental legacy, rather than the size of exploitable reserves, is the most dramatic problem posed by the relentless increase in global demand for fossil fuels. Harmful effects on the environment and human health, usually not incorporated into the pricing of fossil fuels, include immediate and short-term impacts related to their discovery, extraction, transportation, distribution, and burning, as well as climate change impacts that are spread over time to future generations or over space to the entire planet. In this essay, several aspects of the fossil fuel legacy are discussed, such as alteration of the carbon cycle, the rise of carbon dioxide and its measurement, the greenhouse effect, anthropogenic climate change, air pollution and human health, geoengineering proposals, land and water degradation, economic problems, indirect effects on society, and the urgent need for regulatory efforts and related actions to promote a gradual transition out of the fossil fuel era. While manufacturing sustainable solar fuels appears to be a longer-term prospect, alternative energy sources already exist that have the potential to replace fossil fuels as feedstocks for electricity production. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Migration of Older to New Digital Control Systems in Nuclear Power Plant Main Control Rooms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, Casey Robert; Joe, Jeffrey Clark

    The United States (U.S.) Department of Energy (DOE) Office of Nuclear Energy (NE) has the primary mission to advance nuclear power by resolving socio-technical issues through research and development (R&D). One DOE-NE activity supporting this mission is the Light Water Reactor Sustainability (LWRS) program. LWRS has the overall objective to sustain the operation of existing commercial nuclear power plants (NPPs) through conducting R&D across multiple “pathways,” or R&D focus areas. The Advanced Instrumentation, Information, and Control (II&C) Systems Technologies pathway conducts targeted R&D to address aging and reliability concerns with the legacy instrumentation and control (I&C) and related information systems in operating U.S. NPPs. This work involves (1) ensuring that legacy analog II&C systems are not life-limiting issues for the LWR fleet, and (2) implementing digital II&C technology in a manner that enables broad innovation and business improvement in the NPP operating model. Under the LWRS Advanced II&C pathway, Human Factors experts at Idaho National Laboratory (INL) have been conducting R&D in support of NPP main control room (MCR) modernization activities. Work in prior years has focused on migrating analog I&C systems to new digital I&C systems. In fiscal year 2016 (FY16), one new focus area for this research is migrating older digital I&C systems to new and advanced digital I&C systems. This report summarizes a plan for conducting a digital-to-digital migration of a legacy digital I&C system to a new digital I&C system in support of control room modernization activities.

  9. Case Study of Using High Performance Commercial Processors in Space

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Olivas, Zulema

    2009-01-01

    The purpose of the Space Shuttle Cockpit Avionics Upgrade project (1999-2004) was to reduce crew workload and improve situational awareness. The upgrade was to augment the Shuttle avionics system with new hardware and software. A major success of this project was the validation of the hardware architecture and software design. This was significant because the project incorporated new technology and approaches for the development of human-rated space software. An early version of this system was tested at the Johnson Space Center for one month by teams of astronauts. The results were positive, but NASA eventually cancelled the project towards the end of the development cycle. The goal to reduce crew workload and improve situational awareness resulted in the need for high performance Central Processing Units (CPUs). The CPU selected was from the PowerPC family, a reduced instruction set computer (RISC) line known for its high performance. However, the requirement for radiation tolerance resulted in the re-evaluation of the selected family member of the PowerPC line. Radiation testing revealed that the originally selected processor (PowerPC 7400) was too soft to meet mission objectives, and an effort was established to perform trade studies and performance testing to determine a feasible candidate. At that time, the PowerPC RAD750s were radiation tolerant but did not meet the required performance needs of the project. Thus, the final solution was to select the PowerPC 7455. This processor did not have a radiation tolerant version, but had some ability to detect failures. However, its cache tags did not provide parity, and thus the project incorporated a software strategy to detect radiation failures. The strategy was to incorporate dual paths for software generating commands to the legacy Space Shuttle avionics to prevent failures due to the softness of the upgraded avionics.

  10. Case Study of Using High Performance Commercial Processors in a Space Environment

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Olivas, Zulema

    2009-01-01

    The purpose of the Space Shuttle Cockpit Avionics Upgrade project was to reduce crew workload and improve situational awareness. The upgrade was to augment the Shuttle avionics system with new hardware and software. A major success of this project was the validation of the hardware architecture and software design. This was significant because the project incorporated new technology and approaches for the development of human-rated space software. An early version of this system was tested at the Johnson Space Center for one month by teams of astronauts. The results were positive, but NASA eventually cancelled the project towards the end of the development cycle. The goal to reduce crew workload and improve situational awareness resulted in the need for high performance Central Processing Units (CPUs). The CPU selected was from the PowerPC family, a reduced instruction set computer (RISC) line known for its high performance. However, the requirement for radiation tolerance resulted in the re-evaluation of the selected family member of the PowerPC line. Radiation testing revealed that the originally selected processor (PowerPC 7400) was too soft to meet mission objectives, and an effort was established to perform trade studies and performance testing to determine a feasible candidate. At that time, the PowerPC RAD750s were radiation tolerant but did not meet the required performance needs of the project. Thus, the final solution was to select the PowerPC 7455. This processor did not have a radiation tolerant version, but fared better than the 7400 in the ability to detect failures. However, its cache tags did not provide parity, and thus the project incorporated a software strategy to detect radiation failures. The strategy was to incorporate dual paths for software generating commands to the legacy Space Shuttle avionics to prevent failures due to the softness of the upgraded avionics.
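
    The dual-path strategy described in the two records above can be summarized with a small, hedged sketch: compute the same command through two independently coded paths and issue it only when both results agree. The code below is illustrative only; the function names and state representation are invented and do not reproduce the Shuttle flight software.

```python
# Illustrative sketch (not the Shuttle Cockpit Avionics Upgrade flight code) of
# the dual-path idea: compute the same command through two independently coded
# paths and issue it only if both results agree, so a single radiation-induced
# upset in one path is caught rather than propagated to the legacy avionics.
def command_path_a(state):
    # Hypothetical primary computation of a command value.
    return round(state["target"] - state["current"], 6)

def command_path_b(state):
    # Independently written computation of the same quantity.
    delta = state["target"]
    delta -= state["current"]
    return round(delta, 6)

def issue_command(state, send):
    a = command_path_a(state)
    b = command_path_b(state)
    if a == b:
        send(a)          # both paths agree: the command is trusted and sent
        return True
    return False         # mismatch: suspected upset, withhold the command

if __name__ == "__main__":
    issued = issue_command({"target": 5.0, "current": 4.2}, print)
    print("issued" if issued else "withheld")
```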

  11. Spacecraft Onboard Interface Services: Current Status and Roadmap

    NASA Astrophysics Data System (ADS)

    Prochazka, Marek; Lopez Trescastro, Jorge; Krueger, Sabine

    2016-08-01

    Spacecraft Onboard Interface Services (SOIS) is a set of CCSDS standards defining communication stack services to interact with hardware equipment onboard spacecraft. In 2014 ESA kicked off three parallel activities to critically review the SOIS standards, take legacy spacecraft flight software (FSW), make it compliant with a preselected subset of SOIS standards, and perform a performance and architecture assessment. As part of the three parallel activities, led by Airbus DS Toulouse, OHB Bremen, and Thales Alenia Space Cannes respectively, each contractor was to provide feedback to ESA and CCSDS and to propose a roadmap for the transition towards an operational FSW system fully compliant with the applicable SOIS standards. The objective of the paper is twofold: firstly, to summarise the main results of the three parallel activities and, secondly, based on those results, to propose a roadmap for the future.

  12. A Web interface generator for molecular biology programs in Unix.

    PubMed

    Letondal, C

    2001-01-01

    Almost all users encounter problems using sequence analysis programs. Not only are they difficult to learn because of their parameters, syntax, and semantics, but many of them also differ from one another. That is why we have developed a Web interface generator for more than 150 molecular biology command-line driven programs, including: phylogeny, gene prediction, alignment, RNA, DNA and protein analysis, motif discovery, structure analysis, and database searching programs. The generator uses XML as a high-level description language for the legacy software parameters. Its aim is to provide users with the equivalent of a basic Unix environment, with program combination, customization, and basic scripting through macro registration. The program has been used for three years by about 15,000 users throughout the world; it has recently been installed on other sites and evaluated as a standard user interface for EMBOSS programs.
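
    The core idea above is that an XML description of a legacy command-line program's parameters can drive both form generation and command construction. The sketch below illustrates one possible shape of that approach; the XML schema and the example program entry are invented for illustration and do not reproduce the published generator's actual format.

```python
# Sketch of driving a legacy command-line program from an XML parameter
# description. The schema and the example program entry are invented for
# illustration; they do not reproduce the published generator's actual format.
import xml.etree.ElementTree as ET

DESCRIPTION = """
<program name="needle">
  <parameter name="asequence" flag="-asequence" type="file"/>
  <parameter name="bsequence" flag="-bsequence" type="file"/>
  <parameter name="gapopen"   flag="-gapopen"   type="float" default="10.0"/>
</program>
"""

def build_command(xml_text, values):
    """Turn the XML description plus user-supplied values into a command line."""
    root = ET.fromstring(xml_text)
    parts = [root.get("name")]
    for param in root.findall("parameter"):
        value = values.get(param.get("name"), param.get("default"))
        if value is not None:
            parts += [param.get("flag"), str(value)]
    return " ".join(parts)

if __name__ == "__main__":
    print(build_command(DESCRIPTION, {"asequence": "a.fasta", "bsequence": "b.fasta"}))
    # -> needle -asequence a.fasta -bsequence b.fasta -gapopen 10.0
```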

  13. Characterisation of imperial college reactor centre legacy waste using gamma-ray spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuhaimi, Alif Imran Mohd

    Waste characterisation is a principal component of a waste management strategy. The characterisation includes identification of the chemical, physical, and radiochemical parameters of radioactive waste. Failure to determine specific waste properties may result in sentencing waste packages which are not compliant with the regulations for long term storage or disposal. This project involved measurement of the intensity and energy of gamma photons which may be emitted by radioactive waste generated during decommissioning of the Imperial College Reactor Centre (ICRC). The measurement will use a High Purity Germanium (HPGe) gamma-ray detector and ISOTOPIC-32 V4.1 as the analyser. In order to ensure the measurements provide reliable results, two quality control (QC) measurements using different matrices have been conducted. The results from the QC measurements were used to determine the accuracy of the ISOTOPIC software.

  14. Legacy to Industry 4.0: A Profibus Sniffer

    NASA Astrophysics Data System (ADS)

    Tsegaye Mamo, Fesseha; Sikora, Axel; Rathfelder, Christoph

    2017-07-01

    Legacy industrial communication protocols have proved robust and functional. During the last decades, the industry has invented completely new or advanced versions of these legacy communication solutions. However, even with the high adoption rate of the new solutions, the majority of industrial applications still run on legacy, mostly fieldbus-related, technologies. Profibus is one of those technologies that keeps growing in the market, albeit with slower market growth in recent years. A retrofit technology is therefore fundamental: one that would enable these technologies to connect to the Internet of Things and to utilize the ever-growing potential of data analysis, predictive maintenance, or cloud-based applications, while at the same time not changing a running system.

  15. Exploring the "legacy" of the Tuskegee Syphilis Study: a follow-up study from the Tuskegee Legacy Project.

    PubMed

    Katz, Ralph V; Green, B Lee; Kressin, Nancy R; James, Sherman A; Wang, Min Qi; Claudio, Cristina; Russell, Stephanie Luise

    2009-02-01

    The purpose of this follow-up 2003 3-City Tuskegee Legacy Project (TLP) Study was to validate or refute our prior findings from the 1999-2000 4 City TLP Study, which found no evidence to support the widely acknowledged "legacy" of the Tuskegee Syphilis Study (TSS), ie, that blacks are reluctant to participate in biomedical studies due to their knowledge of the TSS. The TLP Questionnaire was administered in this random-digit-dial telephone survey to a stratified random sample of 1162 black, white, and Puerto Rican Hispanic adults in 3 different US cities. The findings from this current 3-City TLP Study fail to support the widely acknowledged "legacy" of the TSS, as awareness of the TSS was not statistically associated with the willingness to participate in biomedical studies. These findings, being in complete agreement with our previous findings from our 1999-2000 4-City TLP, validate those prior findings.

  16. Spitzer Lensing Cluster Legacy Survey

    NASA Astrophysics Data System (ADS)

    Soifer, Tom; Armus, Lee; Bradac, Marusa; Capak, Peter; Coe, Dan; Siana, Brian; Treu, Tommaso; Vieira, Joaquin

    2015-11-01

    Cluster-scale gravitational lenses act as cosmic telescopes, enabling the study of otherwise unobservable galaxies. They are critical in answering questions such as what the star formation history is at z > 7 and whether these galaxies can reionize the Universe. Accurate knowledge of stellar masses, ages, and star formation rates at this epoch requires measuring both rest-frame UV and optical light, which only Spitzer and HST can probe at z > 7-11 for a large enough sample of typical galaxies. To address this cosmic puzzle, we propose a program that obtains shallow Spitzer/IRAC imaging of a large sample of cluster lenses, followed by deep imaging of those clusters with the largest number of z > 7 candidate galaxies. This proposal will be a valuable Legacy complement to the existing IRAC deep surveys, and it will open up a new parameter space by probing the ordinary yet magnified population. Furthermore, it will enable measurements of the stellar mass of the galaxy cluster population, thereby allowing us to chart the build-up of the cluster red sequence from z ~ 1 to the present and to determine the physical processes responsible for this stellar mass growth.

  17. The International Polar year 2007-2008; the Arctic human health legacy.

    PubMed

    Parkinson, Alan J

    2007-01-01

    Life expectancy in Arctic populations has greatly improved over the last 50 years. Much of this improvement can be attributed to health research that has resulted in a reduction in morbidity and mortality from infectious diseases, such as tuberculosis, and the vaccine-preventable diseases of childhood. However, despite these improvements in the health indicators of Arctic residents, infant mortality remains higher and life expectancy lower among indigenous Arctic residents of the US Arctic, northern Canada, and Greenland when compared to Arctic residents of the Nordic countries. The International Polar Year (IPY) represents a unique opportunity to focus world attention on Arctic human health and to further stimulate Circumpolar cooperation on emerging Arctic human health concerns. The Arctic Human Health Initiative (AHHI) is an Arctic Council IPY initiative that aims to build and expand on existing Arctic Council and International Union for Circumpolar Health (IUCH) human health research activities. The human health legacy of the IPY will be increased visibility of the human health concerns of Arctic communities, revitalization of cooperative Arctic human health research focused on those concerns, the development of health policies based on research findings, and the subsequent implementation of appropriate interventions, prevention and control measures at the community level.

  18. Euro 2012 European Football Championship Finals: planning for a health legacy.

    PubMed

    Smallwood, Catherine A H; Arbuthnott, Katherine G; Banczak-Mysiak, Barbara; Borodina, Mariya; Coutinho, Ana Paula; Payne-Hallström, Lara; Lipska, Elzbieta; Lyashko, Viktor; Miklasz, Miroslaw; Miskiewicz, Paulina; Nitzan, Dorit; Pokanevych, Igor; Posobkiewicz, Marek; Rockenschaub, Gerald; Sadkowska-Todys, Malgorzata; Sinelnik, Svetlana; Smiley, Daniel; Tomialoic, Rysard; Yurchenko, Volodimir; Memish, Ziad A; Heymann, David; Endericks, Tina; McCloskey, Brian; Zumla, Alimuddin; Barbeschi, Maurizio

    2014-06-14

    The revised International Health Regulations offer a framework that can be used by host countries to organise public health activities for mass gatherings. From June 8 to July 1, 2012, Poland and Ukraine jointly hosted the Union of European Football Associations European Football Championship Finals (Euro 2012). More than 8 million people from around the world congregated to watch the games. Host countries and international public health agencies planned extensively to assess and build capacity in the host countries and to develop effective strategies for dissemination of public health messages. The effectiveness of public health services was maximised through rapid sharing of information between parties, early use of networks of experienced individuals, and the momentum of existing national health programmes. Organisers of future mass gatherings for sporting events should share best practice and their experiences through the WHO International Observer Program. Research about the behaviour of large crowds is needed for crowd management, and the evidence base should be translated into practice. A framework to measure and evaluate the legacy of Euro 2012 is needed, based on the experiences and the medium-term and long-term benefits of the tournament. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. ICT for smart evaluation of vernacular architecture in a stilt-house village

    NASA Astrophysics Data System (ADS)

    Almeida, Filipa; Virtudes, Ana Lídia

    2016-12-01

    Vernacular architecture typologies, such as wooden stilt-houses, are threatened by vulnerability to degradation of their conservation status. This problem is not an exception in Portugal, where the few remaining examples have been neglected, with the disappearance or abandonment of almost all buildings, damaging architectural and urban spatial features. This legacy is rapidly disappearing, weakening the European cultural map. This research presents the results of a smart evaluation method using an ICT (information and communication technology) platform designed to assess wooden stilt-houses with respect to their conservation status. The platform was used in the five remaining stilt-house villages still existing in Portugal, comprising about 90 buildings and 300 inhabitants, located along the banks of the Tagus river. This article refers to one of these case studies, the village of Escaroupim, which was chosen because it is the most urban space among them. On one hand, the results are an exhaustive survey of vernacular buildings, useful as a guideline for spatial strategies and instruments to protect this legacy. On the other hand, the method can be used on other similar wooden buildings, to check their conservation status and therefore to define the best rehabilitation actions.

  20. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
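
    Reuse measurement of the kind described above can be illustrated with a toy metric; the sketch below shows a simple size-weighted reuse-level calculation and is not the metric suite derived in the cited project.

```python
# Toy illustration of one kind of reuse measurement (not the metric suite derived
# in the cited project): the size-weighted share of a system taken from existing,
# reused components rather than newly written code.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    loc: int        # lines of code
    reused: bool    # True if the component came from an existing library

def reuse_level(components):
    """Fraction of total code size contributed by reused components."""
    total = sum(c.loc for c in components)
    reused = sum(c.loc for c in components if c.reused)
    return reused / total if total else 0.0

if __name__ == "__main__":
    system = [
        Component("parser", 1200, True),
        Component("scheduler", 800, False),
        Component("logging", 300, True),
    ]
    print(f"size-weighted reuse level: {reuse_level(system):.2f}")  # 0.65
```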

  1. Advanced Test Reactor Core Modeling Update Project Annual Report for Fiscal Year 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David W. Nigg; Devin A. Steuhm

    2011-09-01

    Legacy computational reactor physics software tools and protocols currently used for support of Advanced Test Reactor (ATR) core fuel management and safety assurance and, to some extent, experiment management are obsolete, inconsistent with the state of modern nuclear engineering practice, and are becoming increasingly difficult to properly verify and validate (V&V). Furthermore, the legacy staff knowledge required for application of these tools and protocols from the 1960s and 1970s is rapidly being lost due to staff turnover and retirements. In 2009 the Idaho National Laboratory (INL) initiated a focused effort to address this situation through the introduction of modern high-fidelity computational software and protocols, with appropriate V&V, within the next 3-4 years via the ATR Core Modeling and Simulation and V&V Update (or 'Core Modeling Update') Project. This aggressive computational and experimental campaign will have a broad strategic impact on the operation of the ATR, both in terms of improved computational efficiency and accuracy for support of ongoing DOE programs as well as in terms of national and international recognition of the ATR National Scientific User Facility (NSUF). The ATR Core Modeling Update Project, targeted for full implementation in phase with the anticipated ATR Core Internals Changeout (CIC) in the 2014 time frame, began during the last quarter of Fiscal Year 2009, and has just completed its first full year. Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (SCALE, KENO-6, HELIOS, NEWT, and ATTILA) have been installed at the INL under various permanent sitewide license agreements and corresponding baseline models of the ATR and ATRC are now operational, demonstrating the basic feasibility of these code packages for their intended purpose. Furthermore, a capability for rigorous sensitivity analysis and uncertainty quantification based on the TSUNAMI system is being implemented and initial computational results have been obtained. This capability will have many applications in 2011 and beyond as a tool for understanding the margins of uncertainty in the new models as well as for validation experiment design and interpretation. Finally, we note that although full implementation of the new computational models and protocols will extend over a period of 3-4 years as noted above, interim applications in the much nearer term have already been demonstrated. In particular, these demonstrations included an analysis that was useful for understanding the cause of some issues in December 2009 that were triggered by a larger than acceptable discrepancy between the measured excess core reactivity and a calculated value that was based on the legacy computational methods. As the Modeling Update project proceeds we anticipate further such interim, informal, applications in parallel with formal qualification of the system under the applicable INL Quality Assurance procedures and standards.

  2. Domain management OSSs: bridging the gap between legacy and standards-based network management systems

    NASA Astrophysics Data System (ADS)

    Lemley, Todd A.

    1996-11-01

    The rapid change in the telecommunications environment is forcing carriers to re-assess not only their service offerings, but also their network management philosophy. The competitive carrier environment has taken away the luxury of throwing technology at a problem by using legacy and proprietary systems and architectures. A more flexible management environment is necessary to effectively gain, and maintain, operating margins in the new market era. Competitive forces are driving change, which gives carriers more choices than those available in legacy and standards-based solutions alone. However, creating an operational support system (OSS) that bridges this gap between legacy and standards-based systems has become as dynamic as the services it supports. A philosophy which helps to integrate the legacy and standards-based systems is domain management. Domain management relates to a specific service or market 'domain,' and its associated operational support requirements. It supports a company's definition of its business model, which drives the definition of each domain. It also attempts to maximize current investment while injecting newly available technology in a practical approach. The following paragraphs offer an overview of legacy systems, standards-based philosophy, and the potential of domain management to help bridge the gap between the two types of systems.

  3. Seqenv: linking sequences to environments through text mining.

    PubMed

    Sinclair, Lucas; Ijaz, Umer Z; Jensen, Lars Juhl; Coolen, Marco J L; Gubry-Rangin, Cecile; Chroňáková, Alica; Oulas, Anastasis; Pavloudi, Christina; Schnetzer, Julia; Weimann, Aaron; Ijaz, Ali; Eiler, Alexander; Quince, Christopher; Pafilis, Evangelos

    2016-01-01

    Understanding the distribution of taxa and associated traits across different environments is one of the central questions in microbial ecology. High-throughput sequencing (HTS) studies are presently generating huge volumes of data to address this biogeographical topic. However, these studies are often focused on specific environment types or processes, leading to the production of individual, unconnected datasets. The large amounts of legacy sequence data with associated metadata that exist can be harnessed to better place the genetic information found in these surveys into a wider environmental context. Here we introduce a software program, seqenv, to carry out precisely such a task. It automatically performs similarity searches of short sequences against the "nt" nucleotide database provided by NCBI and, for every hit, extracts the textual metadata field if it is available. After collecting all the isolation sources from all the search results, we run a text mining algorithm to identify and parse words that are associated with the Environmental Ontology (EnvO) controlled vocabulary. This, in turn, enables us to determine both in which environments individual sequences or taxa have previously been observed and, by weighted summation of those results, to summarize complete samples. We present two demonstrative applications of seqenv to a survey of ammonia oxidizing archaea as well as to a plankton paleome dataset from the Black Sea. These demonstrate the ability of the tool to reveal novel patterns in HTS and its utility in the fields of environmental source tracking, paleontology, and studies of microbial biogeography. To install seqenv, go to: https://github.com/xapple/seqenv.
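
    The summarisation step described above (collect isolation-source text per hit, then tally matches against a controlled vocabulary) can be sketched in a few lines. The sketch below is not the seqenv implementation; the vocabulary is a tiny invented stand-in for EnvO terms and the hit strings are made up.

```python
# Minimal sketch of the summarisation step described above (not the actual seqenv
# implementation): given the isolation-source text retrieved for each database hit,
# count matches against a small controlled vocabulary. The vocabulary here is a
# tiny invented stand-in for EnvO terms.
from collections import Counter
import re

VOCABULARY = {"soil", "marine", "freshwater", "sediment", "rhizosphere"}

def summarize_environments(isolation_sources):
    counts = Counter()
    for text in isolation_sources:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in VOCABULARY:
                counts[word] += 1
    return counts

if __name__ == "__main__":
    hits = [
        "agricultural soil, Denmark",
        "marine sediment core, Black Sea",
        "freshwater lake, surface water",
    ]
    print(summarize_environments(hits))
```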

  4. Global environmental change effects on plant community composition trajectories depend upon management legacies.

    PubMed

    Perring, Michael P; Bernhardt-Römermann, Markus; Baeten, Lander; Midolo, Gabriele; Blondeel, Haben; Depauw, Leen; Landuyt, Dries; Maes, Sybryn L; De Lombaerde, Emiel; Carón, Maria Mercedes; Vellend, Mark; Brunet, Jörg; Chudomelová, Markéta; Decocq, Guillaume; Diekmann, Martin; Dirnböck, Thomas; Dörfler, Inken; Durak, Tomasz; De Frenne, Pieter; Gilliam, Frank S; Hédl, Radim; Heinken, Thilo; Hommel, Patrick; Jaroszewicz, Bogdan; Kirby, Keith J; Kopecký, Martin; Lenoir, Jonathan; Li, Daijiang; Máliš, František; Mitchell, Fraser J G; Naaf, Tobias; Newman, Miles; Petřík, Petr; Reczyńska, Kamila; Schmidt, Wolfgang; Standovár, Tibor; Świerkosz, Krzysztof; Van Calster, Hans; Vild, Ondřej; Wagner, Eva Rosa; Wulf, Monika; Verheyen, Kris

    2018-04-01

    The contemporary state of functional traits and species richness in plant communities depends on legacy effects of past disturbances. Whether temporal responses of community properties to current environmental changes are altered by such legacies is, however, unknown. We expect global environmental changes to interact with land-use legacies given different community trajectories initiated by prior management, and subsequent responses to altered resources and conditions. We tested this expectation for species richness and functional traits using 1814 survey-resurvey plot pairs of understorey communities from 40 European temperate forest datasets, syntheses of management transitions since the year 1800, and a trait database. We also examined how plant community indicators of resources and conditions changed in response to management legacies and environmental change. Community trajectories were clearly influenced by interactions between management legacies from over 200 years ago and environmental change. Importantly, higher rates of nitrogen deposition led to increased species richness and plant height in forests managed less intensively in 1800 (i.e., high forests), and to decreases in forests with a more intensive historical management in 1800 (i.e., coppiced forests). There was evidence that these declines in community variables in formerly coppiced forests were ameliorated by increased rates of temperature change between surveys. Responses were generally apparent regardless of sites' contemporary management classifications, although sometimes the management transition itself, rather than historic or contemporary management types, better explained understorey responses. Main effects of environmental change were rare, although higher rates of precipitation change increased plant height, accompanied by increases in fertility indicator values. Analysis of indicator values suggested the importance of directly characterising resources and conditions to better understand legacy and environmental change effects. Accounting for legacies of past disturbance can reconcile contradictory literature results and appears crucial to anticipating future responses to global environmental change. © 2017 John Wiley & Sons Ltd.

  5. InSAR Scientific Computing Environment

    NASA Technical Reports Server (NTRS)

    Rosen, Paul A.; Sacco, Gian Franco; Gurrola, Eric M.; Zebker, Howard A.

    2011-01-01

    This computing environment is the next generation of geodetic image processing technology for repeat-pass Interferometric Synthetic Aperture Radar (InSAR) sensors, identified by the community as a needed capability to provide flexibility and extensibility in reducing measurements from radar satellites and aircraft to new geophysical products. This software allows users of interferometric radar data the flexibility to process from Level 0 to Level 4 products using a variety of algorithms and for a range of available sensors. There are many radar satellites in orbit today delivering to the science community data of unprecedented quantity and quality, making possible large-scale studies in climate research, natural hazards, and the Earth's ecosystem. The proposed DESDynI mission, now under consideration by NASA for launch later in this decade, would provide time series and multi-image measurements that permit 4D models of Earth surface processes so that, for example, climate-induced changes over time would become apparent and quantifiable. This advanced data processing technology, applied to a global data set such as from the proposed DESDynI mission, enables a new class of analyses at time and spatial scales unavailable using current approaches. This software implements an accurate, extensible, and modular processing system designed to realize the full potential of InSAR data from future missions such as the proposed DESDynI, existing radar satellite data, as well as data from the NASA UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar) and other airborne platforms. The processing approach has been re-thought in order to enable multi-scene analysis by adding new algorithms and data interfaces, to permit user-reconfigurable operation and extensibility, and to capitalize on codes already developed by NASA and the science community. The framework incorporates modern programming methods based on recent research, including object-oriented scripts controlling legacy and new codes, abstraction and generalization of the data model for efficient manipulation of objects among modules, and well-designed module interfaces suitable for command-line execution or GUI programming. The framework is designed to allow user contributions to promote maximum utility and sophistication of the code, creating an open-source community that could extend the framework into the indefinite future.
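
    The "object-oriented scripts controlling legacy and new codes" pattern mentioned above can be sketched generically: each processing step, legacy executable or new module alike, is wrapped in an object with a uniform interface so steps can be composed into a pipeline. The sketch below is not the actual framework code, and the executable names are placeholders.

```python
# Sketch (not the actual framework code) of the "object-oriented scripts controlling
# legacy and new codes" idea: each processing step, whether a legacy executable or a
# new module, is wrapped in an object with a uniform interface so steps can be chained.
# The executable names below are placeholders, not real tools.
import subprocess

class LegacyStep:
    def __init__(self, executable, args, dry_run=True):
        self.executable = executable
        self.args = list(args)
        self.dry_run = dry_run   # default to printing the command instead of running it

    def run(self):
        cmd = [self.executable, *self.args]
        print("step:", " ".join(cmd))
        if not self.dry_run:
            subprocess.run(cmd, check=True)

class Pipeline:
    def __init__(self, steps):
        self.steps = steps

    def run(self):
        for step in self.steps:
            step.run()

if __name__ == "__main__":
    Pipeline([
        LegacyStep("form_interferogram", ["scene1.raw", "scene2.raw"]),
        LegacyStep("unwrap_phase", ["igram.int"]),
    ]).run()
```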

  6. Data Management for a Climate Data Record in an Evolving Technical Landscape

    NASA Astrophysics Data System (ADS)

    Moore, K. D.; Walter, J.; Gleason, J. L.

    2017-12-01

    For nearly twenty years, NASA Langley Research Center's Clouds and the Earth's Radiant Energy System (CERES) Science Team has been producing a suite of data products that forms a persistent climate data record of the Earth's radiant energy budget. Many of the team's physical scientists and key research contributors have been with the team since the launch of the first CERES instrument in 1997. This institutional knowledge is irreplaceable and its longevity and continuity are among the reasons that the team has been so productive. Such legacy involvement, however, can also be a limiting factor. Some CERES scientists-cum-coders might possess skills that were state-of-the-field when they were emerging scientists but may now be outdated with respect to developments in software development best practices and supporting technologies. Both programming languages and processing frameworks have evolved significantly in the past twenty years, and updating one of these factors warrants consideration of updating the other. With the imminent launch of a final CERES instrument and the good health of those in flight, the CERES data record stands to continue far into the future. The CERES Science Team is, therefore, undergoing a re-architecture of its codebase to maintain compatibility with newer data processing platforms and technologies and to leverage modern software development best practices. This necessitates training our staff and consequently presents several challenges, including: Development continues immediately on the next "edition" of research algorithms upon release of the previous edition. How can code be rewritten at the same time that the science algorithms are being updated and integrated? With limited time to devote to training, how can we update the staff's existing skillset without slowing progress or introducing new errors? The CERES Science Team is large and complex, much like the current state of its codebase. How can we identify, in a breadth-wise manner, areas for code improvement across multiple research groups that maintain code with varying semantics but common concepts? In this work, we discuss the successes and pitfalls of this major re-architecture effort and share how we will sustain improvement into the future.

  7. How Well Can Existing Software Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy

    DTIC Science & Technology

    2017-04-06

    Research Hypothesis ... Research Design ... user community and of accommodating advancing software applications by the vendors. Research Design: My approach to this project was to conduct ... design descriptions, requirements specifications, test documentation, interface requirement specifications, product specifications, and software ...

  8. Guidance on Software Maintenance. Final Report. Reports on Computer Science and Technology.

    ERIC Educational Resources Information Center

    Martin, Roger J.; Osborne, Wilma M.

    Based on informal discussions with personnel at selected federal agencies and private sector organizations and on additional research, this publication addresses issues and problems of software maintenance and suggests actions and procedures which can help software maintenance organizations meet the growing demands of maintaining existing systems.…

  9. GIMS—Software for asset market experiments

    PubMed Central

    Palan, Stefan

    2015-01-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality. PMID:26525085

  10. 78 FR 66865 - Acquisition Regulation: Patents, Data, and Copyrights

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... obligations under current law. The proposed changes include policy revisions for computer software developed.... Computer Software DOE's existing Rights in Technical Data-Technology Transfer clause at 970.5227-2 provides mechanisms by which computer software first produced by a DOE contractor may be made available to the public...

  11. DSS command software update

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1980-01-01

    The modifications, additions, and testing results for a version of the Deep Space Station command software, generated for support of the Voyager Saturn encounter, are discussed. The software update requirements included efforts to: (1) recode portions of the software to permit recovery of approximately 2000 words of memory; (2) correct five Voyager Ground Data System liens; (3) provide the capability to automatically turn off the command processor assembly local printer during periods of low activity; and (4) correct anomalies existing in the software.

  12. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  13. Software Defined Network Monitoring Scheme Using Spectral Graph Theory and Phantom Nodes

    DTIC Science & Technology

    2014-09-01

    ... networks is the emergence of software-defined networking (SDN) [1]. SDN has existed for the ... Chapter III for network monitoring. A. SOFTWARE DEFINED NETWORKS: SDNs provide a new and innovative method to simplify network hardware by logically ... and R. Giladi, "Performance analysis of software-defined networking (SDN)," in Proc. of IEEE 21st International Symposium on Modeling, Analysis ...

  14. Computer Science and Technology: Introduction to Software Packages

    DTIC Science & Technology

    1984-04-01

    ... consideration should be given to the acquisition of appropriate software packages to replace or upgrade existing services and to provide services not ... Consequently, there are many companies that produce only software packages, and are committed to providing training, service, and support. These vendors ...

  15. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    NASA Astrophysics Data System (ADS)

    Barr, David R. W.; Dudek, Piotr

    2009-12-01

    We present a software environment for the efficient simulation of cellular processor arrays (CPAs). This software (APRON) is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary, and it can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.
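
    The emulation idea above, a fine-grained array of cells all executing the same instruction in lock-step, can be illustrated with a small NumPy sketch. This is illustrative only and does not reflect APRON's optimised core or its compiler; the neighbourhood-average instruction is an arbitrary example.

```python
# Illustrative emulation of a fine-grained cellular processor array on a commodity
# CPU (not APRON's optimised core): every cell executes the same instruction in
# lock-step, here a four-neighbour average over a toroidal grid.
import numpy as np

def step(grid):
    """One array-wide instruction: replace each cell with its 4-neighbour mean."""
    up    = np.roll(grid,  1, axis=0)
    down  = np.roll(grid, -1, axis=0)
    left  = np.roll(grid,  1, axis=1)
    right = np.roll(grid, -1, axis=1)
    return (up + down + left + right) / 4.0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = rng.random((64, 64))
    for _ in range(10):
        grid = step(grid)
    print(f"mean after 10 steps: {grid.mean():.4f}")
```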

  16. In the right order of brush strokes: a sketch of a software philosophy retrospective.

    PubMed

    Pyshkin, Evgeny

    2014-01-01

    This paper follows a discourse on software recognized as a product of art and human creativity, a discourse that has been progressing probably for as long as software has existed. A retrospective view on the development of computer science and software philosophy is introduced. In so doing we discover parallels between software and various branches of human creative manifestation. Aesthetic properties and the mutual dependency of the form and matter of works of art are examined in their application to software programs. While exploring some philosophical and even artistic reflections on software, we consider an extended comprehension of the technical sciences of programming and software engineering within the realm of the liberal arts.

  17. Software Requirements for the Move to Unix

    NASA Astrophysics Data System (ADS)

    Rees, Paul

    This document provides information concerning the software requirements of each STARLINK site to move entirely to UNIX. It provides a list of proposed UNIX migration deadlines for all sites and lists of software requirements, both STARLINK and non-STARLINK software, which must be met before the existing VMS hardware can be switched off. The information presented in this document is used for the planning of software porting and distribution activities and also for setting realistic migration deadlines for STARLINK sites. The information on software requirements has been provided by STARLINK Site Managers.

  18. From MODFLOW-96 to MODFLOW-2005, ParFlow and Others: Updates and a Workflow for Up- and Out- Conversion

    NASA Astrophysics Data System (ADS)

    Pierce, S. A.; Hardesty Lewis, D.

    2017-12-01

    MODFLOW (MF) has served for decades as a de facto standard for groundwater modelling. Despite successive versions, legacy MF-96 simulations are still commonly encountered, as is the case for many of the groundwater availability models of the State of Texas. Unfortunately, even the existence of converters to MF's newer versions has not necessarily stimulated their adoption, let alone the re-creation of legacy models. This state of affairs may be due to the unfamiliarity of the modeller with the terminal or the FORTRAN programming language, resulting in an inability to address the minor or major bugs, nuances, or limitations in compiling or executing the conversion programs. Here, we present a workflow that addresses the above intricacies while attempting to maintain portability in implementation. This workflow is constructed in the form of a Bash script and, with the geoscience-oriented user in mind, re-presented as a Jupyter notebook. First, one may choose whether the executable will run with POSIX compliance or with a preference towards Bash facilities, both widely adopted by operating systems. In the same vein, it attempts to function within minimal command environments, which reduces dependencies. Finally, it is designed to offer parallelism across as many cores and nodes as necessary, or as few as desired, whether on a personal computer or a supercomputer. Underlying this workflow are patches such that antiquated tools may compile and execute upon modern hardware. Also, fixes to long-standing bugs and limitations in the existing MF converters have been prepared. Specifically, support for the conversion of MF-96 and Horizontal Flow Barrier-coupled simulations has been added. More radically, we have laid the foundations of a conversion utility between MF and a similar modeller, ParFlow. Furthermore, the modular approach followed may extend to an application which inter-operates between arbitrary groundwater simulators. In short, an accessible and portable workflow for up-conversion between MODFLOW versions now avails itself to geoscientists. Updated programs within it may allow for the re-use, in whole or in part, of legacy simulations. Lastly, a generic inter-operator has been established, raising the possibility of significantly easier recycling of groundwater data in the future.
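
    The parallelism described above (fanning a conversion step out across many legacy model directories and all available cores) can be sketched briefly. The sketch below is a minimal illustration, assuming a hypothetical directory layout of MODFLOW-96 models; the converter invocation itself is left as a placeholder comment rather than a real USGS utility call.

```python
# Sketch of the parallel fan-out described above (the converter invocation is a
# placeholder comment, not a real USGS utility call): visit many legacy MODFLOW-96
# model directories and process them concurrently across the available cores.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def convert(model_dir):
    # A real workflow would invoke the conversion program here, e.g. with
    # subprocess.run([...], cwd=model_dir, check=True); this sketch only reports.
    name_file = next(Path(model_dir).glob("*.nam"), None)
    if name_file is None:
        return f"{model_dir}: no name file found, skipped"
    return f"{model_dir}: would convert {name_file.name}"

if __name__ == "__main__":
    root = Path("legacy_models")   # hypothetical directory of legacy models
    models = [str(p) for p in root.iterdir() if p.is_dir()] if root.exists() else []
    with ProcessPoolExecutor() as pool:
        for line in pool.map(convert, models):
            print(line)
```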

  19. Legacy effects of colonial millponds on floodplain sedimentation, bank erosion, and channel morphology, MID-Atlantic, USA

    USGS Publications Warehouse

    Schenk, E.R.; Hupp, C.R.

    2009-01-01

    Many rivers and streams of the Mid-Atlantic Region, United States (U.S.) have been altered by postcolonial floodplain sedimentation (legacy sediment) associated with numerous milldams. Little Conestoga Creek, Pennsylvania, a tributary to the Susquehanna River and the Chesapeake Bay, is one of these streams. Floodplain sedimentation rates, bank erosion rates, and channel morphology were measured annually during 2004-2007 at five sites along a 28-km length of Little Conestoga Creek with nine colonial era milldams (one dam was still in place in 2007). This study was part of a larger cooperative effort to quantify floodplain sedimentation, bank erosion, and channel morphology in a high sediment yielding region of the Chesapeake Bay watershed. Data from the five sites were used to estimate the annual volume and mass of sediment stored on the floodplain and eroded from the banks for 14 segments along the 28-km length of creek. A bank and floodplain reach based sediment budget (sediment budget) was constructed for the 28 km by summing the net volume of sediment deposited and eroded from each segment. Mean floodplain sedimentation rates for Little Conestoga Creek were variable, with erosion at one upstream site (-5 mm/year) to deposition at the other four sites (highest = 11 mm/year) despite over a meter of floodplain aggradation from postcolonial sedimentation. Mean bank erosion rates range between 29 and 163 mm/year among the five sites. Bank height increased 1 m for every 10.6 m of channel width, from upstream to downstream (R2 = 0.79, p < 0.0001) resulting in progressively lowered hydraulic connectivity between the channel and the floodplain. Floodplain sedimentation and bank erosion rates also appear to be affected by the proximity of the segments to one existing milldam, which promotes deposition upstream and scouring downstream. The floodplain and bank along the 28-km reach produced a net mean sediment loss of 5,634 Mg/year for 2004-2007, indicating that bank erosion was exceeding floodplain sedimentation. In particular, the three segments between the existing dam and the confluence with the Conestoga River (32% of the studied reach) account for 97% of the measured net sediment budget. Future research directed at understanding channel equilibria should facilitate efforts to reduce the sediment impacts of dam removal and legacy sediment. © 2009 American Water Resources Association.

  20. U.S. Spacesuit Legacy: Maintaining it for the Future

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; McMann, Joe; Thomas, Ken; Kosmo, Joe; Lewis, Cathleen; Wright, Rebecca; Bitterly, Rose; Olivia, Vladenka Rose

    2013-01-01

    The history of U.S. spacesuit development and its use are rich with information on lessons learned, and constitutes a valuable legacy to those designing spacesuits for the future, as well as to educators, students, and the general public. The genesis of lessons learned is best understood by studying the evolution of past spacesuit programs - how the challenges and pressures of the times influenced the direction of the various spacesuit programs. This paper shows how the legacy of various spacesuit-related programs evolved in response to these forces. Important aspects of how this U.S. spacesuit legacy is being preserved today is described, including the archiving of spacesuit hardware, important documents, videos, oral history, and the rapidly expanding U.S. Spacesuit Knowledge Capture program.

  1. U.S. Spacesuit Legacy: Maintaining it for the Future

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; McMann, Joe; Thomas, Ken; Kosmo, Joe; Lewis, Cathleen; Wright, Rebecca; Bitterly, Rose; Oliva, Vladenka

    2012-01-01

    The history of US spacesuit development and use is rich with information on lessons learned, and constitutes a valuable legacy to those designing spacesuits for the future, as well as to educators, students, and the general public. The genesis of lessons learned is best understood by studying the evolution of past spacesuit programs and how the challenges and pressures of the times influenced the direction of the various spacesuit programs. This paper will show how the legacy of various programs evolved in response to these forces. Important aspects of how this rich U.S. spacesuit legacy is being preserved today will be described, including the archiving of spacesuit hardware, important documents, videos, oral history, and the rapidly expanding US Spacesuit Knowledge Capture program.

  2. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  3. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  4. AWARE: Adaptive Software Monitoring and Dynamic Reconfiguration for Critical Infrastructure Protection

    DTIC Science & Technology

    2015-04-29

    ... in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address ... collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model ... invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software ...

  5. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
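
    A parametric estimator of the general kind described above combines a size-driven baseline with multipliers derived from questionnaire-style answers. The sketch below is purely illustrative: the coefficients and multiplier values are invented for the example and are not the calibrated JPL/DSN model parameters.

```python
# Illustrative parametric estimator in the spirit described above. The coefficients
# and the multiplier values are invented for the example and are not the calibrated
# JPL/DSN model parameters: effort grows non-linearly with size and is scaled by
# factors derived from questionnaire-style answers about the task and environment.
def effort_person_months(ksloc, multipliers, a=2.8, b=1.1):
    """effort = a * size**b * product of environment/technology multipliers."""
    scale = 1.0
    for factor in multipliers.values():
        scale *= factor
    return a * (ksloc ** b) * scale

if __name__ == "__main__":
    answers = {
        "experienced_team": 0.85,   # hypothetical multiplier < 1 reduces effort
        "novel_domain": 1.20,       # hypothetical multiplier > 1 increases effort
        "tight_schedule": 1.15,
    }
    print(f"estimated effort: {effort_person_months(32, answers):.1f} person-months")
```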

  6. Packaging Software Assets for Reuse

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.; Marshall, J. J.; Downs, R. R.

    2010-12-01

    The reuse of existing software assets such as code, architecture, libraries, and modules in current software and systems development projects can provide many benefits, including reduced costs, in time and effort, and increased reliability. Many reusable assets are currently available in various online catalogs and repositories, usually broken down by disciplines such as programming language (Ibiblio for Maven/Java developers, PyPI for Python developers, CPAN for Perl developers, etc.). The way these assets are packaged for distribution can play a role in their reuse - an asset that is packaged simply and logically is typically easier to understand, install, and use, thereby increasing its reusability. A well-packaged asset has advantages in being more reusable and thus more likely to provide benefits through its reuse. This presentation will discuss various aspects of software asset packaging and how they can affect the reusability of the assets. The characteristics of well-packaged software will be described. A software packaging domain model will be introduced, and some existing packaging approaches examined. An example case study of a Reuse Enablement System (RES), currently being created by near-term Earth science decadal survey missions, will provide information about the use of the domain model. Awareness of these factors will help software developers package their reusable assets so that they can provide the most benefits for software reuse.
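
    As a minimal, generic illustration of "simple and logical" packaging for a Python asset (not tied to the RES or to any particular repository; all names are placeholders), a declarative setuptools script plus a conventional layout is often sufficient:

      # setup.py -- minimal packaging metadata for a hypothetical reusable asset.
      # Assumed layout:
      #   mytool/           package source
      #   tests/            unit tests
      #   README, LICENSE   documentation and terms of reuse
      from setuptools import find_packages, setup

      setup(
          name="mytool",                       # placeholder project name
          version="1.0.0",
          description="Reusable analysis module",
          packages=find_packages(exclude=["tests"]),
          install_requires=["numpy"],          # declare dependencies explicitly
          python_requires=">=3.8",
      )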

  7. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  8. Plug-In Tutor Agents: Still Pluggin'

    ERIC Educational Resources Information Center

    Ritter, Steven

    2016-01-01

    "An Architecture for Plug-in Tutor Agents" (Ritter and Koedinger 1996) proposed a software architecture designed around the idea that tutors could be built as plug-ins for existing software applications. Looking back on the paper now, we can see that certain assumptions about the future of software architecture did not come to be, making…

  9. An Ontology for Software Engineering Education

    ERIC Educational Resources Information Center

    Ling, Thong Chee; Jusoh, Yusmadi Yah; Abdullah, Rusli; Alwi, Nor Hayati

    2013-01-01

    Software agents communicate using ontologies. It is important to build an ontology for a specific domain such as Software Engineering Education. Building an ontology from scratch is not only hard, but also incurs much time and cost. This study aims to propose an ontology through adaptation of an existing ontology which was originally built based on a…

  10. An Exploratory Study of Software Cost Estimating at the Electronic Systems Division.

    DTIC Science & Technology

    1976-07-01

    action’. to improve the software cost Sestimating proces., While thin research was limited to the M.nD onvironment, the same types of problema may exist...Methods in Social Science. Now York: Random House, 1969. 57. Smith, Ronald L. Structured Programming Series (Vol. XI) - Estimating Software Project

  11. Investigating the long-term legacy of drought and warming on the soil microbial community across five European shrubland ecosystems.

    PubMed

    Rousk, Johannes; Smith, Andrew R; Jones, Davey L

    2013-12-01

    We investigated how the legacy of warming and summer drought affected microbial communities in five different replicated long-term (>10 years) field experiments across Europe (EU-FP7 INCREASE infrastructure). To focus explicitly on legacy effects (i.e., indirect rather than direct effects of the environmental factors), we measured microbial variables under the same moisture and temperature in a brief screening, and following a pre-incubation at stable conditions. Specifically, we investigated the size and composition of the soil microbial community (PLFA) alongside measurements of bacterial (leucine incorporation) and fungal (acetate in ergosterol incorporation) growth rates, previously shown to be highly responsive to changes in environmental factors, and microbial respiration. We found no legacy effects on the microbial community size, composition, growth rates, or basal respiration rates at the effect sizes used in our experimental setup (0.6 °C, about 30% precipitation reduction). Our findings support previous reports from single short-term ecosystem studies thereby providing a clear evidence base to allow long-term, broad-scale generalizations to be made. The implication of our study is that warming and summer drought will not result in legacy effects on the microbial community and their processes within the effect sizes here studied. While legacy effects on microbial processes during perturbation cycles, such as drying-rewetting, and on tolerance to drought and warming remain to be studied, our results suggest that any effects on overall ecosystem processes will be rather limited. Thus, the legacies of warming and drought should not be prioritized factors to consider when modeling contemporary rates of biogeochemical processes in soil. © 2013 John Wiley & Sons Ltd.

  12. Loss of ecosystem productivity with repeated drought: a multi-year experiment to assess the role of drought legacy effects

    NASA Astrophysics Data System (ADS)

    Smith, M. D.; Knapp, A.; Hoover, D. L.; Avolio, M. L.; Felton, A. J.; Slette, I.; Wilcox, K.

    2017-12-01

    Climate extremes, such as drought, are increasing in frequency and intensity, and the ecological consequences of these extreme events can be substantial and widespread. Yet, little is known about the factors that determine recovery of ecosystem function post-drought. Such knowledge is particularly important because post-drought recovery periods can be protracted depending on drought legacy effects (e.g., loss of key plant populations, altered community structure, and/or altered biogeochemical processes). These drought legacies may alter ecosystem function for many years post-drought and may impact future sensitivity to climate extremes. With forecasts of more frequent drought, there is an imperative to understand whether and how post-drought legacies will affect ecosystem response to future drought events. To address this knowledge gap, we experimentally imposed, over an eight-year period, two extreme growing-season droughts in a central US grassland, each two years in duration and followed by a two-year recovery period. We found that aboveground net primary productivity (ANPP) declined dramatically with the first drought and was accompanied by a large shift in plant species composition (loss of C3 forbs and an increase in C4 grasses). This drought legacy - a shift in plant composition - persisted two years post-drought. Yet, despite this legacy, ANPP recovered fully. We therefore expected that previously droughted grassland would be less sensitive to a second extreme drought due to the shift in plant composition. Contrary to this expectation, previously droughted grassland experienced a greater loss in ANPP than grassland that had not experienced drought. Furthermore, previously droughted grassland did not fully recover after the second drought. Thus, the legacy of drought - a shift in plant community composition - increased ecosystem sensitivity to a future extreme drought event.

  13. A new architecture for enterprise information systems.

    PubMed Central

    Covvey, H. D.; Stumpf, J. J.

    1999-01-01

    Irresistible economic and technical forces are forcing healthcare institutions to develop regionalized services such as consolidated or virtual laboratories. Technical realities, such as the lack of an enabling enterprise-level information technology (IT) integration infrastructure, the existence of legacy systems, and non-existent or embryonic enterprise-level IT services organizations, are delaying or frustrating the achievement of the desired configuration of shared services. On attempting to address this matter, we discover that the state-of-the-art in integration technology is not wholly adequate, and itself becomes a barrier to the full realization of shared healthcare services. In this paper we report new work from the field of Co-operative Information Systems that proposes a new architecture of systems that are intrinsically cooperation-enabled, and we extend this architecture to both the regional and national scales. PMID:10566455

  14. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    NASA Astrophysics Data System (ADS)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.
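
    Purely as an illustration of what such machine-readable annotation can look like (a Python/rdflib sketch with an invented vocabulary, not the authors' OWL-S descriptions), an experiment and its data can be described as RDF triples:

      from rdflib import Graph, Literal, Namespace, URIRef
      from rdflib.namespace import RDF

      EX = Namespace("http://example.org/annotation#")   # hypothetical ontology

      g = Graph()
      experiment = URIRef("http://example.org/experiments/sentiment-01")
      g.add((experiment, RDF.type, EX.SentimentAnalysis))
      g.add((experiment, EX.usesCorpus, Literal("case-histories")))
      g.add((experiment, EX.producesOutput, EX.PolarityScore))

      print(g.serialize(format="turtle"))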

  15. Version 1 of the Hubble Source Catalog

    DOE PAGES

    Whitmore, Bradley C.; Allam, Sahar S.; Budavari, Tamas; ...

    2016-05-11

    The Hubble Source Catalog is designed to help optimize science from the Hubble Space Telescope by combining the tens of thousands of visit-based source lists in the Hubble Legacy Archive into a single master catalog. Version 1 of the Hubble Source Catalog includes WFPC2, ACS/WFC, WFC3/UVIS, and WFC3/IR photometric data generated using SExtractor software to produce the individual source lists. The catalog includes roughly 80 million detections of 30 million objects involving 112 different detector/filter combinations, and about 160 thousand HST exposures. Source lists from Data Release 8 of the Hubble Legacy Archive are matched using an algorithm developed by Budavari & Lubow (2012). The mean photometric accuracy for the catalog as a whole is better than 0.10 mag, with relative accuracy as good as 0.02 mag in certain circumstances (e.g., bright isolated stars). The relative astrometric residuals are typically within 10 mas, with a value for the mode (i.e., most common value) of 2.3 mas. The absolute astrometric accuracy is better than ~0.1 arcsec for most sources, but can be much larger for a fraction of fields that could not be matched to the PanSTARRS, SDSS, or 2MASS reference systems. In this paper we describe the database design with emphasis on those aspects that enable the users to fully exploit the catalog while avoiding common misunderstandings and potential pitfalls. Here, we provide usage examples to illustrate some of the science capabilities and data quality characteristics, and briefly discuss plans for future improvements to the Hubble Source Catalog.

  16. Extending the International Space Station Life and Operability

    NASA Technical Reports Server (NTRS)

    Cecil, Andrew J.; Pitts, R. Lee; Sparks, Ray N.; Wickline, Thomas W.; Zoller, David A.

    2012-01-01

    The International Space Station (ISS) is in an operational configuration with final assembly complete. To fully utilize ISS and extend the operational life, it became necessary to upgrade and extend the onboard systems with the Obsolescence Driven Avionics Redesign (ODAR) project. ODAR enabled a joint project between the Johnson Space Center (JSC) and Marshall Space Flight Center (MSFC) focused on upgrading the onboard payload and Ku-Band systems, expanding the voice and video capabilities, and including more modern protocols allowing unprecedented access for payload investigators to their on-orbit payloads. The MSFC Huntsville Operations Support Center (HOSC) was tasked with developing a high-rate enhanced Functionally Distributed Processor (eFDP) to handle 300 Mbps Return Link data, double the legacy rate, and to incorporate a Line Outage Recorder (LOR). The eFDP also provides a 25 Mbps uplink transmission rate with a Space Link Extension (SLE) interface. HOSC also updated the Payload Data Services System (PDSS) to incorporate the latest Consultative Committee for Space Data Systems (CCSDS) protocols, most notably Internet Protocol (IP) encapsulation, in addition to the legacy capabilities. The Central Command Processor was also updated to interact with the new onboard and ground capabilities of Mission Control Center - Houston (MCC-H) for the uplink functionality. The architecture, implementation, and lessons learned, including integration and incorporation of Commercial Off The Shelf (COTS) hardware and software into the operational mission of the ISS, are described herein. The applicability of this new technology provides new benefits to ISS payload users and ensures better utilization of the ISS by the science community.
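
    The abstract names the CCSDS protocols without detail. As a small, generic sketch of the framing involved (the standard CCSDS Space Packet primary header, not PDSS or eFDP code), the six header octets can be unpacked as follows:

      import struct

      def parse_space_packet_header(packet: bytes) -> dict:
          """Decode the 6-byte CCSDS Space Packet primary header."""
          word1, word2, length = struct.unpack(">HHH", packet[:6])
          return {
              "version":        (word1 >> 13) & 0x7,
              "type":           (word1 >> 12) & 0x1,   # 0 = telemetry, 1 = telecommand
              "sec_hdr_flag":   (word1 >> 11) & 0x1,
              "apid":            word1 & 0x7FF,
              "sequence_flags": (word2 >> 14) & 0x3,
              "sequence_count":  word2 & 0x3FFF,
              "data_length":     length + 1,           # field stores (data octets - 1)
          }

      # Example: telemetry packet, APID 0x123, unsegmented, sequence count 7, 10 data octets.
      header = struct.pack(">HHH", 0x123, 0xC000 | 7, 9)
      print(parse_space_packet_header(header + b"\x00" * 10))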

  17. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  18. The Legacy Project Intervention to Enhance Meaningful Family Interactions: Case Examples

    PubMed Central

    ALLEN, REBECCA S.

    2009-01-01

    Manualized reminiscence and life review therapies are supported as an evidence-based, effective treatment for depression among older adults, but this therapeutic approach is usually individually administered and has rarely been applied in palliative care settings. We combined mutual reminiscence and life review with engagement in meaningful activity and examined the efficacy of this family-based dyadic intervention to decrease caregiving stress and increase family communication. Seventeen individuals living with chronic, life-limiting illnesses in the community and their family caregivers received three home visits with a master’s-level interventionist. During these sessions and through structured homework activities, the interventionist actively worked with the family to construct a personal Legacy Project, usually a scrapbook with photos, a cookbook, or audiotaped stories that celebrated the life of the ill individual. All participants in the intervention group initiated a Legacy Project and reported that Legacy activities improved family communication. Participation in Legacy creation also resulted in increased positive emotional experiences in patient and caregiver groups. These results are illustrated through careful examination of three case studies. PMID:20046967

  19. Pre-Columbian floristic legacies in modern homegardens of Central Amazonia.

    PubMed

    Lins, Juliana; Lima, Helena P; Baccaro, Fabricio B; Kinupp, Valdely F; Shepard, Glenn H; Clement, Charles R

    2015-01-01

    Historical ecologists have demonstrated legacy effects in apparently wild landscapes in Europe, North America, Mesoamerica, Amazonia, Africa and Oceania. People live and farm in archaeological sites today in many parts of the world, but nobody has looked for the legacies of past human occupations in the most dynamic areas in these sites: homegardens. Here we show that the useful flora of modern homegardens is partially a legacy of pre-Columbian occupations in Central Amazonia: the more complex the archaeological context, the more variable the floristic composition of useful native plants in homegardens cultivated there today. Species diversity was 10% higher in homegardens situated in multi-occupational archaeological contexts compared with homegardens situated in single-occupational ones. Species heterogeneity (β-diversity) among archaeological contexts was similar for the whole set of species, but markedly different when only native Amazonian species were included, suggesting the influence of pre-conquest indigenous occupations on current homegarden species composition. Our findings show that the legacy of pre-Columbian occupations is visible in the most dynamic of all agroecosystems, adding another dimension to the human footprint in the Amazonian landscape.

  20. The Caltech-NRAO Stripe 82 Survey (CNSS) Paper. I. The Pilot Radio Transient Survey in 50 Deg.(exp. 2)

    NASA Technical Reports Server (NTRS)

    Mooley, K. P.; Hallinan, G.; Bourke, S.; Horesh, A.; Myers, S. T.; Frail, D. A.; Kulkarni, S. R.; Levitan, D. B.; Kasliwal, M. M.; Cenko, S. B.; hide

    2016-01-01

    We have commenced a multiyear program, the Caltech-NRAO Stripe 82 Survey (CNSS), to search for radio transients with the Jansky VLA in the Sloan Digital Sky Survey Stripe 82 region. The CNSS will deliver five epochs over the entire approx. 270 deg.(exp. 2) of Stripe 82, an eventual deep combined map with an rms noise of approx. 40 μJy, and catalogs at a frequency of 3 GHz with a spatial resolution of 3 arcsec. This first paper presents the results from an initial pilot survey of a 50 deg.(exp. 2) region of Stripe 82, involving four epochs spanning logarithmic timescales between 1 week and 1.5 yr, with the combined map having a median rms noise of 35 μJy. This pilot survey enabled the development of the hardware and software for rapid data processing, as well as transient detection and follow-up, necessary for the full 270 deg.(exp. 2) survey. Data editing, calibration, imaging, source extraction, cataloging, and transient identification were completed in a semi-automated fashion within 6 hr of completion of each epoch of observations, using dedicated computational hardware at the NRAO in Socorro and custom-developed data reduction and transient detection pipelines. Classification of variable and transient sources relied heavily on the wealth of multiwavelength legacy survey data in the Stripe 82 region, supplemented by repeated mapping of the region by the Palomar Transient Factory. A total of 3.9(+0.5/-0.9)% of the few thousand detected point sources were found to vary by greater than 30%, consistent with similar studies at 1.4 and 5 GHz. Multiwavelength photometric data and light curves suggest that the variability is mostly due to shock-induced flaring in the jets of active galactic nuclei (AGNs). Although this was only a pilot survey, we detected two bona fide transients, associated with an RS CVn binary and a dKe star. Comparison with existing legacy survey data (FIRST, VLA-Stripe 82) revealed additional highly variable and transient sources on timescales between 5 and 20 yr, largely associated with renewed AGN activity. The rates of such AGNs possibly imply episodes of enhanced accretion and jet activity occurring once every approx. 40,000 yr in these galaxies. We compile the revised radio transient rates and make recommendations for future transient surveys and joint radio-optical experiments.

  1. Paralympic Legacy: Exploring the Impact of the Games on the Perceptions of Young People With Disabilities.

    PubMed

    Coates, Janine; Vickerman, Philip B

    2016-10-01

    The London 2012 Olympic and Paralympic Games aimed to deliver a legacy to citizens of the United Kingdom, which included inspiring a generation of young people to participate in sport. This study aimed to understand the legacy of the Paralympic Games for children with disabilities. Eight adolescents (11-16 yr) with physical disabilities were interviewed about their perceptions of the Paralympic Games. Thematic analysis found 3 key themes that further our understanding of the Paralympic legacy. These were Paralympians as role models, changing perceptions of disability, and the motivating nature of the Paralympics. Findings demonstrate that the Games were inspirational for children with disabilities, improving their self-perceptions. This is discussed in relation to previous literature, and core recommendations are made.

  2. Unisys' experience in software quality and productivity management of an existing system

    NASA Technical Reports Server (NTRS)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  3. Reading Diagnosis via the Microcomputer (The Printout).

    ERIC Educational Resources Information Center

    Weisberg, Renee; Balajthy, Ernest

    1989-01-01

    Examines and evaluates microcomputer software designed to assist in diagnosing students' reading abilities and making instructional decisions. Claims that existing software shows valuable potential when used sensibly and critically by trained reading clinicians. (MM)

  4. The HEASARC in 2013 and Beyond: NuSTAR, Astro-H, NICER..

    NASA Astrophysics Data System (ADS)

    Drake, Stephen A.; Smale, A. P.; McGlynn, T. A.; Arnaud, K. A.

    2013-04-01

    The High Energy Astrophysics Archival Research Center or HEASARC (http://heasarc.gsfc.nasa.gov/) is in its third decade as the NASA astrophysics discipline node supporting multi-mission cosmic X-ray and gamma-ray astronomy research. It provides a unified archive and software structure aimed both at 'legacy' missions such as Einstein, EXOSAT, ROSAT and RXTE, contemporary missions such as Fermi, Swift, Suzaku, Chandra, etc., and upcoming missions, such as NuSTAR, Astro-H and NICER. The HEASARC's high-energy astronomy archive has grown so that it presently contains 45 TB of data from 28 orbital missions. The HEASARC is the designated archive which supports NASA's Physics of the Cosmos theme (http://pcos.gsfc.nasa.gov/). We discuss some of the upcoming new initiatives and developments for the HEASARC, including the arrival of public data from the hard X-ray imaging NuSTAR mission in the summer of 2013, and the ongoing preparations to support the JAXA/NASA Astro-H mission and the NASA MoO Neutron Star Interior Composition Explorer (NICER), which are expected to become operational in 2015-2016. We also highlight some of the new software capabilities of the HEASARC, such as Xamin, a next-generation archive interface which will eventually supersede Browse, and the latest update of XSPEC (v 12.8.0).

  5. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
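
    As a minimal sketch of the package in use (a toy three-reaction network rather than a published genome-scale model; attribute names such as objective_value can differ slightly between COBRApy releases):

      from cobra import Metabolite, Model, Reaction

      model = Model("toy")
      a = Metabolite("A_c", compartment="c")
      b = Metabolite("B_c", compartment="c")

      uptake = Reaction("EX_A")              # source:      -> A
      uptake.add_metabolites({a: 1.0})
      uptake.upper_bound = 10.0              # limit uptake to 10 flux units

      convert = Reaction("R1")               # conversion: A -> B
      convert.add_metabolites({a: -1.0, b: 1.0})

      secrete = Reaction("EX_B")             # sink:        B ->
      secrete.add_metabolites({b: -1.0})

      model.add_reactions([uptake, convert, secrete])
      model.objective = "R1"

      solution = model.optimize()            # flux balance analysis
      print(solution.status, solution.objective_value)   # expect: optimal 10.0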

  6. Search and Determine Integrated Environment (SADIE)

    NASA Astrophysics Data System (ADS)

    Sabol, C.; Schumacher, P.; Segerman, A.; Coffey, S.; Hoskins, A.

    2012-09-01

    A new and integrated high performance computing software applications package called the Search and Determine Integrated Environment (SADIE) is being jointly developed and refined by the Air Force and Naval Research Laboratories (AFRL and NRL) to automatically resolve uncorrelated tracks (UCTs) and build a more complete space object catalog for improved Space Situational Awareness (SSA). The motivation for SADIE is to respond to very challenging needs identified and guidance received from Air Force Space Command (AFSPC) and other senior leaders to develop this technology to support the evolving Joint Space Operations Center (JSpOC) and Alternate Space Control Center (ASC2)-Dahlgren. The JSpOC and JMS SSA mission requirements and threads flow down from the United States Strategic Command (USSTRATCOM). The SADIE suite includes modification and integration of legacy applications and software components that include Search And Determine (SAD), Satellite Identification (SID), and Parallel Catalog (Parcat), as well as other utilities and scripts to enable end-to-end catalog building and maintenance in a parallel processing environment. SADIE is being developed to handle large catalog building challenges in all orbit regimes and includes the automatic processing of radar, fence, and optical data. Real data results are provided for the processing of Air Force Space Surveillance System fence observations and for the processing of Space Surveillance Telescope optical data.

  7. AthenaMT: upgrading the ATLAS software framework for the many-core world with multi-threading

    NASA Astrophysics Data System (ADS)

    Leggett, Charles; Baines, John; Bold, Tomasz; Calafiura, Paolo; Farrell, Steven; van Gemmeren, Peter; Malon, David; Ritsch, Elmar; Stewart, Graeme; Snyder, Scott; Tsulaia, Vakhtang; Wynne, Benjamin; ATLAS Collaboration

    2017-10-01

    ATLAS’s current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognized for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. After concluding a rigorous requirements phase, where many design components were examined in detail, ATLAS has begun the migration to a new data-flow driven, multi-threaded framework, which enables the simultaneous processing of singleton, thread unsafe legacy Algorithms, cloned Algorithms that execute concurrently in their own threads with different Event contexts, and fully re-entrant, thread safe Algorithms. In this paper we report on the process of modifying the framework to safely process multiple concurrent events in different threads, which entails significant changes in the underlying handling of features such as event and time dependent data, asynchronous callbacks, metadata, integration with the online High Level Trigger for partial processing in certain regions of interest, concurrent I/O, as well as ensuring thread safety of core services. We also report on upgrading the framework to handle Algorithms that are fully re-entrant.
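
    The framework itself is C++; the toy Python sketch below (all class names invented, not ATLAS code) only illustrates the two scheduling strategies the abstract contrasts: a thread-unsafe legacy algorithm cloned once per worker thread versus a stateless, re-entrant algorithm shared by all threads.

      import threading
      from concurrent.futures import ThreadPoolExecutor

      class LegacyAlgorithm:
          """Thread-unsafe: keeps per-event state, so each worker needs its own clone."""
          def __init__(self):
              self.scratch = None
          def execute(self, event):
              self.scratch = event * 2       # mutable member -> not re-entrant
              return self.scratch

      class ReentrantAlgorithm:
          """Stateless: one shared instance can process many events concurrently."""
          def execute(self, event):
              return event * 2               # no mutable members touched

      clones = threading.local()             # one LegacyAlgorithm clone per thread
      shared = ReentrantAlgorithm()

      def process(event):
          if not hasattr(clones, "legacy"):
              clones.legacy = LegacyAlgorithm()
          return clones.legacy.execute(event) + shared.execute(event)

      with ThreadPoolExecutor(max_workers=4) as pool:
          print(list(pool.map(process, range(8))))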

  8. An MRI Von Economo - Koskinas atlas.

    PubMed

    Scholtens, Lianne H; de Reus, Marcel A; de Lange, Siemon C; Schmidt, Ruben; van den Heuvel, Martijn P

    2018-04-15

    The cerebral cortex displays substantial variation in cellular architecture, a regional patterning that has been of great interest to anatomists for centuries. In 1925, Constantin von Economo and George Koskinas published a detailed atlas of the human cerebral cortex, describing a cytoarchitectonic division of the cortical mantle into over 40 distinct areas. Von Economo and Koskinas accompanied their seminal work with large photomicrographic plates of their histological slides, together with tables containing for each described region detailed morphological layer-specific information on neuronal count, neuron size and thickness of the cortical mantle. Here, we aimed to make this legacy data accessible and relatable to in vivo neuroimaging data by constructing a digital Von Economo - Koskinas atlas compatible with the widely used FreeSurfer software suite. In this technical note we describe the procedures used for manual segmentation of the Von Economo - Koskinas atlas onto individual T1 scans and the subsequent construction of the digital atlas. We provide the files needed to run the atlas on new FreeSurfer data, together with some simple code of how to apply the atlas to T1 scans within the FreeSurfer software suite. The digital Von Economo - Koskinas atlas is easily applicable to modern day anatomical MRI data and is made publicly available online. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  10. Four simple recommendations to encourage best practices in research software

    PubMed Central

    Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve

    2017-01-01

    Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965

  11. Four simple recommendations to encourage best practices in research software.

    PubMed

    Jiménez, Rafael C; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S; Crouch, Steve

    2017-01-01

    Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

  12. An SAS Macro for Implementing the Modified Bollen-Stine Bootstrap for Missing Data: Implementing the Bootstrap Using Existing Structural Equation Modeling Software

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2005-01-01

    The Bollen-Stine bootstrap can be used to correct for standard error and fit statistic bias that occurs in structural equation modeling (SEM) applications due to nonnormal data. The purpose of this article is to demonstrate the use of a custom SAS macro program that can be used to implement the Bollen-Stine bootstrap with existing SEM software.…
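
    The macro itself is written in SAS; the NumPy fragment below only sketches the core Bollen-Stine step, rotating the data so its covariance equals the model-implied covariance before resampling, with fit_fn standing in for whatever SEM routine returns the fit statistic (the function names are illustrative, not the macro's):

      import numpy as np

      def sym_sqrt(m, inverse=False):
          """Symmetric (inverse) square root of a positive-definite matrix."""
          vals, vecs = np.linalg.eigh(m)
          power = -0.5 if inverse else 0.5
          return vecs @ np.diag(vals ** power) @ vecs.T

      def bollen_stine_transform(y, sigma_model):
          """Rotate the data so its covariance matches the model-implied covariance."""
          y = np.asarray(y, dtype=float)
          centered = y - y.mean(axis=0)
          s = np.cov(y, rowvar=False)        # sample covariance
          z = centered @ sym_sqrt(s, inverse=True) @ sym_sqrt(sigma_model)
          return z + y.mean(axis=0)

      def bootstrap_fit_stats(y, sigma_model, fit_fn, n_boot=500, seed=0):
          """Refit bootstrap samples (rows drawn with replacement) from the transformed data."""
          rng = np.random.default_rng(seed)
          z = bollen_stine_transform(y, sigma_model)
          n = z.shape[0]
          return np.array([fit_fn(z[rng.integers(0, n, n)]) for _ in range(n_boot)])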

  13. Pancreatic Exocrine Tumors

    MedlinePlus

  14. Selection of software for mechanical engineering undergraduates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheah, C. T.; Yin, C. S.; Halim, T.

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g., solvers for simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  15. Fault-tolerant software - Experiment with the sift operating system. [Software Implemented Fault Tolerance computer

    NASA Technical Reports Server (NTRS)

    Brunelle, J. E.; Eckhardt, D. E., Jr.

    1985-01-01

    Results are presented of an experiment conducted in the NASA Avionics Integrated Research Laboratory (AIRLAB) to investigate the implementation of fault-tolerant software techniques on fault-tolerant computer architectures, in particular the Software Implemented Fault Tolerance (SIFT) computer. The N-version programming and recovery block techniques were implemented on a portion of the SIFT operating system. The results indicate that, to effectively implement fault-tolerant software design techniques, system requirements will be impacted and suggest that retrofitting fault-tolerant software on existing designs will be inefficient and may require system modification.
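
    For readers unfamiliar with the two techniques, the generic Python sketch below (unrelated to the SIFT implementation) shows N-version majority voting and a recovery block guarded by an acceptance test:

      from collections import Counter

      def n_version_vote(versions, x):
          """Run independently developed versions and return the majority answer."""
          results = [v(x) for v in versions]
          answer, count = Counter(results).most_common(1)[0]
          if count <= len(results) // 2:
              raise RuntimeError("no majority agreement")
          return answer

      def recovery_block(alternates, acceptance_test, x):
          """Try alternates in order until one passes the acceptance test."""
          for alt in alternates:
              result = alt(x)
              if acceptance_test(x, result):
                  return result
          raise RuntimeError("all alternates rejected")

      # Toy example: three 'versions' of integer square root, one of them faulty.
      versions = [lambda n: int(n ** 0.5), lambda n: int(n ** 0.5), lambda n: n // 2]
      print(n_version_vote(versions, 16))                                          # -> 4
      print(recovery_block(versions, lambda n, r: r * r <= n < (r + 1) ** 2, 16))  # -> 4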

  16. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.

  17. ESO Demonstration Project with the NRAO 12-m Antenna

    NASA Astrophysics Data System (ADS)

    Heald, R.; Karban, R.

    2000-03-01

    During the months of September through November 1999, an ALMA joint demonstration project between the European Southern Observatory (ESO) and the National Radio Astronomy Observatory (NRAO) was carried out in Socorro, New Mexico. During this period, Robert Karban (ESO) and Ron Heald (NRAO) worked together on the ESO Demonstration Project. The project integrated ESO software and existing NRAO software (a prototype for the future ALMA control software) to control the motion of the Kitt Peak 12-m antenna. ESO software from the VLT provided the operator interface and coordinate transformation software, while Pat Wallace's TPOINT provided the pointing-model software.

  18. Co Modeling and Co Synthesis of Safety Critical Multi threaded Embedded Software for Multi Core Embedded Platforms

    DTIC Science & Technology

    2017-03-20

    computation, Prime Implicates, Boolean Abstraction, real-time embedded software, software synthesis, correct-by-construction software design, model...types for time-dependent data-flow networks". J.-P. Talpin, P. Jouvelot, S. Shukla. ACM-IEEE Conference on Methods and Models for System Design...

  19. Experiments in fault tolerant software reliability

    NASA Technical Reports Server (NTRS)

    Mcallister, David F.; Tai, K. C.; Vouk, Mladen A.

    1987-01-01

    The reliability of voting was evaluated in a fault-tolerant software system for small output spaces. The effectiveness of the back-to-back testing process was investigated. Version 3.0 of the RSDIMU-ATS, a semi-automated test bed for certification testing of RSDIMU software, was prepared and distributed. Software reliability estimation methods based on non-random sampling are being studied. The investigation of existing fault-tolerance models was continued and formulation of new models was initiated.
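
    Back-to-back testing runs independently developed versions on the same inputs and flags any disagreement for later inspection; a minimal generic sketch follows (not the RSDIMU-ATS itself):

      import random

      def back_to_back(version_a, version_b, input_generator, n_cases=1000, seed=1):
          """Run both versions on identical random inputs and collect disagreements."""
          rng = random.Random(seed)
          disagreements = []
          for _ in range(n_cases):
              x = input_generator(rng)
              ya, yb = version_a(x), version_b(x)
              if ya != yb:
                  disagreements.append((x, ya, yb))
          return disagreements

      # Toy example: two roundings that disagree on negative inputs.
      diffs = back_to_back(lambda x: round(x), lambda x: int(x + 0.5),
                           lambda rng: rng.uniform(-10, 10))
      print(len(diffs), "disagreements found")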

  20. DSN Resource Scheduling

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Baldwin, John

    2007-01-01

    TIGRAS is client-side software, which provides tracking-station equipment planning, allocation, and scheduling services to the DSMS (Deep Space Mission System). TIGRAS provides functions for schedulers to coordinate the DSN (Deep Space Network) antenna usage time and to resolve the resource usage conflicts among tracking passes, antenna calibrations, maintenance, and system testing activities. TIGRAS provides a fully integrated multi-pane graphical user interface for all scheduling operations. This is a great improvement over the legacy VAX VMS command line user interface. TIGRAS has the capability to handle all DSN resource scheduling aspects from long-range to real time. TIGRAS assists NASA mission operations for DSN tracking of station equipment resource request processes from long-range load forecasts (ten years or longer), to midrange, short-range, and real-time (less than one week) emergency tracking plan changes. TIGRAS can be operated by NASA mission operations worldwide to make schedule requests for the DSN station equipment.
