Science.gov

Sample records for advanced object-based software

  1. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    SciTech Connect

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to the market pressures that have motivated a multilevel supply chain structure in other widget industries: recovering design costs, improving quality through specialization, and enabling rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors adopting emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed

  2. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  3. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    SciTech Connect

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.
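
    The architectural pattern described above, mutable domain objects whose behaviors are supplied by separately registered process models, can be illustrated with a short sketch. This is not DIAS code; all class, attribute, and process names below are invented for illustration.

```python
# Minimal sketch (invented names, not DIAS code) of the pattern the
# abstract describes: domain objects hold mutable contents, and behavior
# comes from separately registered process models rather than from
# methods hard-coded on the objects.

class DomainObject:
    """A simulation entity whose contents can change over time."""
    def __init__(self, name, **contents):
        self.name = name
        self.contents = dict(contents)

class ProcessModel:
    """Encapsulated model functionality applied to matching objects."""
    def applies_to(self, obj):
        raise NotImplementedError
    def step(self, obj, dt):
        raise NotImplementedError

class Evaporation(ProcessModel):
    def applies_to(self, obj):
        return "water_volume" in obj.contents
    def step(self, obj, dt):
        obj.contents["water_volume"] *= 1.0 - 0.001 * dt

def simulate(objects, processes, dt, n_steps):
    for _ in range(n_steps):
        for proc in processes:
            for obj in objects:
                if proc.applies_to(obj):
                    proc.step(obj, dt)

lake = DomainObject("lake", water_volume=1.0e6)
simulate([lake], [Evaporation()], dt=1.0, n_steps=10)
print(lake.contents["water_volume"])
```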

  4. Automatic extraction of initial moving object based on advanced feature and video analysis

    NASA Astrophysics Data System (ADS)

    Liu, Mao-Ying; Dai, Qiong-Hai; Liu, Xiao-Dong; Er, Gui-Hua

    2005-07-01

    Traditionally, video segmentation extracts objects using low-level features such as color, texture, edge, motion, and optical flow. This paper proposes that the connectivity of object motion is an advanced feature of a video moving object, because it reflects the semantic meaning of the object to some extent, and that it can be fully represented on a cumulated difference image, the combination of a number of interframe difference images. Based on this principle, a novel system is designed to extract the initial moving object automatically. The system includes three key innovations: 1) The system operates on the cumulated difference image, which makes the object more prominent against background noise. Object extraction is based on the connectivity of object motion, which guarantees the integrity of the extracted object while eliminating large background regions that conventional change detection methods cannot remove, such as intense-noise regions and shadow regions that are not tightly connected to the object. 2) Video sequence analysis is performed ahead of video segmentation, so that object extraction methods are chosen to suit the characteristics of the background noise and object motion. 3) An adaptive threshold is automatically determined on the cumulated difference image after acute noise is removed. The threshold determined this way is more reasonable: with it, most noise can be eliminated while small-motion regions of the object are preserved. Results show that this system can extract objects from different kinds of sequences automatically, promptly, and properly, making it well suited for real-time video applications.
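
    A short sketch of the pipeline the abstract describes may help: accumulate interframe differences into a cumulated difference image, threshold it adaptively, and keep the largest connected motion region. The synthetic frames and the mean-plus-k-sigma threshold rule below are illustrative assumptions, not the paper's exact algorithm.

```python
# Toy sketch: cumulated difference image + adaptive threshold + motion
# connectivity (largest connected component), per the abstract's outline.
import numpy as np
from scipy import ndimage

def extract_initial_object(frames, k=2.0):
    diffs = [np.abs(frames[i + 1].astype(float) - frames[i].astype(float))
             for i in range(len(frames) - 1)]
    cumulated = np.sum(diffs, axis=0)                # cumulated difference image
    thresh = cumulated.mean() + k * cumulated.std()  # assumed adaptive rule
    mask = cumulated > thresh
    labels, n = ndimage.label(mask)                  # connected motion regions
    if n == 0:
        return mask
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)          # largest connected region

rng = np.random.default_rng(0)
frames = [rng.integers(0, 20, (64, 64)) for _ in range(6)]
for f in frames[2:]:                                 # an object appears mid-sequence
    f[20:30, 20:30] += 100
print(extract_initial_object(frames).sum())          # pixels kept as "object"
```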

  5. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of information can create software problems related to the huge amount of data available. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potential of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered and 10 different regions of interest were selected. Two approaches were followed: the first (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier, while the second (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Four classes were considered using Spring and seven classes using OTB/Monteverdi. The SVM classifier produces slightly better results and presents a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.
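
    A hedged sketch of the OBIA workflow being compared (segmentation, per-region features, SVM classification) follows. SLIC stands in here for OTB's mean-shift segmentation, and the tiny synthetic image and class labels are invented for illustration.

```python
# OBIA sketch: segment, compute per-region features, classify with an SVM.
# SLIC is a stand-in for mean-shift; image and labels are synthetic.
import numpy as np
from skimage.segmentation import slic
from sklearn.svm import SVC

rng = np.random.default_rng(1)
image = rng.random((60, 60, 3))
image[:30, :, 0] += 0.5                      # make the top half redder
image = np.clip(image, 0.0, 1.0)

segments = slic(image, n_segments=25, compactness=10.0)
region_ids = np.unique(segments)
features = np.array([image[segments == r].mean(axis=0) for r in region_ids])

# Train on a few hand-labeled regions of interest, classify the rest.
train_idx = [0, 1, len(region_ids) - 2, len(region_ids) - 1]
train_labels = [1, 1, 0, 0]                  # 1 = "building", 0 = "vegetation"
clf = SVC(kernel="rbf").fit(features[train_idx], train_labels)
print(clf.predict(features))                 # predicted class per region
```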

  6. Advanced Extravehicular Mobility Unit Informatics Software Design

    NASA Technical Reports Server (NTRS)

    Wright, Theodore

    2014-01-01

    This is a description of the software design for the 2013 edition of the Advanced Extravehicular Mobility Unit (AEMU) Informatics computer assembly. The Informatics system is an optional part of the space suit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and caution and warning information. In the future it will display maps with GPS position data, and video and still images captured by the astronaut.

  7. The advanced software development workstation project

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE), with the emphasis on those advanced methods, tools, and processes that will be of benefit to all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS), which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phases of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge-based expert assistance.

  8. Control Software for Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Book, Michael L.; Bryan, Thomas C.

    2006-01-01

    Embedded software has been developed specifically for controlling an Advanced Video Guidance Sensor (AVGS). A Video Guidance Sensor is an optoelectronic system that provides guidance for automated docking of two vehicles. Such a system includes pulsed laser diodes and a video camera, the output of which is digitized. From the positions of digitized target images and known geometric relationships, the relative position and orientation of the vehicles are computed. The present software consists of two subprograms running in two processors that are parts of the AVGS. The subprogram in the first processor receives commands from an external source, checks the commands for correctness, performs commanded non-image-data-processing control functions, and sends the image-data-processing parts of commands to the second processor. The subprogram in the second processor processes image data as commanded. Upon power-up, the software performs basic tests of functionality, then effects a transition to a standby mode. When a command is received, the software goes into one of several operational modes (e.g., acquisition or tracking). The software then returns to the external source the data appropriate to the command.
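
    A toy sketch of the two-subprogram split described above: one processor validates commands and handles non-image control, forwarding image-processing commands to the second. The command names and the queue-based inter-processor link are illustrative assumptions, not the AVGS flight code.

```python
# Sketch of the AVGS-style command split: processor one validates and
# dispatches; processor two does the image-data processing.
from queue import Queue

VALID_COMMANDS = {"STANDBY", "ACQUIRE", "TRACK", "GET_STATUS"}
IMAGE_COMMANDS = {"ACQUIRE", "TRACK"}

def processor_one(command, to_processor_two, state):
    if command not in VALID_COMMANDS:
        return {"error": f"bad command {command!r}"}
    if command in IMAGE_COMMANDS:
        to_processor_two.put(command)        # hand off image-data processing
        state["mode"] = command.lower()
    elif command == "STANDBY":
        state["mode"] = "standby"
    return {"mode": state["mode"]}

def processor_two(from_processor_one):
    cmd = from_processor_one.get()
    # Stand-in for locating target images and computing relative pose.
    return {"cmd": cmd, "relative_pose": (0.0, 0.0, 10.0)}

link, state = Queue(), {"mode": "standby"}   # power-up: self-test, then standby
print(processor_one("ACQUIRE", link, state))
print(processor_two(link))
print(processor_one("GET_STATUS", link, state))
```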

  9. Software Framework for Advanced Power Plant Simulations

    SciTech Connect

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.
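
    The reduced-order-model idea can be sketched in a few lines: run a handful of expensive CFD cases offline, fit a cheap surrogate to the results, and call the surrogate from the flowsheet. The "CFD" stand-in function and variable names below are illustrative assumptions, not APECS interfaces.

```python
# ROM sketch: train a cheap surrogate on a few expensive CFD results,
# then evaluate it directly inside flowsheet iterations.
import numpy as np

def expensive_cfd_case(fuel_flow):
    # Placeholder for a full CFD run of, e.g., a combustor.
    return 900.0 + 150.0 * np.tanh(0.8 * (fuel_flow - 2.0))

train_x = np.linspace(1.0, 3.0, 7)                 # a few training runs
train_y = np.array([expensive_cfd_case(x) for x in train_x])

coeffs = np.polyfit(train_x, train_y, deg=3)       # the "trained" ROM
rom = np.poly1d(coeffs)

# Inside a flowsheet iteration, the ROM answers almost instantly:
print(rom(2.4), expensive_cfd_case(2.4))           # surrogate vs. "CFD"
```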

  10. Integrated review software advances at Los Alamos

    SciTech Connect

    Klosterbuer, S. F.; Michel, K. D.; Betts, S. E.; Determan, J. C.; Longo, J. F.; Parker, R. F.; Pelowitz, D. G.; Rothrock, R. B.; Schneider, C. M.; Nordquist, H. M.

    2004-01-01

    Since 1988, Los Alamos National Laboratory (LANL) has been developing software for unattended monitoring systems. These systems are composed of three categories of software: acquisition, collection, and review. The data acquisition software is contained in modular instrumentation distributed throughout facilities to continuously acquire data from devices ranging from radiation detectors to cameras to binary switches. The data collection software runs on computers connected to the instruments and offloads and stores the acquired data. The review software enables the end user to quickly and easily examine the data collected from these different systems and compare the results to declared operator activities. This paper addresses the review software. The original standalone review software processed only radiation data. This software was expanded to include new programs (tools) to display and correlate video and operator declarations, and added an interface to the standard neutron coincidence counter analysis program. This expanded review software containing multiple review tools is referred to collectively as the Integrated Review Software (IRS). The IRS continues to expand and evolve. Two primary IRS developments will be described in this paper. First, the IRS was expanded to include review tools to display and analyze new data types. Position Review was developed to display Global Positioning System (GPS) location data to aid in tracking radiation movements. Isotopic Review is being developed to provide a link to the standard gamma isotopic analysis software. In addition, significant enhancements are being added to the existing review tools such as Operator Review, Radiation Review, and Digital Video Review. A second IRS development is to produce standardized components with published interfaces, enabling other parties to produce custom components that plug into the review software. It is anticipated that there will be four primary types of components that could be
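
    The "standardized components with published interfaces" concept might look like the following sketch, in which review tools implement one small interface and register by name so that third parties can plug in support for new data types. All class and method names are hypothetical, not the IRS design.

```python
# Plug-in sketch: review tools implement one published interface and are
# looked up from a registry, so new data types can be added externally.
class ReviewTool:
    """Published interface every plug-in review tool must implement."""
    name = "base"
    def can_display(self, record):
        raise NotImplementedError
    def display(self, record):
        raise NotImplementedError

class PositionReview(ReviewTool):
    name = "position"
    def can_display(self, record):
        return "gps" in record
    def display(self, record):
        print("GPS track:", record["gps"])

REGISTRY = {}
def register(tool):
    REGISTRY[tool.name] = tool

register(PositionReview())
record = {"gps": [(35.88, -106.30), (35.89, -106.31)]}
for tool in REGISTRY.values():
    if tool.can_display(record):
        tool.display(record)
```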

  11. Advances in knowledge-based software engineering

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.

  12. Advanced software development workstation project: Engineering scripting language. Graphical editor

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. The cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software parts composition system.

  13. Advancing Software Architecture Modeling for Large Scale Heterogeneous Systems

    SciTech Connect

    Gorton, Ian; Liu, Yan

    2010-11-07

    In this paper we describe how incorporating technology-specific modeling at the architecture level can help reduce risks and produce better designs for large, heterogeneous software applications. We draw an analogy with established modeling approaches in scientific domains, using groundwater modeling as an example, to help illustrate gaps in current software architecture modeling approaches. We then describe the advances in modeling, analysis and tooling that are required to bring sophisticated modeling and development methods within reach of software architects.

  14. Advanced information processing system: Input/output network management software

    NASA Technical Reports Server (NTRS)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section briefly outlines the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  15. Byonic: Advanced Peptide and Protein Identification Software

    PubMed Central

    Bern, Marshall; Kil, Yong J.; Becker, Christopher

    2013-01-01

    Byonic™ is the name of a software package for peptide and protein identification by tandem mass spectrometry. This software, which has only recently become commercially available, facilitates a much wider range of search possibilities than previous search software such as SEQUEST and Mascot. Byonic allows the user to define an essentially unlimited number of variable modification types. Byonic also allows the user to set a separate limit on the number of occurrences of each modification type, so that a search may consider only one or two chance modifications such as oxidations and deamidations per peptide, yet allow three or four biological modifications such as phosphorylations, which tend to cluster together. Hence Byonic can search for tens or even hundreds of modification types simultaneously without a prohibitively large combinatorial explosion. Byonic's Wildcard Search™ allows the user to search for unanticipated or even unknown modifications alongside known modifications. Finally, Byonic's Glycopeptide Search allows the user to identify glycopeptides without prior knowledge of glycan masses or glycosylation sites. PMID:23255153
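
    The per-modification occurrence caps described above bound the search space multiplicatively. The sketch below only illustrates that counting idea with assumed masses and caps; it is not Byonic's search algorithm.

```python
# Toy illustration: each modification type has its own per-peptide cap,
# so the number of candidate combinations stays bounded.
from itertools import product

# (name, mass delta in Da, max occurrences per peptide) -- assumed values
MOD_TYPES = [("oxidation", 15.9949, 1),
             ("deamidation", 0.9840, 1),
             ("phospho", 79.9663, 3)]

def candidate_mass_shifts():
    """Enumerate every allowed combination of modification counts."""
    counts = product(*(range(cap + 1) for _, _, cap in MOD_TYPES))
    for combo in counts:
        shift = sum(n * mass for n, (_, mass, _) in zip(combo, MOD_TYPES))
        yield combo, round(shift, 4)

shifts = list(candidate_mass_shifts())
print(len(shifts), "candidate combinations")   # (1+1)*(1+1)*(3+1) = 16
print(shifts[:4])
```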

  16. Object-based media and stream-based computing

    NASA Astrophysics Data System (ADS)

    Bove, V. Michael, Jr.

    1998-03-01

    Object-based media refers to the representation of audiovisual information as a collection of objects - the result of scene-analysis algorithms - and a script describing how they are to be rendered for display. Such multimedia presentations can adapt to viewing circumstances as well as to viewer preferences and behavior, and can provide a richer link between content creator and consumer. With faster networks and processors, such ideas become applicable to live interpersonal communications as well, creating a more natural and productive alternative to traditional videoconferencing. This paper outlines examples of object-based media algorithms and applications developed by my group, and presents new hardware architectures and software methods that we have developed to meet the computational requirements of object-based and other advanced media representations. In particular, we describe stream-based processing, which enables automatic run-time parallelization of multidimensional signal processing tasks even given heterogeneous computational resources.

  17. Advanced Transport Operating System (ATOPS) utility library software description

    NASA Technical Reports Server (NTRS)

    Clinedinst, Winston C.; Slominski, Christopher J.; Dickson, Richard W.; Wolverton, David A.

    1993-01-01

    The individual software processes used in the flight computers on-board the Advanced Transport Operating System (ATOPS) aircraft have many common functional elements. A library of commonly used software modules was created for general uses among the processes. The library includes modules for mathematical computations, data formatting, system database interfacing, and condition handling. The modules available in the library and their associated calling requirements are described.

  18. Advanced transport operating system software upgrade: Flight management/flight controls software description

    NASA Technical Reports Server (NTRS)

    Clinedinst, Winston C.; Debure, Kelly R.; Dickson, Richard W.; Heaphy, William J.; Parks, Mark A.; Slominski, Christopher J.; Wolverton, David A.

    1988-01-01

    The Flight Management/Flight Controls (FM/FC) software for the Norden 2 (PDP-11/70M) computer installed on the NASA 737 aircraft is described. The software computes the navigation position estimates, guidance commands, and the commands issued to the control surfaces to direct the aircraft in flight, based on the modes selected on the Advanced Guidance and Control System (AGCS) mode panel and the flight path selected via the Navigation Control/Display Unit (NCDU).

  19. Advances in Games Technology: Software, Models, and Intelligence

    ERIC Educational Resources Information Center

    Prakash, Edmond; Brindle, Geoff; Jones, Kevin; Zhou, Suiping; Chaudhari, Narendra S.; Wong, Kok-Wai

    2009-01-01

    Games technology has undergone tremendous development. In this article, the authors report the rapid advancement that has been observed in the way games software is being developed, as well as in the development of games content using game engines. One area that has gained special attention is modeling the game environment such as terrain and…

  20. Advanced software development workstation. OOPSLA 1992 Conference. Trip report

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    This report summarizes the main trends observed at the Object Oriented Programming Systems, Languages, and Applications Conference held in Vancouver, British Columbia. This conference is the main object-oriented event that allows us to assess the dynamism of the technology and to meet the main actors of the field. It is an invaluable source of information for the advanced software development project.

  1. Advanced Spacesuit Informatics Software Design for Power, Avionics and Software Version 2.0

    NASA Technical Reports Server (NTRS)

    Wright, Theodore W.

    2016-01-01

    A description of the software design for the 2016 edition of the Informatics computer assembly of NASA's Advanced Extravehicular Mobility Unit (AEMU), also called the Advanced Spacesuit. The Informatics system is an optional part of the spacesuit assembly. It adds a graphical interface for displaying suit status, timelines, procedures, and warning information. It also provides an interface to the suit-mounted camera for recording still images, video, and audio field notes.

  2. Software for the ACP (Advanced Computer Program) multiprocessor system

    SciTech Connect

    Biel, J.; Areti, H.; Atac, R.; Cook, A.; Fischler, M.; Gaines, I.; Kaliher, C.; Hance, R.; Husby, D.; Nash, T.

    1987-02-02

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system.
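
    The host/node pattern described above can be sketched with Python's multiprocessing standing in for the ACP subroutine library: the host farms event data out to identical node programs and collects the results. Function and variable names are illustrative assumptions.

```python
# Host/node sketch: the host sends data to many identical "node
# processors" and gathers their results, as the ACP subroutines allowed.
from multiprocessing import Pool

def node_program(event):
    # Each "node processor" runs the same reconstruction code on one event.
    return sum(x * x for x in event)

if __name__ == "__main__":
    events = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]   # host-side event data
    with Pool(processes=3) as pool:              # one worker per "node"
        results = pool.map(node_program, events) # send data, get results
    print(results)
```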

  3. Advanced Transport Operating System (ATOPS) control display unit software description

    NASA Technical Reports Server (NTRS)

    Slominski, Christopher J.; Parks, Mark A.; Debure, Kelly R.; Heaphy, William J.

    1992-01-01

    The software created for the Control Display Units (CDUs), used for the Advanced Transport Operating Systems (ATOPS) project, on the Transport Systems Research Vehicle (TSRV) is described. Module descriptions are presented in a standardized format which contains module purpose, calling sequence, a detailed description, and global references. The global reference section includes subroutines, functions, and common variables referenced by a particular module. The CDUs, one for the pilot and one for the copilot, are used for flight management purposes. Operations performed with the CDU affect the aircraft's guidance, navigation, and display software.

  4. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Damevski, Kostadin

    2009-03-30

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit the unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  5. Software control of the Advanced Technology Solar Telescope enclosure PLC hardware using COTS software

    NASA Astrophysics Data System (ADS)

    Borrowman, Alastair J.; de Bilbao, Lander; Ariño, Javier; Murga, Gaizka; Goodrich, Bret; Hubbard, John R.; Greer, Alan; Mayer, Chris; Taylor, Philip

    2012-09-01

    As PLCs evolve from simple logic controllers into more capable Programmable Automation Controllers (PACs), observatories are increasingly using such devices to control complex mechanisms. This paper describes the use of COTS software to control such hardware using the Advanced Technology Solar Telescope (ATST) Common Services Framework (CSF). We present the Enclosure Control System (ECS) under development in Spain and the UK. The paper details selection of the commercial PLC communication library PLCIO. Implemented in C and delivered with source code, the library separates the programmer from communication details through a simple API. Capable of communicating with many types of PLCs (including Allen-Bradley and Siemens), the API remains the same irrespective of the PLC in use. The ECS is implemented in Java using the observatory's framework that provides common services for software components. We present a design following a connection-based approach where all components access the PLC through a single connection class. The link between Java and the PLCIO C library is provided by a thin Java Native Interface (JNI) layer. Also presented is a software simulator of the PLC based upon the PLCIO Virtual PLC. This creates a simulator operating below the library's API and thus requires no change to ECS software. It also provides enhanced software testing capabilities prior to hardware becoming available. Results are presented in the form of communication timing test data, showing that the use of CSF, JNI, and PLCIO provides a control system capable of controlling enclosure tracking mechanisms, one that would be equally valid for telescope mount control.
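
    The connection-based design, in which every component reaches the PLC through a single connection class and a simulator can be swapped in beneath the same interface, might be sketched as follows. This is a design illustration with hypothetical class, method, and tag names; it is not the PLCIO API or the ECS code.

```python
# Design sketch (hypothetical names, not PLCIO): one shared connection
# object is the single point of access to the PLC, and a simulator can
# replace the real driver without changing client code.
import threading

class PlcConnection:
    """Single point of access to the PLC; PLC-specific details stay here."""
    def __init__(self, backend):
        self._backend = backend          # real driver or simulator
        self._lock = threading.Lock()    # serialize access to the one link

    def read(self, tag):
        with self._lock:
            return self._backend.read(tag)

    def write(self, tag, value):
        with self._lock:
            self._backend.write(tag, value)

class SimulatedPlc:
    """In-memory stand-in, analogous to a simulator below the library API."""
    def __init__(self):
        self._tags = {"shutter_position": 0.0}
    def read(self, tag):
        return self._tags[tag]
    def write(self, tag, value):
        self._tags[tag] = value

conn = PlcConnection(SimulatedPlc())
conn.write("shutter_position", 42.5)
print(conn.read("shutter_position"))
```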

  6. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  7. Development of Advanced Multi-Modality Radiation Treatment Planning Software

    SciTech Connect

    Nigg, D W; Hartmann Siantar, C

    2002-02-19

    The Idaho National Engineering and Environmental Laboratory (INEEL) has long been active in development of advanced Monte-Carlo based computational dosimetry and treatment planning methods and software for advanced radiotherapy, with a particular focus on Neutron Capture Therapy (NCT) and, to a somewhat lesser extent, Fast-Neutron Therapy. The most recent INEEL software product system of this type is known as SERA, Simulation Environment for Radiotherapy Applications. SERA is at a mature level in its life cycle, it has been licensed for research use worldwide, and it has become well established as a computational tool for research. However, along with its strengths, SERA also has some limitations in its structure and computational methodologies. More specifically, it is optimized only for neutron-based applications. Although photon transport can be computed with SERA, the simplified model that is used is designed primarily for photons produced in the neutron transport process. Thus SERA is not appropriate for applications such as standard external-beam photon radiotherapy, which is far more commonly used in the clinic than neutron-based therapy.

  8. Advanced Data Format (ADF) Software Library and Users Guide

    NASA Technical Reports Server (NTRS)

    Smith, Matthew; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform-independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database, such as those defined by the SIDS. There are currently 34 user callable functions that comprise the ADF

  9. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    SciTech Connect

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  10. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  11. An object-based interviewing system

    SciTech Connect

    Tonn, B.; Goeltz, R.; Chiang, Tai-Lun

    1992-01-01

    Oak Ridge National Laboratory (ORNL) has developed an object-based interviewing system (OBIS). The key design feature is that each survey question is a function, which is treated as an independent object. The Survey Manager (SM) module maintains a stack of questions and calls each question function when directed. Each question, when called, calls the Interaction Facility (IF) to set up the appropriate screen. This modular approach to automated survey design offers maximum flexibility for system development and maintenance. The software is written in Common Lisp and currently runs on Symbolics and VAX computers. The Bureau of Labor Statistics is funding a project to use OBIS to automate the Current Population Survey (CPS), an expenditure survey, and questions about intrahousehold communication to collect data to study communication and proxy response error.
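
    The design described above, each question an independent callable object worked through a stack by the Survey Manager, with screen I/O delegated to the Interaction Facility, can be sketched briefly. The question texts and return conventions below are invented for illustration; names follow the abstract (SM, IF).

```python
# OBIS-style sketch: questions are independent callable objects; the
# survey manager drives a stack of them and delegates screen I/O.
class Question:
    def __init__(self, prompt, followups=None):
        self.prompt = prompt
        self.followups = followups or []
    def __call__(self, interaction, answers):
        answers[self.prompt] = interaction(self.prompt)
        return self.followups              # questions to push next

def survey_manager(first_question, interaction):
    stack, answers = [first_question], {}
    while stack:
        question = stack.pop()
        stack.extend(reversed(question(interaction, answers)))
    return answers

def interaction_facility(prompt):          # stand-in for the IF screen set-up
    return f"<answer to {prompt!r}>"

employed = Question("Are you employed?")
root = Question("How many people live here?", followups=[employed])
print(survey_manager(root, interaction_facility))
```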

  12. Recent progress and advances in iterative software (including parallel aspects)

    SciTech Connect

    Carey, G.; Young, D.M.; Kincaid, D.

    1994-12-31

    The purpose of the workshop is to provide a forum for discussion of the current state of iterative software packages. Of particular interest is software for large scale engineering and scientific applications, especially for distributed parallel systems. However, the authors will also review the state of software development for conventional architectures. This workshop will complement the other proposed workshops on iterative BLAS kernels and applications. The format for the workshop is as follows: To provide some structure, there will be brief presentations, each of less than five minutes duration and dealing with specific facets of the subject. These will be designed to focus the discussion and to stimulate an exchange with the participants. Issues to be covered include: The evolution of iterative packages, current state of the art, the parallel computing challenge, applications viewpoint, standards, and future directions and open problems.

  13. Final Report for "Center for Technology for Advanced Scientific Component Software"

    SciTech Connect

    Svetlana Shasharina

    2010-12-01

    The goal of the Center for Technology for Advanced Scientific Component Software is to fundamentally change the way scientific software is developed and used by bringing component-based software development technologies to high-performance scientific and engineering computing. The role of Tech-X's work in the TASCS project is to provide outreach to accelerator physics and fusion applications by introducing TASCS tools into applications, testing the tools in those applications, and modifying the tools to be more usable.

  14. Software Partitioning Schemes for Advanced Simulation Computer Systems. Final Report.

    ERIC Educational Resources Information Center

    Clymer, S. J.

    Conducted to design software partitioning techniques for use by the Air Force to partition a large flight simulator program for optimal execution on alternative configurations, this study resulted in a mathematical model which defines characteristics for an optimal partition, and a manually demonstrated partitioning algorithm design which…

  15. Musical Composition and Creativity in an Advanced Software Environment

    ERIC Educational Resources Information Center

    Reynolds, Nicholas

    2002-01-01

    This paper serves as a brief description of research into the use of professional level music software as a learning tool for creativity and composition by primary school children. The research formed the basis of a Master of Information Technology in Education degree at the University of Melbourne. The paper examines the physical environment, the…

  16. Building Software Development Capacity to Advance the State of Educational Technology

    ERIC Educational Resources Information Center

    Luterbach, Kenneth J.

    2013-01-01

    Educational technologists may advance the state of the field by increasing capacity to develop software tools and instructional applications. Presently, few academic programs in educational technology require even a single computer programming course. Further, the educational technologists who develop software generally work independently or in…

  17. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Govindaraju, Madhusudhan

    2010-10-31

    Advanced Scientific Computing Research, Computer Science, FY 2010 Report. Center for Technology for Advanced Scientific Component Software: Distributed CCA. State University of New York, Binghamton, NY 13902. Summary: The overall objective of Binghamton's involvement is to work on enhancements of the CCA environment, motivated by the applications and research initiatives discussed in the proposal. This year we are re-focusing our design and development efforts on proof-of-concept implementations that have the potential to significantly impact scientific components. We worked on developing parallel implementations of non-hydrostatic code and on a model coupling interface for biogeochemical computations coded in MATLAB. We also worked on the design and implementation of modules that will be required for the emerging MapReduce model to be effective for scientific applications. Finally, we focused on optimizing the processing of scientific datasets on multi-core processors. Research details: We worked on the following research projects, which we are applying to CCA-based scientific applications. 1. Non-hydrostatic hydrodynamics: Non-hydrostatic models are significantly more accurate at modeling internal waves that may be important in lake ecosystems. Non-hydrostatic codes, however, are significantly more computationally expensive, often prohibitively so. We have worked with Chin Wu at the University of Wisconsin to parallelize non-hydrostatic code, obtaining a maximum speedup of about 26 times. Although this is significant progress, we hope to improve the performance further, such that it becomes a practical alternative to hydrostatic codes. 2. Model coupling for water-based ecosystems: To answer pressing questions about water resources requires that physical models (hydrodynamics) be coupled with biological and chemical models. Most hydrodynamics codes are written in Fortran, however, while most ecologists work in MATLAB. This

  18. Object-Based Image Compression

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.

    2003-01-01

    Image compression frequently supports reduced storage requirements in a computer system, as well as enhancement of effective channel bandwidth in a communication system, by decreasing the source bit rate through reduction of source redundancy. The majority of image compression techniques emphasize pixel-level operations, such as matching rectangular or elliptical sampling blocks taken from the source data stream, with exemplars stored in a database (e.g., a codebook in vector quantization or VQ). Alternatively, one can represent a source block via transformation, coefficient quantization, and selection of coefficients deemed significant for source content approximation in the decompressed image. This approach, called transform coding (TC), has predominated for several decades in the signal and image processing communities. A further technique that has been employed is the deduction of affine relationships from source properties such as local self-similarity, which supports the construction of adaptive codebooks in a self-VQ paradigm that has been called iterated function systems (IFS). Although VQ, TC, and IFS based compression algorithms have enjoyed varying levels of success for different types of applications, bit rate requirements, and image quality constraints, few of these algorithms examine the higher-level spatial structure of an image, and fewer still exploit this structure to enhance compression ratio. In this paper, we discuss a fourth type of compression algorithm, called object-based compression, which is based on research in joint segmentation and compression, as well as previous research in the extraction of sketch-like representations from digital imagery. Here, large image regions that correspond to contiguous recognizable objects or parts of objects are segmented from the source, then represented compactly in the compressed image. Segmentation is facilitated by source properties such as size, shape, texture, statistical properties, and spectral
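
    A toy sketch of the object-based representation follows: segment contiguous regions, then store each region compactly instead of coding pixel blocks. Real systems use far richer region models; the bounding-box-plus-mean encoding here is only an illustration of the structure of such a representation.

```python
# Object-based compression toy: segment regions, store each one compactly
# (bounding box + mean value), then reconstruct from the region records.
import numpy as np
from scipy import ndimage

image = np.zeros((32, 32))
image[4:12, 4:12] = 0.9            # a bright "object"
image[20:30, 18:28] = 0.4          # a dimmer one

labels, n = ndimage.label(image > 0.1)       # segmentation step
compressed = []
for region in ndimage.find_objects(labels):
    patch = image[region]
    compressed.append((region, float(patch.mean())))

# Decompression: paint each region's mean back onto an empty canvas.
decoded = np.zeros_like(image)
for region, mean in compressed:
    decoded[region] = mean
print(len(compressed), "regions; max error =",
      float(np.abs(decoded - image).max()))
```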

  19. Recent advances in the CRANK software suite for experimental phasing

    SciTech Connect

    Pannu, Navraj S.; Waterreus, Willem-Jan; Skubák, Pavol; Sikharulidze, Irakli; Abrahams, Jan Pieter; Graaff, Rudolf A. G. de

    2011-04-01

    Recent developments in the CRANK software suite for experimental phasing have led to many more structures being built automatically. For its first release in 2004, CRANK was shown to effectively detect and phase anomalous scatterers from single-wavelength anomalous diffraction data. Since then, CRANK has been significantly improved and many more structures can be built automatically with single- or multiple-wavelength anomalous diffraction or single isomorphous replacement with anomalous scattering data. Here, the new algorithms that have led to these substantial improvements are discussed and CRANK's performance on over 100 real data sets is shown. The latest version of CRANK is freely available for download at http://www.bfsc.leidenuniv.nl/software/crank/ and from CCP4 (http://www.ccp4.ac.uk/).

  1. Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation

    DOE PAGES

    Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.

    2000-01-01

    In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of "upwind" and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), "entropy" variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and we emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the Sandia National Laboratory framework SIERRA.
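
    Of the schemes listed, the classical Scharfetter-Gummel flux is compact enough to show as a worked sketch; it remains stable in drift-dominated regions where naive central differences oscillate. Sign conventions vary between texts, so the form below is one common choice, not the article's code.

```python
# Scharfetter-Gummel flux sketch for 1D drift-diffusion:
#   J ~ (D/h) * [B(dpsi/Vt) * n_right - B(-dpsi/Vt) * n_left],
# with the Bernoulli function B(x) = x / (exp(x) - 1).
import numpy as np

def bernoulli(x):
    """B(x) = x / (exp(x) - 1), handling the removable singularity at 0."""
    x = np.asarray(x, dtype=float)
    small = np.abs(x) < 1e-8
    return np.where(small, 1.0 - x / 2.0,
                    x / np.expm1(np.where(small, 1.0, x)))

def sg_flux(n_left, n_right, psi_left, psi_right, D, h, Vt=0.0259):
    dpsi = (psi_right - psi_left) / Vt
    return (D / h) * (bernoulli(dpsi) * n_right - bernoulli(-dpsi) * n_left)

# Pure-diffusion check (no field): reduces to D * (n_right - n_left) / h.
print(sg_flux(1e16, 2e16, 0.0, 0.0, D=36.0, h=1e-4))
print(36.0 * (2e16 - 1e16) / 1e-4)
```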

  2. Advanced numerical methods and software approaches for semiconductor device simulation

    SciTech Connect

    Carey, Graham F.; Pardhanani, A. L.; Bova, Steven W.

    2000-03-23

    In this article the authors concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of upwind and artificial dissipation schemes, generalization of the traditional Scharfetter-Gummel approach, Petrov-Galerkin and streamline-upwind Petrov-Galerkin (SUPG), entropy variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. The authors have included numerical examples from their recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and the authors emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, they briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the Sandia National Laboratory framework SIERRA.

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    SciTech Connect

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  4. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    SciTech Connect

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  5. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    SciTech Connect

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  6. Advanced program development management software system. Software description and user's manual

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The objectives of this project were to apply emerging techniques and tools from the computer science discipline of paperless management to the activities of the Space Transportation and Exploration Office (PT01) in Marshall Space Flight Center (MSFC) Program Development, thereby enhancing the productivity of the workforce, the quality of the data products, and the collection, dissemination, and storage of information. The approach used to accomplish the objectives emphasized the utilization of finished form (off-the-shelf) software products to the greatest extent possible without impacting the performance of the end product, to pursue developments when necessary in the rapid prototyping environment to provide a mechanism for frequent feedback from the users, and to provide a full range of user support functions during the development process to promote testing of the software.

  7. Development of Advanced Multi-Modality Radiation Treatment Planning Software for Neutron Radiotherapy and Beyond

    SciTech Connect

    Nigg, D; Wessol, D; Wemple, C; Harkin, G; Hartmann-Siantar, C

    2002-08-20

    The Idaho National Engineering and Environmental Laboratory (INEEL) has long been active in development of advanced Monte-Carlo based computational dosimetry and treatment planning methods and software for advanced radiotherapy, with a particular focus on Neutron Capture Therapy (NCT) and, to a somewhat lesser extent, Fast-Neutron Therapy. The most recent INEEL software system of this type is known as SERA, Simulation Environment for Radiotherapy Applications. As a logical next step in the development of modern radiotherapy planning tools to support the most advanced research, INEEL and Lawrence Livermore National Laboratory (LLNL), the developers of the PEREGRINE computational engine for radiotherapy treatment planning applications, have recently launched a new project to collaborate in the development of a 'next-generation' multi-modality treatment planning software system that will be useful for all modern forms of radiotherapy.

  8. Advances in Single-Photon Emission Computed Tomography Hardware and Software.

    PubMed

    Piccinelli, Marina; Garcia, Ernest V

    2016-02-01

    Nuclear imaging techniques remain today's most reliable modality for the assessment and quantification of myocardial perfusion. In recent years, the field has experienced tremendous progress both in terms of dedicated cameras for cardiac applications and software techniques for image reconstruction. The most recent advances in single-photon emission computed tomography hardware and software are reviewed, focusing on how these improvements have resulted in an even more powerful diagnostic tool with reduced injected radiation dose and acquisition time.

  9. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support systems operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.
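
    A minimal sketch of the hybrid discrete/continuous pattern described in this abstract, assuming a toy tank-and-valve model (the state variables and event names are hypothetical, not part of CONFIG): continuous flow/pressure dynamics are integrated in small time steps, while queued discrete events reconfigure the system between steps.

      # Illustrative hybrid discrete/continuous simulation loop (a sketch;
      # not CONFIG itself, whose model library and API are not shown here).
      import heapq

      def simulate(t_end, dt=0.1):
          pressure, valve_open = 100.0, True   # continuous state, discrete config
          events = [(5.0, "VALVE_CLOSE"), (12.0, "VALVE_OPEN")]
          heapq.heapify(events)
          t = 0.0
          while t < t_end:
              # Discrete phase: apply reconfiguration events that are due.
              while events and events[0][0] <= t:
                  _, action = heapq.heappop(events)
                  valve_open = (action == "VALVE_OPEN")
              # Continuous phase: advance the flow dynamics one Euler step.
              outflow = 2.0 * pressure / 100.0 if valve_open else 0.0
              pressure += -outflow * dt
              t += dt
          return pressure

      print(simulate(20.0))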

  10. Cost benefits of advanced software: A review of methodology used at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1993-01-01

    To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.

  11. Advanced Transport Operating System (ATOPS) Flight Management/Flight Controls (FM/FC) software description

    NASA Technical Reports Server (NTRS)

    Wolverton, David A.; Dickson, Richard W.; Clinedinst, Winston C.; Slominski, Christopher J.

    1993-01-01

    The flight software developed for the Flight Management/Flight Controls (FM/FC) MicroVAX computer used on the Transport Systems Research Vehicle for Advanced Transport Operating Systems (ATOPS) research is described. The FM/FC software computes navigation position estimates, guidance commands, and those commands issued to the control surfaces to direct the aircraft in flight. Various modes of flight are provided for, ranging from computer assisted manual modes to fully automatic modes including automatic landing. A high-level system overview as well as a description of each software module comprising the system is provided. Digital systems diagrams are included for each major flight control component and selected flight management functions.

  12. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  13. Center for Technology for Advanced Scientific Component Software (TASCS) Consolidated Progress Report July 2006 - March 2009

    SciTech Connect

    Bernholdt, D E; McInnes, L C; Govindaraju, M; Bramley, R; Epperly, T; Kohl, J A; Nieplocha, J; Armstrong, R; Shasharina, S; Sussman, A L; Sottile, M; Damevski, K

    2009-04-14

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  14. Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Kostadin, Damevski

    2015-01-25

    A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.

  15. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    SciTech Connect

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. (1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. Although very functional, this system is not portable or flexible; the software would have to be substantially rewritten for other applications. (2) An application generator which has the capability of "building" a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. This package is based on a standardized choice of hardware, within which it is capable of building a system to order, automatically constructing graphics, data tables, alarm prioritization rules, and interfaces to peripherals. (3) A software tool, the User Interface Management System (UIMS), is described which permits rapid prototyping of human-machine interfaces for a variety of applications including emergency management, alarm display and process information display. The object-oriented software of the UIMS achieves rapid prototyping of a new interface by standardizing to a class library of software objects instead of hardware objects.

  16. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    SciTech Connect

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  17. Object-Based Attention and Cognitive Tunneling

    ERIC Educational Resources Information Center

    Jarmasz, Jerzy; Herdman, Chris M.; Johannsdottir, Kamilla Run

    2005-01-01

    Simulator-based research has shown that pilots cognitively tunnel their attention on head-up displays (HUDs). Cognitive tunneling has been linked to object-based visual attention on the assumption that HUD symbology is perceptually grouped into an object that is perceived and attended separately from the external scene. The present research…

  18. NASA's Advanced Multimission Operations System: A Case Study in Formalizing Software Architecture Evolution

    NASA Technical Reports Server (NTRS)

    Barnes, Jeffrey M.

    2011-01-01

    All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.

  19. Earth Observing System (EOS)/ Advanced Microwave Sounding Unit-A (AMSU-A): Special Test Equipment. Software Requirements

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert

    1995-01-01

    This document defines the functional, performance, and interface requirements for the Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A) Special Test Equipment (STE) software used in the test and integration of the instruments.

  20. Advanced Transport Operating System (ATOPS) color displays software description microprocessor system

    NASA Technical Reports Server (NTRS)

    Slominski, Christopher J.; Plyler, Valerie E.; Dickson, Richard W.

    1992-01-01

    This document describes the software created for the Sperry Microprocessor Color Display System used for the Advanced Transport Operating Systems (ATOPS) project on the Transport Systems Research Vehicle (TSRV). The software delivery, known as the 'baseline display system', is the one described in this document. Throughout this publication, module descriptions are presented in a standardized format which contains module purpose, calling sequence, detailed description, and global references. The global reference section includes procedures and common variables referenced by a particular module. The system described supports the Research Flight Deck (RFD) of the TSRV. The RFD contains eight cathode ray tubes (CRTs) which depict a Primary Flight Display, Navigation Display, System Warning Display, Takeoff Performance Monitoring System Display, and Engine Display.

  1. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    SciTech Connect

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. 1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. 2) An application generator which has the capability of "building" a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. 3) A software tool is described which permits rapid prototyping of human-machine interfaces for a variety of applications including emergency management, alarm display and process information display.

  2. Advanced Transport Operating System (ATOPS) color displays software description: MicroVAX system

    NASA Technical Reports Server (NTRS)

    Slominski, Christopher J.; Plyler, Valerie E.; Dickson, Richard W.

    1992-01-01

    This document describes the software created for the Display MicroVAX computer used for the Advanced Transport Operating Systems (ATOPS) project on the Transport Systems Research Vehicle (TSRV). The software delivery of February 27, 1991, known as the 'baseline display system', is the one described in this document. Throughout this publication, module descriptions are presented in a standardized format which contains module purpose, calling sequence, detailed description, and global references. The global references section includes subroutines, functions, and common variables referenced by a particular module. The system described supports the Research Flight Deck (RFD) of the TSRV. The RFD contains eight Cathode Ray Tubes (CRTs) which depict a Primary Flight Display, Navigation Display, System Warning Display, Takeoff Performance Monitoring System Display, and Engine Display.

  3. ACTS (Advanced Communications Technology Satellite) Propagation Experiment: Preprocessing Software User's Manual

    NASA Technical Reports Server (NTRS)

    Crane, Robert K.; Wang, Xuhe; Westenhaver, David

    1996-01-01

    The preprocessing software manual describes the Actspp program originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has been quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Prior to having data acceptable for archival functions, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.
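
    The core preprocessing step sketched above, converting terminal output to attenuation estimates, reduces at its simplest to referencing received power against a calibrated clear-sky level; the following one-function sketch is illustrative only, with hypothetical variable names and no relation to the actual Actspp file formats.

      # Attenuation (dB) as the drop of received power below a clear-sky
      # reference, after applying a receiver calibration offset (sketch).
      def attenuation_db(received_dbm, clear_sky_dbm, cal_offset_db=0.0):
          return (clear_sky_dbm - received_dbm) + cal_offset_db

      samples = [-62.1, -63.4, -70.8]   # received power, dBm (made up)
      print([attenuation_db(s, clear_sky_dbm=-61.5) for s in samples])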

  4. Recent advances in software for beamline design, accelerator operations and personnel training

    NASA Astrophysics Data System (ADS)

    Gillespie, George H.; Hill, Barrey W.; Martono, Hendy; Moore, John M.

    2000-03-01

    Accelerators are finding new applications in research, industry, medicine, as well as other fields, and there is a growing need for new tools to improve the productivity of scientists and engineers involved with these emerging accelerator applications. Several advances in computer software have been made that focus on meeting those needs. This paper summarizes recent work in the development of a unique software framework designed specifically to support the accelerator community: the Multi-Platform Shell for Particle Accelerator Related Codes (SPARC MP). SPARC MP includes a sophisticated beamline object model, an extensive library of GUI components, and supports a variety of particle optics codes and add-on tools. This framework has been used to create the Particle Beam Optics Laboratory (PBO Lab™) family of software applications. PBO Lab has been used for beamline design, as a computer aid for teaching particle optics, and to support accelerator operations. Several popular charged particle optics programs, including MARYLIE, TRANSPORT, TURTLE and TRACE 3-D, have been integrated with a new version of PBO Lab. The modeling and simulation capabilities of these codes allow PBO Lab to support a wide spectrum of accelerator types. New external data interface tools are available to import beamline parameters from other sources, for example, to utilize magnet strengths generated by a control system. An overview of the new version of PBO Lab is presented.
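
    The beamline object model lends itself to a compact sketch: each beamline object carries a first-order transfer matrix, and a line is the ordered product of its elements. The drift and thin-lens matrices below are standard first-order optics; the code is a generic illustration, not PBO Lab or SPARC MP source.

      import numpy as np

      # First-order (x, x') transfer matrices for two common beamline objects.
      def drift(L):
          return np.array([[1.0, L], [0.0, 1.0]])

      def thin_quad(f):   # focusing for f > 0
          return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

      # A beamline is an ordered element list; its matrix is the product
      # taken in reverse order (the last element acts last on the ray).
      line = [drift(1.0), thin_quad(0.5), drift(1.0)]
      M = np.linalg.multi_dot(list(reversed(line)))
      print(M @ np.array([1e-3, 0.0]))   # trace a ray with 1 mm offset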

  5. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software tool (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by the overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
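
    The overall correlation coefficient cited is presumably a Pearson r between model-predicted and measured values; a minimal computation, with made-up numbers, looks like this.

      import math

      def pearson_r(x, y):
          n = len(x)
          mx, my = sum(x) / n, sum(y) / n
          cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
          sx = math.sqrt(sum((a - mx) ** 2 for a in x))
          sy = math.sqrt(sum((b - my) ** 2 for b in y))
          return cov / (sx * sy)

      predicted = [0.91, 0.85, 0.78, 0.66]   # hypothetical model outputs
      measured = [0.90, 0.86, 0.75, 0.68]    # hypothetical plant data
      print(round(pearson_r(predicted, measured), 3))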

  6. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chowdhury, J

    2014-02-01

    A Visual Basic simulation software tool (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by the overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater. PMID:23982824

  7. Award-Winning CARES/Life Ceramics Durability Evaluation Software Is Making Advanced Technology Accessible

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software developed at the NASA Lewis Research Center eases this task by providing a tool that uses probabilistic reliability analysis techniques to optimize the design and manufacture of brittle material components. CARES/Life is an integrated package that predicts the probability of a monolithic ceramic component's failure as a function of its time in service. It couples commercial finite element programs--which resolve a component's temperature and stress distribution--with reliability evaluation and fracture mechanics routines for modeling strength-limiting defects. These routines are based on calculations of the probabilistic nature of the brittle material's strength.
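
    The probabilistic strength treatment referred to here is conventionally expressed with Weibull statistics; a minimal two-parameter Weibull failure-probability sketch (the modulus and characteristic strength values below are hypothetical) is:

      import math

      # Two-parameter Weibull failure probability for a brittle material:
      # P_f = 1 - exp(-(sigma / sigma_0) ** m), with Weibull modulus m and
      # characteristic strength sigma_0 (hypothetical values).
      def failure_probability(sigma, m=10.0, sigma_0=400.0):
          return 1.0 - math.exp(-((sigma / sigma_0) ** m))

      for s in (250.0, 350.0, 450.0):   # applied stress, MPa
          print(s, round(failure_probability(s), 4))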

  8. Object-based mapping of drumlins from DTMs

    NASA Astrophysics Data System (ADS)

    Eisank, C.; Dragut, L.; Blaschke, T.

    2012-04-01

    Until recently, landforms such as drumlins have only been manually delineated due to the difficulty in integrating contextual and semantic landform information in per-cell classification approaches. Therefore, in most cases the results of per-cell classifications presented basic landform elements or broad-scale physiographic regions that were only thematically defined. In contrast, object-based analysis provides spatially configured landform objects that are generated by terrain segmentation, the process of merging DTM cells to meaningful terrain objects at multiple scales. Such terrain objects should be favoured for landform modelling due to the following reasons: Firstly, their outlines potentially better correspond to the spatial limits of landforms as conceptualised by geoscientists; secondly, spatially aware objects enable the integration of semantic descriptions in the classification process. We present a multi-scale object-based study on automated delineation and classification of drumlins for a small test area in Bavaria, Germany. The multi-resolution segmentation algorithm is applied to create statistically meaningful object patterns of selected DTMs, which are derived from a 5 m LiDAR DEM. For the subsequent classification of drumlins a semantics-based approach, which uses the principles of semantic modelling, is employed: initially, a geomorphological concept of the landform type drumlin is developed. The drumlin concept should ideally comprise verbal descriptions of the fundamental morphometric, morphological, hierarchical and contextual properties. Subsequently, the semantic model is built by structuring the conceptualised knowledge facts, and by associating those facts with object and class-related features, which are available in commonly used object-based software products for the development of classification rules. For the accuracy assessment we plan an integrated approach, which combines a statistical comparison to field maps and a qualitative

  9. Advanced Software for Analysis of High-Speed Rolling-Element Bearings

    NASA Technical Reports Server (NTRS)

    Poplawski, J. V.; Rumbarger, J. H.; Peters, S. M.; Galatis, H.; Flower, R.

    2003-01-01

    COBRA-AHS is a package of advanced software for analysis of rigid or flexible shaft systems supported by rolling-element bearings operating at high speeds under complex mechanical and thermal loads. These loads can include centrifugal and thermal loads generated by motions of bearing components. COBRA-AHS offers several improvements over prior commercial bearing-analysis programs: It includes innovative probabilistic fatigue-life-estimating software that provides for computation of three-dimensional stress fields and incorporates stress-based (in contradistinction to prior load-based) mathematical models of fatigue life. It interacts automatically with the ANSYS finite-element code to generate finite-element models for estimating distributions of temperature and temperature-induced changes in dimensions in iterative thermal/dimensional analyses; thus, for example, it can be used to predict changes in clearances and thermal lockup. COBRA-AHS provides an improved graphical user interface that facilitates the iterative cycle of analysis and design by providing analysis results quickly in graphical form, enabling the user to control interactive runs without leaving the program environment, and facilitating transfer of plots and printed results for inclusion in design reports. Additional features include roller-edge stress prediction and influence of shaft and housing distortion on bearing performance.
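
    For context on the stress-based versus load-based distinction, the classical load-based rating life is L10 = (C/P)^p, with exponent p = 3 for ball bearings (10/3 for roller bearings); the sketch below shows only this baseline model, not the stress-based formulation in COBRA-AHS.

      # Classical load-based bearing rating life, in millions of revolutions:
      # L10 = (C / P) ** p, where C is the dynamic load rating and P the
      # equivalent dynamic load (values below are made up).
      def l10_life(C, P, p=3.0):
          return (C / P) ** p

      print(l10_life(C=30.0, P=10.0))   # -> 27.0 million revolutions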

  10. Advanced communications technology satellite high burst rate link evaluation terminal communication protocol software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Communication Protocol Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenter's terminal used to communicate with the ACTS for various experiments by government, university, and industry agencies. The Communication Protocol Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Communication Protocol Software allows users to control and configure the Intermediate Frequency Switch Matrix (IFSM) on board the ACTS to yield a desired path through the spacecraft payload. Besides IFSM control, the C&PM Software System is also responsible for instrument control during HBR-LET experiments, uplink power control of the HBR-LET to demonstrate power augmentation during signal fade events, and data display. The Communication Protocol Software User's Guide, Version 1.0 (NASA CR-189162) outlines the commands and procedures to install and operate the Communication Protocol Software. Configuration files used to control the IFSM, operator commands, and error recovery procedures are discussed. The Communication Protocol Software Maintenance Manual, Version 1.0 (NASA CR-189163, to be published) is a programmer's guide to the Communication Protocol Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Communication Protocol Software, computer algorithms, format representations, and computer hardware configuration. The Communication Protocol Software Test Plan (NASA CR-189164, to be published) provides a step-by-step procedure to verify the operation of the software. Included in the Test Plan is command transmission, telemetry reception, error detection, and error recovery procedures.

  11. An Advanced Educational Program for Software Design Engineering at Graduate School of Information Science and Technology of Osaka University

    NASA Astrophysics Data System (ADS)

    Masuzawa, Toshimitsu; Inoue, Katsuro; Murakami, Koso; Fujiwara, Toru; Nishio, Shojiro

    This paper gives an overview of an advanced educational program for software design engineering that is currently conducted at the Graduate School of Information Science and Technology, Osaka University, under the grant “Initiatives for Attractive Education in Graduate Schools” from MEXT. Software design engineering is expected to play a critical role in the successful design of next-generation software systems. The aim of the program is to train young researchers in the latest design methodologies and give them practical design experience, so that they can pioneer the frontier of software design engineering. The program is conducted with the collaboration of industries that have rich practical experience and are facing the engineering problems to be solved in developing the next-generation software.

  12. Advanced functionality for radio analysis in the Offline software framework of the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Abreu, P.; Aglietta, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antičić, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Bäcker, T.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Beatty, J. J.; Becker, B. R.; Becker, K. H.; Bellido, J. A.; Benzvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Burton, R. E.; Caballero-Mora, K. S.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Chiavassa, A.; Chinellato, J. A.; Chou, A.; Chudoba, J.; Clay, R. W.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Cotti, U.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; de Domenico, M.; de Donato, C.; de Jong, S. J.; de La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; de Mitri, I.; de Souza, V.; de Vries, K. D.; Decerprit, G.; Del Peral, L.; Deligny, O.; Dembinski, H.; Denkiewicz, A.; di Giulio, C.; Diaz, J. C.; Díaz Castro, M. L.; Diep, P. N.; Dobrigkeit, C.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; Dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Ferrero, A.; Fick, B.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fröhlich, U.; Fuchs, B.; Gamarra, R. F.; Gambetta, S.; García, B.; García Gámez, D.; Garcia-Pinto, D.; Gascon, A.; Gemmeke, H.; Gesterling, K.; Ghia, P. L.; Giaccari, U.; Giller, M.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gonçalves, P.; Gonzalez, D.; Gonzalez, J. G.; Gookin, B.; Góra, D.; Gorgi, A.; Gouffon, P.; Gozzini, S. R.; Grashorn, E.; Grebe, S.; Griffith, N.; Grigat, M.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hague, J. D.; Hansen, P.; Harari, D.; Harmsma, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hojvat, C.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horneffer, A.; Hrabovský, M.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jiraskova, S.; Kadija, K.; Kampert, K. H.; Karhan, P.; Karova, T.; Kasper, P.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuehn, F.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; Lautridou, P.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Lemiere, A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lucero, A.; Ludwig, M.; Lyberis, H.; Macolino, C.; Maldera, S.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martínez Bravo, O.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurizio, D.; Mazur, P. 
O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Meurer, C.; Mićanović, S.; Micheletti, M. I.; Miller, W.; Miramonti, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Morris, C.; Mostafá, M.; Moura, C. A.; Mueller, S.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Nhung, P. T.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Nyklicek, M.; Oehlschläger, J.; Olinto, A.; Oliva, P.; Olmos-Gilbaja, V. M.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Parrisius, J.; Parsons, R. D.; Pastor, S.; Paul, T.; Pech, M.; PeĶala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrinca, P.; Petrolini, A.; Petrov, Y.; Petrovic, J.; Pfendner, C.; Phan, N.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Ponce, V. H.; Pontz, M.; Privitera, P.; Prouza, M.; Quel, E. J.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Risse, M.; Ristori, P.; Rivera, H.; Riviére, C.; Rizi, V.; Robledo, C.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodriguez-Cabo, I.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-D'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Salamida, F.; Salazar, H.; Salina, G.; Sánchez, F.; Santander, M.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Schmidt, F.; Schmidt, T.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schroeder, F.; Schulte, S.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Semikoz, D.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tamashiro, A.; Tapia, A.; Taşcău, O.; Tcaciuc, R.; Tegolo, D.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tiwari, D. K.; Tkaczyk, W.; Todero Peixoto, C. J.; Tomé, B.; Tonachini, A.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van den Berg, A. M.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Warner, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Westerhoff, S.; Whelan, B. J.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Winders, L.; Winnick, M. G.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Younk, P.; Yuan, G.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Ziolkowski, M.

    2011-04-01

    The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs “radio-hybrid” measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated in the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights and features of the radio extension implemented in the Auger Offline framework. Its functionality has achieved a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, a transparent and efficient handling of FFTs, a very detailed simulation of detector effects, and the read-in of multiple data formats including data from various radio simulation codes. The source code of this radio functionality can be made available to interested parties on request.
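
    While the Offline implementation itself is not reproduced here, the kind of FFT-based trace handling mentioned can be illustrated generically, e.g. a crude frequency-domain band-pass applied to a sampled radio trace (the sampling rate and band edges below are assumptions for the sketch):

      import numpy as np

      def bandpass(trace, fs, f_lo, f_hi):
          # Zero Fourier components outside [f_lo, f_hi]; a crude band-pass
          # for illustration, not the Auger Offline signal-processing chain.
          spec = np.fft.rfft(trace)
          freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
          spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
          return np.fft.irfft(spec, n=len(trace))

      fs = 200e6   # 200 MS/s sampling rate (assumed)
      t = np.arange(1024) / fs
      trace = np.sin(2 * np.pi * 55e6 * t) + 0.5 * np.random.randn(t.size)
      filtered = bandpass(trace, fs, 30e6, 80e6)   # keep a 30-80 MHz band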

  13. Advanced functionality for radio analysis in the Offline software framework of the Pierre Auger Observatory

    SciTech Connect

    Abreu, P.; Aglietta, M.; Ahn, E.J.; Albuquerque, I.F.M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Ambrosio, M.; et al.

    2011-01-01

    The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs 'radio-hybrid' measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated in the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights and features of the radio extension implemented in the Auger Offline framework. Its functionality has achieved a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, a transparent and efficient handling of FFTs, a very detailed simulation of detector effects, and the read-in of multiple data formats including data from various radio simulation codes. The source code of this radio functionality can be made available to interested parties on request.

  14. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data which helps in optimization and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
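
    Both agreement metrics quoted above are simple to compute; the sketch below uses the standard definitions of Willmott's index of agreement and a mean relative error (the paper's exact formulations may differ slightly, and the numbers are made up).

      def willmott_d(pred, obs):
          # Willmott's index of agreement: 1 - SSE / potential error.
          ob = sum(obs) / len(obs)
          num = sum((p - o) ** 2 for p, o in zip(pred, obs))
          den = sum((abs(p - ob) + abs(o - ob)) ** 2 for p, o in zip(pred, obs))
          return 1.0 - num / den

      def mean_relative_error(pred, obs):
          return sum(abs(p - o) / abs(o) for p, o in zip(pred, obs)) / len(obs)

      pred = [0.92, 0.88, 0.81]   # hypothetical model predictions
      obs = [0.90, 0.89, 0.80]    # hypothetical measurements
      print(willmott_d(pred, obs), mean_relative_error(pred, obs))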

  15. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data which helps in optimization and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.

  16. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).

  17. Hybrid Modeling for Scenario-Based Evaluation of Failure Effects in Advanced Hardware-Software Designs

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David

    2001-01-01

    This paper describes an incremental scenario-based simulation approach to evaluation of intelligent software for control and management of hardware systems. A hybrid continuous/discrete event simulation of the hardware dynamically interacts with the intelligent software in operations scenarios. Embedded anomalous conditions and failures in simulated hardware can lead to emergent software behavior and identification of missing or faulty software or hardware requirements. An approach is described for extending simulation-based automated incremental failure modes and effects analysis to support concurrent evaluation of intelligent software and the hardware controlled by the software.

  18. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.

  19. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  20. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  1. Space Flight Software Development Software for Intelligent System Health Management

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  2. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  3. Automated object-based classification of topography from SRTM data

    PubMed Central

    Drăguţ, Lucian; Eisank, Clemens

    2012-01-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation respectively. Results reasonably resemble patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as online download. The results are embedded in a web application with functionalities of visualization and download. PMID:22485060
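
    The two-parameter partitioning described above is easy to sketch. In the snippet below, plain grid cells stand in for segmentation objects for brevity, and the split thresholds are the layer means of elevation and of local standard deviation of elevation; the published workflow itself runs as an eCognition process, which is not reproduced here.

      import numpy as np

      def classify(elev, window=3):
          # Partition cells into four domains by mean elevation and local
          # standard deviation of elevation (thresholds = layer means).
          pad = window // 2
          padded = np.pad(elev, pad, mode="edge")
          std = np.empty_like(elev, dtype=float)
          for i in range(elev.shape[0]):
              for j in range(elev.shape[1]):
                  std[i, j] = padded[i:i + window, j:j + window].std()
          hi_e = elev > elev.mean()
          hi_s = std > std.mean()
          return hi_e.astype(int) * 2 + hi_s.astype(int)   # classes 0..3

      dem = np.random.rand(50, 50) * 1000.0   # stand-in for an SRTM tile
      print(np.bincount(classify(dem).ravel()))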

  4. Automated object-based classification of topography from SRTM data

    NASA Astrophysics Data System (ADS)

    Drăguţ, Lucian; Eisank, Clemens

    2012-03-01

    We introduce an object-based method to automatically classify topography from SRTM data. The new method relies on the concept of decomposing land-surface complexity into more homogeneous domains. An elevation layer is automatically segmented and classified at three scale levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance and segmentation is performed at these appropriate scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and standard deviation of elevation respectively. Results reasonably resemble patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalization requirements of maximizing internal homogeneity while minimizing external homogeneity. Most objects have boundaries matching natural discontinuities at regional level. The method is simple and fully automated. The input data consist of only one layer, which does not need any pre-processing. Both segmentation and classification rely on only two parameters: elevation and standard deviation of elevation. The methodology is implemented as a customized process for the eCognition® software, available as online download. The results are embedded in a web application with functionalities of visualization and download.

  5. ATOS: Integration of advanced technology software within distributed Spacecraft Mission Operations Systems

    NASA Technical Reports Server (NTRS)

    Jones, M.; Wheadon, J.; Omullane, W.; Whitgift, D.; Poulter, K.; Niezette, M.; Timmermans, R.; Rodriguez, Ivan; Romero, R.

    1994-01-01

    The Advanced Technology Operations System (ATOS) is a program of studies into the integration of advanced applications (including knowledge based systems (KBS)) with ground systems for the support of spacecraft mission operations.

  6. Object-based landslide detection in different geographic regions

    NASA Astrophysics Data System (ADS)

    Friedl, Barbara; Hölbling, Daniel; Eisank, Clemens; Blaschke, Thomas

    2015-04-01

    SPOT-5 images are combined with digital elevation models (DEM) for developing a consistent semi-automated landslide detection approach using eCognition (Trimble) software. Suitable image objects are generated by means of multiresolution segmentation. Expert knowledge, i.e. reported facts on features (e.g. mean object slope, mean NDVI) and thresholds that are commonly chosen by professionals for digital landslide mapping, is considered during classification. The applicability of a range of features is tested and the most promising parameters, i.e. features that produce appropriate results for both regions, are selected for landslide detection. However, minor adaptations of particular thresholds are necessary due to the distinct environmental conditions of the test sites. In order to reduce the number of required adjustments to a minimum, relational features and spectral indices are primarily used for classification. The obtained results are finally compared to manually digitized reference polygons and existing landslide inventories in order to quantify the applicability of the developed object-based landslide detection approach in different geographic regions.
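
    The rule-based classification stage described above can be sketched as simple threshold tests over per-object features such as mean slope and mean NDVI; the thresholds and feature names below are hypothetical stand-ins, not the values used in the study.

      # Rule-based labeling of image objects by expert-style thresholds
      # (illustrative values only; the study tunes these per test site).
      def is_landslide_candidate(obj):
          return (obj["mean_ndvi"] < 0.2          # sparse vegetation
                  and obj["mean_slope"] > 15.0    # degrees
                  and obj["brightness"] > 120.0)  # fresh, bright surface

      objects = [
          {"mean_ndvi": 0.15, "mean_slope": 22.0, "brightness": 140.0},
          {"mean_ndvi": 0.55, "mean_slope": 8.0, "brightness": 90.0},
      ]
      print([is_landslide_candidate(o) for o in objects])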

  7. Advances in the NetCDF Data Model, Format, and Software

    NASA Astrophysics Data System (ADS)

    Rew, R. K.; Hartnett, E. J.; Heimbigner, D.; Caron, J. L.

    2010-12-01

    Two decades of experience facilitating the creation of and access to portable, self-describing scientific data have resulted in Unidata's network Common Data Form (netCDF), with an associated data model, freely available reference software, and endorsement as a standard by groups including NASA's Earth Science Data Systems Standards Process Group. NetCDF is now widely used in climate, ocean, and atmospheric sciences, and has proved general enough for other uses, such as neural imaging, molecular dynamics, and fusion energy research. A variety of software packages, data archives, and client/server systems are available for the access, analysis, visualization, and use of netCDF data, as well as other kinds of data through a netCDF programming interface. The widely used Climate and Forecast (CF) Metadata Conventions, based on the netCDF "classic" data model but applicable to other formats, add a useful layer of semantics for interoperability, and define "CF compliance" as a standard for Earth science metadata. Broad adoption of CF Conventions make practical the development of CF-compliant software that can access such data and metadata. The simple netCDF data model has recently been extended to improve its ability to faithfully represent earth science data and metadata. The netCDF software that implements the enhanced data model provides backward compatibility with existing data and applications, while making access to useful features available through additional programming interfaces that extend the netCDF classic interfaces. This presentation provides an up-to-date overview of netCDF including a summary of the enhanced netCDF data model, describes experience developing generic software to handle its features, and offers guidelines based on that experience for incrementally adapting existing netCDF software to take advantage of benefits offered by the enhanced netCDF data model.
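
    A tiny example of the enhanced (netCDF-4) data model, written against the netCDF4 Python bindings, illustrates one feature the classic model lacks, hierarchical groups; the file and variable names are arbitrary.

      from netCDF4 import Dataset   # netCDF4-python bindings

      with Dataset("example.nc", "w", format="NETCDF4") as ds:
          # Groups are an enhanced-model feature absent from classic netCDF.
          grp = ds.createGroup("model_output")
          grp.createDimension("time", None)   # unlimited dimension
          time = grp.createVariable("time", "f8", ("time",))
          time.units = "hours since 2010-01-01 00:00:00"   # CF-style metadata
          time[:] = [0.0, 6.0, 12.0]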

  8. Research in software allocation for advanced manned mission communications and tracking systems

    NASA Technical Reports Server (NTRS)

    Warnagiris, Tom; Wolff, Bill; Kusmanoff, Antone

    1990-01-01

    An assessment of the planned processing hardware and software/firmware for the Communications and Tracking System of the Space Station Freedom (SSF) was performed. The intent of the assessment was to determine the optimum distribution of software/firmware in the processing hardware for maximum throughput with minimum required memory. As a product of the assessment process, an assessment methodology was to be developed that could be used for similar assessments of future manned spacecraft system designs. The assessment process was hampered by changing requirements for the Space Station. As a result, the initial objective of determining the optimum software/firmware allocation was not fulfilled, but several useful conclusions and recommendations resulted from the assessment. It was concluded that the assessment process would not be completely successful for a system with changing requirements. It was also concluded that memory and hardware requirements were being modified to fit each other as a consequence of the change process, and that although throughput could not be quantified, potential problem areas could be identified. Finally, inherent flexibility of the system design was essential for the success of a system design with changing requirements. Recommendations resulting from the assessment included development of common software for some embedded controller functions, reduction of embedded processor requirements by hardwiring some Orbital Replacement Units (ORUs) to make better use of processor capabilities, and improvement in communications between software development personnel to enhance the integration process. Lastly, a critical observation was made that the software integration tasks did not appear to be addressed in the design process to the degree necessary for successful satisfaction of the system requirements.

  9. OBEST: The Object-Based Event Scenario Tree Methodology

    SciTech Connect

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-03-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
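
    The recursive solution algorithm described above can be illustrated with a toy model: walk a probabilistic branching structure depth-first, multiplying branch probabilities so every enumerated scenario carries its likelihood. This Python sketch is not the OBEST demonstration software; the event names, probabilities, and data layout are invented.

```python
# A toy illustration (not the OBEST software) of recursively enumerating all
# scenarios of a probabilistic branching model, so each scenario carries its
# likelihood as a product of branch probabilities.
def enumerate_scenarios(tree, path=(), prob=1.0):
    """tree: {event_name: [(outcome, probability, subtree_or_None), ...]}"""
    if not tree:
        yield path, prob
        return
    event, branches = next(iter(tree.items()))
    for outcome, p, subtree in branches:
        yield from enumerate_scenarios(subtree or {},
                                       path + ((event, outcome),), prob * p)

# Hypothetical two-event model: pump fails or runs; if it fails, a valve
# either sticks or opens.
model = {"pump": [("fails", 0.1, {"valve": [("sticks", 0.3, None),
                                            ("opens", 0.7, None)]}),
                  ("runs", 0.9, None)]}
for scenario, p in enumerate_scenarios(model):
    print(scenario, round(p, 3))
```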

  10. The Center for Technology for Advanced Scientific Component Software (TASCS) Lawrence Livermore National Laboratory - Site Status Update

    SciTech Connect

    Epperly, T W

    2008-12-03

    This report summarizes LLNL's progress for the period April through September of 2008 for the Center for Technology for Advanced Scientific Component Software (TASCS) SciDAC. The TASCS project is organized into four major thrust areas: CCA Environment (72%), Component Technology Initiatives (16%), CCA Toolkit (8%), and User and Application Outreach & Support (4%). The percentage of LLNL's effort allocation is shown in parentheses for each thrust area. Major thrust areas are further broken down into activity areas; LLNL's effort directed to each activity is shown in Figure 1. Enhancements, Core Tools, and Usability are all part of CCA Environment, and Software Quality is part of Component Technology Initiatives. The balance of this report will cover our accomplishments in each of these activity areas.

  11. Inhibition of Return and Object-Based Attentional Selection

    ERIC Educational Resources Information Center

    List, Alexandra; Robertson, Lynn C.

    2007-01-01

    Visual attention research has revealed that attentional allocation can occur in space- and/or object-based coordinates. Using the direct and elegant design of R. Egly, J. Driver, and R. Rafal (1994), the present experiments tested whether space- and object-based inhibition of return (IOR) emerge under similar time courses. The experiments were…

  12. Probability-Based Software for Grid Optimization: Improved Power System Operations Using Advanced Stochastic Optimization

    SciTech Connect

    2012-02-24

    GENI Project: Sandia National Laboratories is working with several commercial and university partners to develop software for market management systems (MMSs) that enable greater use of renewable energy sources throughout the grid. MMSs are used to securely and optimally determine which energy resources should be used to service energy demand across the country. Contributions of electricity to the grid from renewable energy sources such as wind and solar are intermittent, introducing complications for MMSs, which have trouble accommodating the multiple sources of price and supply uncertainties associated with bringing these new types of energy into the grid. Sandia’s software will bring a new, probability-based formulation to account for these uncertainties. By factoring in various probability scenarios for electricity production from renewable energy sources in real time, Sandia’s formula can reduce the risk of inefficient electricity transmission, save ratepayers money, conserve power, and support the future use of renewable energy.
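
    The idea of factoring probability scenarios into the dispatch decision can be shown with a toy two-stage calculation: a generation level is chosen before the wind outcome is known, and the expected cost weighs each wind scenario by its probability. The sketch below is not Sandia's MMS formulation; all quantities, costs, and probabilities are hypothetical.

```python
# A toy two-stage illustration of probability-weighted dispatch: choose
# conventional generation before the uncertain wind output is known,
# penalizing any unserved demand. All numbers are hypothetical.
import numpy as np

demand, gen_cost, shortfall_penalty = 100.0, 20.0, 500.0
wind_scenarios = np.array([10.0, 30.0, 60.0])   # possible wind output (MW)
probs = np.array([0.3, 0.5, 0.2])               # scenario probabilities

def expected_cost(g):
    """Generation cost plus probability-weighted shortfall penalty."""
    shortfall = np.maximum(0.0, demand - wind_scenarios - g)
    return gen_cost * g + probs @ (shortfall_penalty * shortfall)

g_grid = np.linspace(0.0, demand, 1001)          # simple grid search
best = g_grid[np.argmin([expected_cost(g) for g in g_grid])]
print(f"schedule {best:.1f} MW of conventional generation")
```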

  13. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    NASA Technical Reports Server (NTRS)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected systems accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real world vehicle/station configurations such as used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
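
    At its core, multilateration estimates a position from range measurements to stations at known coordinates. The following minimal sketch (not the MICRODOT software) solves that geometric estimation problem by nonlinear least squares on synthetic, noise-free data; station locations and the target position are invented.

```python
# A minimal multilateration sketch: estimate a vehicle position from ranges
# to known stations by nonlinear least squares. Coordinates are synthetic.
import numpy as np
from scipy.optimize import least_squares

stations = np.array([[0.0, 0.0, 0.0], [1000.0, 0.0, 0.0],
                     [0.0, 1000.0, 0.0], [0.0, 0.0, 1000.0]])
true_pos = np.array([300.0, 400.0, 500.0])
ranges = np.linalg.norm(stations - true_pos, axis=1)  # noise-free ranges

def residuals(p):
    """Difference between predicted and measured station-to-vehicle ranges."""
    return np.linalg.norm(stations - p, axis=1) - ranges

est = least_squares(residuals, x0=np.zeros(3)).x
print(np.round(est, 3))  # recovers true_pos
```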

  14. Parallel supercomputing: Advanced methods, algorithms and software for large-scale problems. Final report, August 1, 1987--July 31, 1994

    SciTech Connect

    Carey, G.F.; Young, D.M.

    1994-12-31

    The focus of the subject DOE sponsored research concerns parallel methods, algorithms, and software for complex applications such as those in coupled fluid flow and heat transfer. The research has been directed principally toward the solution of large-scale PDE problems using iterative solvers for finite differences and finite elements on advanced computer architectures. This work embraces parallel domain decomposition, element-by-element, spectral, and multilevel schemes with adaptive parameter determination, rational iteration and related issues. In addition to the fundamental questions related to developing new methods and mapping these to parallel computers, there are important software issues. The group has played a significant role in the development of software both for iterative solvers and also for finite element codes. The research in computational fluid dynamics (CFD) led to sustained multi-Gigaflop performance rates for parallel-vector computations of realistic large scale applications (not computational kernels alone). The main application areas for these performance studies have been two-dimensional problems in CFD. Over the course of this DOE sponsored research significant progress has been made. A report of the progression of the research is given and at the end of the report is a list of related publications and presentations over the entire grant period.

  15. Software Systems 2--Compiler and Operating Systems Lab--Advanced, Data Processing Technology: 8025.33.

    ERIC Educational Resources Information Center

    Dade County Public Schools, Miami, FL.

    The course outline has been prepared as a guide to help the student develop the skills and knowledge necessary to succeed in the field of data processing. By learning the purpose and principles of compiler programs and operating systems, the student will become familiar with advanced data processing procedures that are representative of computer…

  16. CalVR: an advanced open source virtual reality software framework

    NASA Astrophysics Data System (ADS)

    Schulze, Jürgen P.; Prudhomme, Andrew; Weber, Philip; DeFanti, Thomas A.

    2013-03-01

    We developed CalVR because none of the existing virtual reality software frameworks offered everything we needed, such as cluster-awareness, multi-GPU capability, Linux compatibility, multi-user support, collaborative session support, or custom menu widgets. CalVR combines features from multiple existing VR frameworks into an open-source system, which we use in our laboratory on a daily basis, and for which dozens of VR applications have already been written at UCSD and at other research laboratories worldwide. In this paper, we describe the philosophy behind CalVR, its standard and unique features and functions, its programming interface, and its inner workings.

  17. A USNRC perspective on the use of commercial-off-shelf software (COTS) in advanced reactors

    SciTech Connect

    Stewart, J.C.

    1997-12-01

    The use of commercially available digital computer systems and components in safety critical systems (nuclear power plant, military, and commercial applications) is increasing rapidly. While this paper focuses on the software aspects of the application, most of these comments are applicable to the hardware aspects as well. Commercial dedication (the process of assuring that a commercial grade item will perform its intended safety function) has demonstrated benefits in cost savings and a wide base of user experience; however, care must be taken to avoid difficulties with some aspects of the dedication process, such as access to vendor development information, configuration management, long-term support, and system integration.

  18. Advances in hardware, software, and automation for 193nm aerial image measurement systems

    NASA Astrophysics Data System (ADS)

    Zibold, Axel M.; Schmid, R.; Seyfarth, A.; Waechter, M.; Harnisch, W.; Doornmalen, H. v.

    2005-05-01

    A new, second generation AIMS fab 193 system has been developed which is capable of emulating lithographic imaging of any type of reticles such as binary and phase shift masks (PSM) including resolution enhancement technologies (RET) such as optical proximity correction (OPC) or scatter bars. The system emulates the imaging process by adjustment of the lithography equivalent illumination and imaging conditions of 193nm wafer steppers including circular, annular, dipole and quadrupole type illumination modes. The AIMS fab 193 allows a rapid prediction of wafer printability of critical mask features, including dense patterns and contacts, defects or repairs by acquiring through-focus image stacks by means of a CCD camera followed by quantitative image analysis. Moreover, the technology can be readily applied to directly determine the process window of a given mask under stepper imaging conditions. Since data acquisition is performed electronically, AIMS in many applications replaces the need for costly and time-consuming wafer prints using a wafer stepper/scanner followed by CD SEM resist or wafer analysis. The AIMS fab 193 second generation system is designed for 193nm lithography mask printing predictability down to the 65nm node. In addition to hardware improvements, a new modular AIMS software is introduced allowing for a fully automated operation mode. Multiple pre-defined points can be visited and through-focus AIMS measurements can be executed automatically in a recipe based mode. To increase the effectiveness of the automated operation mode, the throughput of the system to locate the area of interest, and to acquire the through-focus images is increased by almost a factor of two in comparison with the first generation AIMS systems. In addition, a new software plug-in concept is realised for the tools. One new feature has been successfully introduced as "Global CD Map", enabling automated investigation of global mask quality based on the local determination of…

  19. Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    2004-01-01

    Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.

  20. Advanced software development workstation. Comparison of two object-oriented development methodologies

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    This report is an attempt to clarify some of the concerns raised about the OMT method, specifically that OMT is weaker than the Booch method in a few key areas. This interim report specifically addresses the following issues: (1) is OMT object-oriented or only data-driven?; (2) can OMT be used as a front-end to implementation in C++?; (3) the inheritance concept in OMT is in contradiction with the 'pure and real' inheritance concept found in object-oriented (OO) design; (4) low support for software life-cycle issues, for project and risk management; (5) uselessness of functional modeling for the ROSE project; and (6) problems with event-driven and simulation systems. The conclusion of this report is that both Booch's method and Rumbaugh's method are good OO methods, each with strengths and weaknesses in different areas of the development process.

  1. Advanced Communications Technology Satellite high burst rate link evaluation terminal experiment control and monitor software maintenance manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document. The EC&M Software Maintenance Manual, Version 1.0 (NASA-CR-189161) is a programmer's guide that describes current implementation of the EC&M software from a technical perspective. An overview of the EC&M software, computer algorithms, format representation, and computer hardware configuration are included in the manual.

  2. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System (OBIS) was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  3. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various experiments by government, university, and industry agencies. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0 outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0 is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.

  4. An open software framework for advancement of x-ray optics simulation and modeling

    NASA Astrophysics Data System (ADS)

    Bruhwiler, David L.; Chubar, Oleg; Nagler, Robert; Krzywinski, Jacek; Boehnlein, Amber

    2014-09-01

    Accurate physical-optics based simulation of emission, transport and use in experiments of fully- and partially-coherent X-ray radiation is essential for both designers and users of experiments at state-of-the-art light sources: low-emittance storage rings, energy-recovery linacs and free-electron lasers. To be useful for different applications, the simulations must include accurate physical models for the processes of emission, for the structures of X-ray optical elements, interaction of the radiation with samples, and propagation of scattered X-rays to a detector. Based on the "Synchrotron Radiation Workshop" (SRW) open source computer code, we are developing a simulation framework, including a graphical user interface, web interface for client-server simulations, data format for wave-optics based representation of partially-coherent X-ray radiation, and a dictionary for universal description of optical elements. Also, we are evaluating formats for sample and experimental data representation for different types of experiments and processing. The simulation framework will facilitate start-to-end simulations by different computer codes complementary to SRW, for example GENESIS and FAST codes for simulating self-amplified spontaneous emission, SHADOW and McXtrace geometrical ray-tracing codes, as well as codes for simulation of interaction of radiation with matter and data processing in experiments exploiting coherence of radiation. The development of the new framework is building on components developed for the Python-based RadTrack software, which is designed for loose coupling of multiple electron and radiation codes to enable sophisticated workflows. We are exploring opportunities for collaboration with teams pursuing similar developments at European Synchrotron Radiation Facility and the European XFEL.

  5. User-specific interfaces for clinical data-management systems: an object-based approach.

    PubMed Central

    Wilton, R.

    1992-01-01

    Multiple user-specific visual interfaces are desirable in any computer-based clinical data-management system that is used by different people with different jobs to perform. The programming and maintenance problems of supporting multiple user interfaces to a single information system can be addressed by separating user-interface functionality from data-management subsystems, and by building user interfaces from object-based software components whose functionality is bound to an underlying server-client data-management architecture. Experience with this approach in a patient-tracking system suggests that this object-based approach is viable in the design of a user interface for a clinical information system. PMID:1482880

  6. Object-based attention in chimpanzees (Pan troglodytes).

    PubMed

    Ushitani, Tomokazu; Imura, Tomoko; Tomonaga, Masaki

    2010-03-17

    We conducted three experiments to investigate how object-based components contribute to the attentional processes of chimpanzees and to examine how such processes operate with regard to perceptually structured objects. In Experiment 1, chimpanzees responded to a spatial cueing task that required them to touch a target appearing at either end of two parallel rectangles. We compared the time involved in shifting attention (cost of attentional shift) when the locations of targets were cued and non-cued. Results showed that the cost of the attentional shift within one rectangle was smaller than that beyond the object's boundary, demonstrating object-based attention in chimpanzees. The results of Experiment 2, conducted with different stimulus configurations, replicated the results of Experiment 1, supporting the conclusion that object-based attention operates in chimpanzees. In Experiment 3, the cost of attentional shift within a cued but partly occluded rectangle was shorter than that within a rectangle that was cued but divided in the middle. The results suggest that the attention of chimpanzees is activated not only by an explicit object but also by fragmented patches represented as an object at a higher-order perceptual level. Chimpanzees' object-based attention may be similar to that of humans.

  7. Object based data access at the D0 experiment

    SciTech Connect

    Fuess, S.; D0 Collaboration

    1995-11-01

    The D0 Experiment at Fermilab is currently participating in the FNAL Computing Division's "Computing for Analysis Project" (CAP) to investigate object based data storage and access. Following a short description of the CAP system architecture, the D0 data model is explored. A brief discussion of the method of operation of the CAP system leads into a concluding section.

  8. Multiscale quantification of urban composition from EO-1/Hyperion data using object-based spectral unmixing

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun

    2016-05-01

    Quantification of the urban composition is important in urban planning and management. Previous research has primarily focused on unmixing medium-spatial resolution multispectral imagery using spectral mixture analysis (SMA) in order to estimate the abundance of urban components. For this study an object-based multiple endmember spectral mixture analysis (MESMA) approach was applied to unmix the 30-m Earth Observing-1 (EO-1)/Hyperion hyperspectral imagery. The abundance of two physical urban components (vegetation and impervious surface) was estimated and mapped at multiple scales and two defined geographic zones. The estimation results were validated by a reference dataset generated from fine spatial resolution aerial photography. The object-based MESMA approach was compared with its corresponding pixel-based one, and EO-1/Hyperion hyperspectral data was compared with the simulated EO-1/Advanced Land Imager (ALI) multispectral data in the unmixing modeling. The pros and cons of the object-based MESMA were evaluated. The result illustrates that the object-based MESMA is promising for unmixing the medium-spatial resolution hyperspectral imagery to quantify the urban composition, and it is an attractive alternative to the traditional pixel-based mixture analysis for various applications.
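
    A simplified version of the per-object unmixing step can be sketched as constrained linear unmixing of an object's mean spectrum: plain SMA with non-negative least squares rather than the full MESMA search over endmember combinations. The endmember spectra and object spectrum below are synthetic placeholders.

```python
# A simplified sketch of per-object linear unmixing (plain SMA via
# non-negative least squares, not the full MESMA endmember search).
import numpy as np
from scipy.optimize import nnls

# Columns: vegetation, impervious surface (two hypothetical endmembers),
# rows: four spectral bands.
E = np.array([[0.05, 0.20],
              [0.45, 0.25],
              [0.30, 0.30],
              [0.60, 0.35]])
obj_spectrum = 0.6 * E[:, 0] + 0.4 * E[:, 1]  # mean spectrum of one object

fractions, resid = nnls(E, obj_spectrum)      # non-negative abundances
fractions /= fractions.sum()                  # approximate sum-to-one
print(dict(zip(["vegetation", "impervious"], np.round(fractions, 2))))
```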

  9. Object-Based Classification and Change Detection of Hokkaido, Japan

    NASA Astrophysics Data System (ADS)

    Park, J. G.; Harada, I.; Kwak, Y.

    2016-06-01

    Topography and geology are factors that characterize the distribution of natural vegetation. Topographic form particularly influences the living conditions of plants, such as soil moisture, sunlight, and wind exposure. Vegetation associations having similar characteristics are present in locations having similar topographic conditions unless natural disturbances such as landslides and forest fires or artificial disturbances such as deforestation and man-made plantation bring about changes in such conditions. We developed a vegetation map of Japan using an object-based segmentation approach with topographic information (elevation, slope, slope direction) that is closely related to the distribution of vegetation. The results showed that object-based classification is more effective than pixel-based classification for producing a vegetation map.

  10. An object-based methodology for knowledge representation in SGML

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object-based methodology for knowledge representation and its Standard Generalized Markup Language (SGML) implementation is presented. The methodology includes class, perspective domain, and event constructs for representing knowledge within an object paradigm. The perspective construct allows for representation of knowledge from multiple and varying viewpoints. The event construct allows actual use of knowledge to be represented. The SGML implementation of the methodology facilitates usability, structured, yet flexible knowledge design, and sharing and reuse of knowledge class libraries.

  11. Object-Based Epistemology at a Creationist Museum

    NASA Astrophysics Data System (ADS)

    Wendel, Paul J.

    2011-01-01

    In a regional young-earth creationist museum, objects are presented as if they speak for themselves, purportedly embodying proof that the earth is less than 10,000 years old, that humans have lived on earth throughout its history, and that dinosaurs and humans lived simultaneously. In public lectures, tours, and displays, museum associates emphasize direct observation over inference or theory. These emphases resonate closely with the "object-based epistemology" of the late nineteenth century described in Steven Conn's Museums and American Intellectual Life, 1876-1926. In Conn's description, museum objects, artfully arranged and displayed, were intended to speak for themselves, and observation and categorization were valued over experiment and theory. The regional young-earth creationist museum is observed to partly succeed and partly fail in implementing an object-based epistemology. Although object-based epistemology represents a nineteenth-century approach to knowledge and museum display, it is compatible with an inductive approach to biblical interpretation and it confers various rhetorical advantages to creationist arguments. It is concluded that a focus on the theory-laden nature of data would likely strengthen nature-of-science education efforts to increase public acceptance of evolution.

  12. Object-based Landslide Mapping: Examples, Challenges and Opportunities

    NASA Astrophysics Data System (ADS)

    Hölbling, Daniel; Eisank, Clemens; Friedl, Barbara; Chang, Kang-Tsung; Tsai, Tsai-Tsung; Birkefeldt Møller Pedersen, Gro; Betts, Harley; Cigna, Francesca; Chiang, Shou-Hao; Aubrey Robson, Benjamin; Bianchini, Silvia; Füreder, Petra; Albrecht, Florian; Spiekermann, Raphael; Weinke, Elisabeth; Blaschke, Thomas; Phillips, Chris

    2016-04-01

    Over the last decade, object-based image analysis (OBIA) has been increasingly used for mapping landslides that occur after triggering events such as heavy rainfall. The increasing availability and quality of Earth Observation (EO) data in terms of temporal, spatial and spectral resolution allows for comprehensive mapping of landslides at multiple scales. Most often very high resolution (VHR) or high resolution (HR) optical satellite images are used in combination with a digital elevation model (DEM) and its products such as slope and curvature. Semi-automated object-based mapping makes use of various characteristics of image objects that are derived through segmentation. OBIA enables numerous spectral, spatial, contextual and textural image object properties to be applied during an analysis. This is especially useful when mapping complex natural features such as landslides and constitutes an advantage over pixel-based image analysis. However, several drawbacks in the process of object-based landslide mapping have not been overcome yet. The developed classification routines are often rather complex and limited regarding their transferability across areas and sensors. There is still more research needed to further improve present approaches and to fully exploit the capabilities of OBIA for landslide mapping. In this study several examples of object-based landslide mapping from various geographical regions with different characteristics are presented. Examples from the Austrian and Italian Alps are shown, whereby one challenge lies in the detection of small-scale landslides on steep slopes while preventing the classification of false positives with similar spectral properties (construction areas, utilized land, etc.). Further examples feature landslides mapped in Iceland, where the differentiation of landslides from other landscape-altering processes in a highly dynamic volcanic landscape poses a very distinct challenge, and in Norway, which is exposed to multiple…

  13. Earth Observing System (EOS)/Advanced Microwave Sounding Unit-A (AMSU-A) software assurance plan

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert; Smith, Claude

    1994-01-01

    This document defines the responsibilities of Software Quality Assurance (SQA) for the development of the flight software installed in EOS/AMSU-A instruments, and the ground support software used in the test and integration of the EOS/AMSU-A instruments.

  14. Advanced communications technology satellite high burst rate link evaluation terminal experiment control and monitor software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document.

  15. An object-based methodology for knowledge representation

    SciTech Connect

    Kelsey, R.L.; Hartley, R.T.; Webster, R.B.

    1997-11-01

    An object based methodology for knowledge representation is presented. The constructs and notation to the methodology are described and illustrated with examples. The ``blocks world,`` a classic artificial intelligence problem, is used to illustrate some of the features of the methodology including perspectives and events. Representing knowledge with perspectives can enrich the detail of the knowledge and facilitate potential lines of reasoning. Events allow example uses of the knowledge to be represented along with the contained knowledge. Other features include the extensibility and maintainability of knowledge represented in the methodology.

  16. Object-based indexing of MPEG-4 compressed video

    NASA Astrophysics Data System (ADS)

    Ferman, Ahmet M.; Gunsel, Bilge; Tekalp, A. Murat

    1997-01-01

    The MPEG-4 object-based coding standard, designed as a common platform for all multimedia applications, is inherently well-suited for video indexing applications. To fully exploit the advantages offered by MPEG-4, however, a reconsideration of existing indexing strategies is required. This paper proposes a new object-based framework for video indexing and retrieval that treats as the basic indexing unit the object itself, where changes in content are detected through observations made on the objects in the video sequence. We present a temporal segmentation algorithm that is designed to automatically extract key frames for each video object in an MPEG-4 compressed sequence based on the prediction model chosen by the encoder for individual macroblocks. An extension to the existing MPEG-4 syntax is presented for conducting and facilitating vast database searches. The data presented in the proposed 'indexing field' are: the birth and death frames of individual objects, global motion characteristics/camera operations observed in the scene, representative key frames that capture the major transformations each object undergoes, and the dominant motion characteristics of each object throughout its lifetime. We present the validity of the proposed scheme by results obtained on several MPEG-4 test sequences.
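
    The key-frame criterion described here, detecting content change per object from the encoder's macroblock prediction modes, can be caricatured in a few lines: when the share of intra-coded macroblocks in an object's frame jumps, flag a key frame. The sketch below does not parse MPEG-4 bitstreams; the mode lists and threshold are invented.

```python
# A schematic sketch of the idea (not an MPEG-4 bitstream parser): flag a
# key frame for an object when the share of intra-coded macroblocks is
# high, signalling a content change. Macroblock mode lists are synthetic.
def key_frames(mb_modes_per_frame, intra_ratio_threshold=0.5):
    """mb_modes_per_frame: per frame, a list of 'intra'/'inter' mode labels."""
    keys = []
    for i, modes in enumerate(mb_modes_per_frame):
        intra_ratio = modes.count("intra") / len(modes)
        if intra_ratio >= intra_ratio_threshold:
            keys.append(i)
    return keys

frames = [["intra"] * 9 + ["inter"],        # frame 0: object appears
          ["inter"] * 10,                   # steady motion, no key frame
          ["intra"] * 6 + ["inter"] * 4]    # large change: new key frame
print(key_frames(frames))  # -> [0, 2]
```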

  17. A Secure and Robust Object-Based Video Authentication System

    NASA Astrophysics Data System (ADS)

    He, Dajun; Sun, Qibin; Tian, Qi

    2004-12-01

    An object-based video authentication system, which combines watermarking, error correction coding (ECC), and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART) coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT) coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI).
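
    The watermark embedding principle, hiding information in the energy relationship of selected DFT coefficients, can be reduced to a one-bit toy example: force one coefficient's magnitude above or below its partner's. The coefficient positions, margin, and block size below are arbitrary choices for illustration, not the paper's parameters, and the ECC and hashing layers are omitted.

```python
# A greatly simplified sketch of hiding one bit by enforcing a magnitude
# relationship between a pair of mid-frequency DFT coefficients.
import numpy as np

P1, P2, MARGIN = (5, 7), (7, 5), 10.0  # hypothetical coefficient pair

def _set_mag(F, pos, mag):
    """Set |F[pos]|, mirroring the conjugate bin so ifft2 stays real."""
    n0, n1 = F.shape
    phase = np.exp(1j * np.angle(F[pos]))
    F[pos] = mag * phase
    F[-pos[0] % n0, -pos[1] % n1] = np.conj(F[pos])

def embed_bit(block, bit):
    F = np.fft.fft2(block.astype(np.float64))
    a, b = abs(F[P1]), abs(F[P2])
    if bit:
        _set_mag(F, P1, max(a, b + MARGIN))  # enforce |F[P1]| > |F[P2]|
    else:
        _set_mag(F, P2, max(b, a + MARGIN))  # enforce |F[P2]| > |F[P1]|
    return np.fft.ifft2(F).real

def extract_bit(block):
    F = np.fft.fft2(block.astype(np.float64))
    return abs(F[P1]) > abs(F[P2])

block = np.random.default_rng(3).integers(0, 256, (16, 16))
print(extract_bit(embed_bit(block, True)), extract_bit(embed_bit(block, False)))
```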

  18. Object-based attentional selection modulates anticipatory alpha oscillations.

    PubMed

    Knakker, Balázs; Weiss, Béla; Vidnyánszky, Zoltán

    2014-01-01

    Visual cortical alpha oscillations are involved in attentional gating of incoming visual information. It has been shown that spatial and feature-based attentional selection result in increased alpha oscillations over the cortical regions representing sensory input originating from the unattended visual field and task-irrelevant visual features, respectively. However, whether attentional gating in the case of object-based selection is also associated with alpha oscillations has not been investigated before. Here we measured anticipatory electroencephalography (EEG) alpha oscillations while participants were cued to attend to foveal face or word stimuli, the processing of which is known to have right and left hemispheric lateralization, respectively. The results revealed that in the case of simultaneously displayed, overlapping face and word stimuli, attending to the words led to increased power of parieto-occipital alpha oscillations over the right hemisphere as compared to when faces were attended. This object category-specific modulation of the hemispheric lateralization of anticipatory alpha oscillations was maintained during sustained attentional selection of sequentially presented face and word stimuli. These results imply that in the case of object-based attentional selection, similarly to spatial and feature-based attention, gating of visual information processing might involve visual cortical alpha oscillations.

  19. An Object-Based Method for Chinese Landform Types Classification

    NASA Astrophysics Data System (ADS)

    Ding, Hu; Tao, Fei; Zhao, Wufan; Na, Jiaming; Tang, Guo'an

    2016-06-01

    Landform classification is a necessary task for various fields of landscape and regional planning, for example landscape evaluation, erosion studies, and hazard prediction. This study proposes an improved object-based classification for Chinese landform types using the factor importance analysis of random forest and the gray-level co-occurrence matrix (GLCM). In this research, based on a 1 km DEM of China, the combination of terrain factors extracted from the DEM is selected by correlation analysis and Sheffield's entropy method. A random forest classification tree is applied to evaluate the importance of the terrain factors, which are used as multi-scale segmentation thresholds. The GLCM is then used to build the knowledge base for classification. The classification result was checked by using the 1:4,000,000 Chinese Geomorphological Map as reference. The overall classification accuracy of the proposed method is 5.7% higher than that of ISODATA unsupervised classification, and 15.7% higher than that of the traditional object-based classification method.
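
    The combination of random-forest factor ranking and GLCM texture can be sketched compactly with scikit-learn and scikit-image (assuming scikit-image >= 0.19, where the functions are spelled graycomatrix/graycoprops). The terrain features and class labels below are random placeholders, not the paper's data.

```python
# A compact sketch (not the paper's pipeline) of combining GLCM texture with
# terrain factors and ranking them with a random forest on placeholder data.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def glcm_contrast(patch_u8):
    """One GLCM texture measure (contrast) for an 8-bit image patch."""
    glcm = graycomatrix(patch_u8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

# 200 fake terrain objects: [mean elevation, mean slope, GLCM contrast]
X = np.array([[rng.uniform(0, 5000), rng.uniform(0, 40),
               glcm_contrast(rng.integers(0, 256, (16, 16), dtype=np.uint8))]
              for _ in range(200)])
y = rng.integers(0, 4, 200)  # four placeholder landform classes

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(dict(zip(["elevation", "slope", "glcm_contrast"],
               np.round(rf.feature_importances_, 3))))
```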

  20. Introducing shape constraints into object-based traveltime tomography

    NASA Astrophysics Data System (ADS)

    Gaullier, G.; Charbonnier, P.; Heitz, F.; Côte, P.

    2016-09-01

    Traveltime tomography is a difficult, ill-posed reconstruction problem due to the nonlinearity of the forward model and the limited number of measurements usually available. In such an adverse situation, pixel-based regularization methods are generally unable to provide satisfactory reconstructions. In this paper we propose a novel object-based reconstruction method that introduces prior information about the shape of the structures to be reconstructed, which yields high quality geoacoustic inversion. The proposed method approaches the forward model by a series of linear problems, leading to a sequence of minimizations during which the shape prior is introduced. The method is demonstrated on synthetic and real data, collected on a specific bench dedicated to non-destructive testing of civil engineering structures.
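
    The linearization strategy described above can be illustrated with one step of the sequence: solve a linear least-squares problem with a quadratic penalty pulling the solution toward a shape prior. This is a schematic stand-in for the authors' shape constraint, which is more sophisticated; the operator, model, and prior below are synthetic.

```python
# A schematic illustration of one linearized step with a quadratic shape
# prior: minimize ||A x - b||^2 + lam * ||x - x_shape||^2, which has a
# closed-form normal-equations solution. All data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(30, 100))                 # toy linearized operator
x_true = np.zeros(100); x_true[40:60] = 1.0    # slow anomaly (the "object")
b = A @ x_true                                 # noise-free traveltime data
x_shape = np.zeros(100); x_shape[38:62] = 1.0  # prior guess of the shape

lam = 1.0                                      # prior weight
lhs = A.T @ A + lam * np.eye(100)
x_rec = np.linalg.solve(lhs, A.T @ b + lam * x_shape)
print(np.round(x_rec[45:50], 2))               # close to 1 inside the object
```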

  1. Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A) software management plan

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert

    1994-01-01

    This document defines the responsibilities for the management of the life-cycle development of the flight software installed in the AMSU-A instruments, and the ground support software used in the test and integration of the AMSU-A instruments.

  2. Earth Observing System/Advanced Microwave Sounding Unit-A (EOS/AMSU-A): Software concept document

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert

    1994-01-01

    This document presents the concept for two separate but closely related software systems. The first is the special test equipment (STE) software used at Aerojet for AMSU-A instrument testing, and the second is the workstation software used at the spacecraft integration facility to monitor the AMSU-A instrument when installed on the spacecraft. The primary use of the STE software is to monitor the data output from the AMSU-A instruments, to command the instruments, and to perform automated thermal-vacuum calibration testing. The primary use of the workstation software is to monitor the AMSU-A instrument's performance through an Ethernet link during the instrument/spacecraft integration process.

  3. The Cementitious Barriers Partnership Experimental Programs and Software Advancing DOE’s Waste Disposal/Tank Closure Efforts – 15436

    SciTech Connect

    Burns, Heather; Flach, Greg; Smith, Frank; Langton, Christine; Brown, Kevin; Mallick, Pramod

    2015-01-27

    The U.S. Department of Energy Environmental Management (DOE-EM) Office of Tank Waste Management-sponsored Cementitious Barriers Partnership (CBP) is chartered with providing the technical basis for implementing cement-based waste forms and radioactive waste containment structures for long-term disposal. DOE needs in this area include the following to support progress in final treatment and disposal of legacy waste and closure of High-Level Waste (HLW) tanks in the DOE complex: long-term performance predictions, flow sheet development and flow sheet enhancements, and conceptual designs for new disposal facilities. The DOE-EM Cementitious Barriers Partnership is producing software and experimental programs resulting in new methods and data needed for end-users involved with environmental cleanup and waste disposal. Both the modeling tools and the experimental data have already benefited the DOE sites in the areas of performance assessments by increasing confidence backed up with modeling support, leaching methods, and transport properties developed for actual DOE materials. In 2014, the CBP Partnership released the CBP Software Toolbox, Version 2.0, which provides concrete degradation models for 1) sulfate attack, 2) carbonation, and 3) chloride-initiated rebar corrosion, and includes constituent leaching. These models are applicable and can be used by both DOE and the Nuclear Regulatory Commission (NRC) for service life and long-term performance evaluations and predictions of nuclear and radioactive waste containment structures across the DOE complex, including future SRS Saltstone and HLW tank performance assessments and special analyses, Hanford site HLW tank closure projects and other projects in which cementitious barriers are required, the Advanced Simulation Capability for Environmental Management (ASCEM) project which requires source terms from cementitious containment structures as input to their flow simulations, regulatory reviews of DOE performance…

  4. An Advanced Flaw-Response Modelling Approach for Inspection Qualification Using a Multi-Agent System Software Platform

    NASA Astrophysics Data System (ADS)

    McKenna, James P.; McLean, Neil; Gachagan, Anthony; McArthur, Stephen D. J.; Hayward, Gordon

    2008-02-01

    This paper describes the application of a Multi-Agent System used in the design and qualification of ultrasonic NDT inspections using theoretical ultrasonic flaw-response modelling. When a large number of models are available the selection of the most appropriate one for a given inspection scenario becomes time-consuming; the disparate nature of the software models prevents easy integration with other NDT software tools to automate this process. A prototype Inspection Qualification Multi-Agent System has been developed which incorporates a rule-based software system to perform the flaw-response model selection procedure.
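
    The rule-based model selection idea can be caricatured as an ordered list of condition/model pairs, evaluated until one fires. In the real system this logic lives inside a multi-agent platform; the rules, model names, and scenario fields in this Python sketch are invented for illustration.

```python
# A toy rule-based selector (the real system uses a multi-agent platform;
# the rules, model names, and scenario fields here are hypothetical).
def select_flaw_response_model(scenario):
    rules = [
        (lambda s: s["flaw_type"] == "crack" and s["geometry"] == "planar",
         "kirchhoff_approximation_model"),
        (lambda s: s["flaw_size_mm"] < s["wavelength_mm"],
         "born_approximation_model"),
        (lambda s: True, "full_elastodynamic_fem_model"),  # fallback rule
    ]
    for condition, model in rules:       # first matching rule wins
        if condition(scenario):
            return model

print(select_flaw_response_model(
    {"flaw_type": "crack", "geometry": "planar",
     "flaw_size_mm": 2.0, "wavelength_mm": 3.0}))
```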

  5. Object-based change detection: dimension of damage in residential areas of Abu Suruj, Sudan

    NASA Astrophysics Data System (ADS)

    Demharter, Timo; Michel, Ulrich; Ehlers, Manfred; Reinartz, Peter

    2011-11-01

    Given the importance of change detection, especially in the field of crisis management, this paper discusses the advantages of object-based change detection. The methods used in this project make it possible to coordinate relief actions strategically. The principal objective of this project was to develop an algorithm which allows rapid detection of damaged and destroyed buildings in the area of Abu Suruj. This Sudanese village is located in West-Darfur and has become the victim of civil war. The software eCognition Developer was used to perform an object-based change detection on two panchromatic Quickbird 2 images from two different time slots. The first image shows the area before, the second image shows the area after the massacres in this region. Classification of the huts of the Sudanese town Abu Suruj was achieved by first segmenting the huts and then classifying them on the basis of geometrical and brightness-related values. The huts were classified as "new", "destroyed" and "preserved" with the help of an automated algorithm. Finally, the results were presented in the form of a map which displays the different conditions of the huts. The accuracy of the project is validated by an accuracy assessment resulting in an overall classification accuracy of 90.50 percent. These change detection results allow aid organizations to provide quick and efficient help where it is needed the most.

  6. Geographic Object-Based Image Analysis - Towards a new paradigm

    NASA Astrophysics Data System (ADS)

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications, and asks whether or not this constitutes a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  7. A model of proto-object based saliency

    PubMed Central

    Russell, Alexander F.; Mihalaş, Stefan; von der Heydt, Rudiger; Niebur, Ernst; Etienne-Cummings, Ralph

    2013-01-01

    Organisms use the process of selective attention to optimally allocate their computational resources to the instantaneously most relevant subsets of a visual scene, ensuring that they can parse the scene in real time. Many models of bottom-up attentional selection assume that elementary image features, like intensity, color and orientation, attract attention. Gestalt psychologists, however, argue that humans perceive whole objects before they analyze individual features. This is supported by recent psychophysical studies that show that objects predict eye-fixations better than features. In this report we present a neurally inspired algorithm of object-based, bottom-up attention. The model rivals the performance of state-of-the-art non-biologically-plausible feature-based algorithms (and outperforms biologically plausible feature-based algorithms) in its ability to predict perceptual saliency (eye fixations and subjective interest points) in natural scenes. The model achieves this by computing saliency as a function of proto-objects that establish the perceptual organization of the scene. All computational mechanisms of the algorithm have direct neural correlates, and our results provide evidence for the interface theory of attention. PMID:24184601

  8. Object-based rapid change detection for disaster management

    NASA Astrophysics Data System (ADS)

    Thunig, Holger; Michel, Ulrich; Ehlers, Manfred; Reinartz, Peter

    2011-11-01

    Rapid change detection is used in cases of natural hazards and disasters. This analysis leads quickly to information about areas of damage. In certain cases, the lack of information after catastrophic events obstructs supporting measures within disaster management. Earthquakes, tsunamis, civil war, volcanic eruption, droughts and floods have much in common: people are directly affected, landscapes and buildings are destroyed. In every case geospatial data is necessary to gain knowledge as a basis for decision support. Where to go first? Which infrastructure is usable? How much area is affected? These are essential questions which need to be answered before appropriate, eligible help can be established. This study presents an innovative strategy to retrieve post-event information by use of an object-based change detection approach. Within a transferable framework, the developed algorithms can be implemented for a set of remote sensing data among different investigation areas. Several case studies are the base for the retrieved results. Through a coarse division into statistical parts and segmentation into meaningful objects, the framework is able to deal with different types of change. By means of an elaborated normalized temporal change index (NTCI), panchromatic datasets are used to extract areas which are destroyed, areas which were not affected, and areas which are newly developing where rebuilding has already started. The approach is also applicable to monitoring urban growth.
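
    The abstract does not define the NTCI, so the sketch below assumes a standard normalized-difference form over co-registered pre- and post-event panchromatic images; this is one plausible reading, not the authors' published formula, and the thresholds are arbitrary.

```python
# A sketch assuming a normalized-difference form for the temporal change
# index; the formula and thresholds are assumptions, not the paper's.
import numpy as np

def ntci(pre, post, eps=1e-9):
    """Per-pixel normalized temporal change index in [-1, 1]."""
    pre = pre.astype(np.float64)
    post = post.astype(np.float64)
    return (post - pre) / (post + pre + eps)

pre = np.array([[100, 100], [150, 40]], dtype=np.uint8)
post = np.array([[100, 30], [150, 160]], dtype=np.uint8)
index = ntci(pre, post)
destroyed = index < -0.3   # strong darkening: candidate damage
newly_built = index > 0.3  # strong brightening: candidate new construction
print(index.round(2), destroyed, newly_built, sep="\n")
```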

  9. Object-based approaches to image classification for hyperspatial and hyperspectral data

    NASA Astrophysics Data System (ADS)

    Sridharan, Harini

    The prime objective of this research is to develop a suitable object-based classifier for detailed land use/land cover (LULC) classification of remote sensing data with high spatial and spectral resolution. Owing to technical limitations, remote sensing data were available either at high spatial resolution (<4m) or with a moderate to hyperspectral resolution (>4 bands), but not with a combination of both until recently. Processing of the high spectral resolution imagery for LULC classification was predominantly pixel based due to the lack of sufficient spatial resolution for identifying individual objects. For high spatial resolution imagery, object-based analysis was devised that performed classification at the individual object level. But detailed object classification was restricted due to the limitations in the spectral resolution. Recently, the advancements in remote sensing technology have made hyperspectral imagery with high spatial resolution available that permits object-based processing of these datasets for a detailed LULC classification. However, currently available object-based classifiers are only modifications of the pixel-based classifiers developed for multispectral data. They are either parametric in nature with the assumption of Gaussian distribution and/or do not completely exploit the rich spectral information available in the hyperspectral imagery. This research proposes a supervised non-parametric fuzzy classifier that performs classification based on the object-level distribution of reflectance values. A fuzzy Kolmogorov-Smirnov based classifier is proposed that performs an object-to-object matching of the empirical distribution of the reflectance values of each object and derives a fuzzy membership grade for each class without any distributional assumptions. This object-based classification procedure was tested for its robustness on three different sensors with varying combinations of spectral and spatial resolutions. General land use/land cover…

  10. Advancing the discussion about systematic classroom behavioral observation, a product review of Tenny, J. (2010). eCOVE observation software. Pacific City, OR: eCOVE Software, LLC.

    PubMed

    Froiland, John Mark; Smith, Liana

    2014-05-01

    Applied child psychologists and behavioral consultants often use systematic behavioral observations to inform the psychological assessment and intervention development process for children referred for attention and hyperactivity problems. This article provides a review of the 2010 version of the eCOVE classroom observation software in terms of its utility in tracking the progress of children with attention and hyperactive behaviors and its use in evaluating teacher behaviors that may impede or promote children's attention and positive behavior. The eCOVE shows promise as an efficient tool for psychologists and behavioral consultants who want to evaluate the effects of interventions for children with symptoms of ADHD, ODD, mood disorders and learning disorders; however, some research-based improvements for future models are suggested. The reviewers also share their firsthand experience in using eCOVE to evaluate teacher and student behavior exhibited on a television show about teaching urban high school students and during a movie about an eccentric new kindergarten teacher. Rich examples are provided of using strategic behavioral observations to reveal how to improve the classroom environment so as to facilitate attention, motivation and positive behavior among youth. Broader implications for enhancing the use of systematic behavioral observations in the assessment of children and adolescents with attention disorders and related behavioral problems are discussed. Key issues are examined such as the use of behavioral observations during psychological consultation to prevent the previously found gender bias in referrals for ADHD. Using behavioral observations to enhance differential diagnosis is also discussed.

  11. Object-based detection of vehicles in airborne data

    NASA Astrophysics Data System (ADS)

    Schilling, Hendrik; Bulatov, Dimitri; Middelmann, Wolfgang

    2015-10-01

    Robust detection of vehicles in airborne data is a challenging task, since high variation in the object signatures - depending on data resolution - and an often small contrast between objects and background lead to high false classification rates and missed detections. Despite these facts, many applications require reliable results which can be obtained in a short time. In this paper, an object-based approach for vehicle detection in airborne laser scans (ALS) and photogrammetrically reconstructed 2.5D data is described. The focus of this paper lies on a robust object segmentation algorithm as well as the identification of features for a reliable separation between vehicles and background (all non-vehicle objects) in different scenes. The described method is based on three consecutive steps, namely object segmentation, feature extraction and supervised classification. In the first step, the 2.5D data is segmented and possible targets are identified. The segmentation process is based on morphological top-hat filtering, which retains areas that are smaller than a given filter size and higher (brighter) than their surroundings. This approach is chosen due to the low computational effort of the filter, which allows fast computation even for large areas. The next step is feature extraction: based on the initial segmentation, features are extracted for every identified object. In addition to frequently used features like height above ground, object area, or point distribution, more complex features like object planarity, entropy in the intensity image, and lineness measures are used. The last step is the classification of each object. For this purpose, a random forest (RF) classifier using the normalized features extracted in the previous step is chosen. RFs are suitable for high-dimensional and nonlinear problems, and in contrast to other approaches (e.g. the maximum likelihood classifier), RFs achieve good results even with relatively small training samples.
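
    A minimal Python sketch of the three-step pipeline, assuming a normalized digital surface model as input; the filter size, height threshold and reduced feature set are illustrative choices, not the paper's values.

      import numpy as np
      from scipy import ndimage
      from sklearn.ensemble import RandomForestClassifier

      def segment_candidates(ndsm, filter_size=9, height_thresh=0.5):
          """Top-hat filtering keeps structures smaller than the filter
          window and higher than their surroundings (vehicle candidates)."""
          tophat = ndimage.white_tophat(ndsm, size=filter_size)
          labels, n = ndimage.label(tophat > height_thresh)
          return labels, n

      def object_features(ndsm, labels, n):
          """A reduced per-object feature set; the paper's richer features
          (planarity, intensity entropy, lineness) are omitted here."""
          feats = []
          for i in range(1, n + 1):
              mask = labels == i
              heights = ndsm[mask]
              feats.append([mask.sum(),       # object area in pixels
                            heights.mean(),   # mean height above ground
                            heights.std()])   # height variation
          return np.asarray(feats)

      # Supervised classification on the normalized features:
      clf = RandomForestClassifier(n_estimators=100)
      # clf.fit(X_train, y_train); clf.predict(object_features(...))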

  12. Advanced Resistive Exercise Device (ARED) Flight Software (FSW): A Unique Approach to Exercise in Long Duration Habitats

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark

    2005-01-01

    ARED flight instrumentation software is associated with an overall custom designed resistive exercise system that will be deployed on the International Space Station (ISS). This innovative software application fuses together many diverse and new technologies into a robust and usable package. The software takes advantage of touchscreen user interface technology by providing a graphical user interface on a Windows based tablet PC, meeting a design constraint of keyboard-less interaction with flight crewmembers. The software interacts with modified commercial data acquisition (DAQ) hardware to acquire multiple channels of sensor measurements from the ARED device. This information is recorded on the tablet PC and made available, via International Space Station (ISS) Wireless LAN (WLAN) and telemetry subsystems, to ground based mission medics and trainers for analysis. The software includes a feature to accept electronically encoded prescriptions of exercises that guide crewmembers through a customized regimen of resistive weight training, based on personal analysis. These electronically encoded prescriptions are provided to the crew via ISS WLAN and telemetry subsystems. All personal data is securely associated with an individual crew member, based on a PIN ID mechanism.

  13. Scheduling Software

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Advanced Scheduling Environment is a software product designed and marketed by AVYX, Inc. to provide scheduling solutions for complex manufacturing environments. It can be adapted to specific scheduling and manufacturing processes and has led to substantial cost savings. The system was originally developed for NASA use in scheduling Space Shuttle flights and satellite activities. AVYX, Inc. is an offshoot of a company formed to provide computer-related services to NASA. TREES-plus, the company's initial product, became the programming language for the Advanced Scheduling Environment system.

  14. Object-Based Change Detection Using Georeferenced UAV Images

    NASA Astrophysics Data System (ADS)

    Shi, J.; Wang, J.; Xu, Y.

    2011-09-01

    Unmanned aerial vehicles (UAV) have been widely used to capture and down-link real-time videos/images. However, their role as a low-cost airborne platform for capturing high-resolution, geo-referenced still imagery has not been fully utilized. The images obtained from UAVs are advantageous over remote sensing images as they can be obtained at low cost and potentially no risk to human life. However, these images are distorted by the noise generated by the rotary wings, which limits their usefulness. One potential application of such images is to detect changes between images of the same area collected over time. Change detection is of widespread interest due to a large number of applications, including surveillance and civil infrastructure. Although UAVs can provide high-resolution images in a portable and easy way, such images cover only small parts of the entire field of interest and often exhibit strong deformation. Until now, change detection has rarely been applied to UAV images, and the traditional pixel-based change detection methods do not give satisfactory results for such images. In this paper, we propose a novel object-based method for change detection using UAV images which can overcome the effect of deformation and fully utilize the high resolution of UAV images. The developed method can be divided into five main blocks: pre-processing, image matching, image segmentation and feature extraction, change detection, and accuracy evaluation. The pre-processing step is further divided into two sub-steps: the first is to geometrically correct the bi-temporal images based on the geo-reference information (GPS/INS) of the UAV system, and the second is radiometric normalization using a histogram method. The image matching block uses the well-known scale-invariant feature transform (SIFT) algorithm to match the same areas in the images and then resample them. The
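
    The SIFT-based matching block could look roughly as follows with OpenCV; the ratio-test threshold and the homography model are common defaults assumed here, not details taken from the paper.

      import cv2
      import numpy as np

      def match_and_warp(img_t1, img_t2, ratio=0.75):
          """Register bi-temporal UAV frames: SIFT keypoints, ratio-test
          filtering, RANSAC homography, then resample t2 onto t1."""
          sift = cv2.SIFT_create()
          k1, d1 = sift.detectAndCompute(img_t1, None)
          k2, d2 = sift.detectAndCompute(img_t2, None)
          # Lowe's ratio test keeps only distinctive correspondences.
          good = [m for m, n in cv2.BFMatcher().knnMatch(d2, d1, k=2)
                  if m.distance < ratio * n.distance]
          src = np.float32([k2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
          dst = np.float32([k1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
          H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
          h, w = img_t1.shape[:2]
          return cv2.warpPerspective(img_t2, H, (w, h))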

  15. Payload software technology: Software technology development plan

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Programmatic requirements for the advancement of software technology are identified for meeting the space flight requirements in the 1980 to 1990 time period. The development items are described, and software technology item derivation worksheets are presented along with the cost/time/priority assessments.

  16. An object-based approach to hierarchical classification of the Earth's topography from SRTM data

    NASA Astrophysics Data System (ADS)

    Eisank, C.; Dragut, L.

    2012-04-01

    Digital classification of the Earth's surface has significantly benefited from the availability of global DEMs and recent advances in image processing techniques. One such innovative approach is object-based analysis, which integrates multi-scale segmentation and rule-based classification. Since the classification is based on spatially configured objects and no longer on solely thematically defined cells, the resulting landforms or landform types are represented in a more realistic way. Up to now, however, the object-based approach has not been adopted for broad-scale topographic modelling. Existing global to almost-global terrain classification systems have been implemented on per-cell schemes, accepting disadvantages such as the speckled character of the outputs and the non-consideration of space. We introduce the first object-based method to automatically classify the Earth's surface, as represented by the SRTM, into a three-level hierarchy of topographic regions. The new method relies on the concept of decomposing land-surface complexity into ever more homogeneous domains. The SRTM elevation layer is automatically segmented and classified at three levels that represent domains of complexity by using self-adaptive, data-driven techniques. For each domain, scales in the data are detected with the help of local variance, and segmentation is performed at these recognised scales. Objects resulting from segmentation are partitioned into sub-domains based on thresholds given by the mean values of elevation and the standard deviation of elevation, respectively. Results resemble patterns of existing global and regional classifications, displaying a level of detail close to manually drawn maps. Statistical evaluation indicates that most of the classes satisfy the regionalisation requirements of maximising internal homogeneity while minimising external homogeneity. Most objects have boundaries matching natural discontinuities at the regional level. The method is simple and fully
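
    A simplified sketch of the scale-detection step: local variance is computed over growing windows, and a drop in its relative rate of change is taken as a characteristic scale. The 0.01 cutoff and window range are illustrative assumptions, not values from the paper.

      import numpy as np
      from scipy import ndimage

      def mean_local_variance(dem, window):
          """Mean over the scene of the per-pixel moving-window variance."""
          m = ndimage.uniform_filter(dem, size=window)
          m2 = ndimage.uniform_filter(dem * dem, size=window)
          return float(np.mean(m2 - m * m))

      def characteristic_scales(dem, windows=range(3, 61, 2), cutoff=0.01):
          """Windows where the relative growth of local variance drops
          below the cutoff are taken as candidate segmentation scales."""
          lv = np.array([mean_local_variance(dem, w) for w in windows])
          roc = np.diff(lv) / lv[:-1]
          return [w for w, r in zip(list(windows)[1:], roc) if r < cutoff]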

  17. Combining TerraSAR-X and SPOT-5 data for object-based landslide detection

    NASA Astrophysics Data System (ADS)

    Friedl, B.; Hölbling, D.; Füreder, P.

    2012-04-01

    Landslide detection and classification is an essential requirement in pre- and post-disaster hazard analysis. In earlier studies landslide detection was often achieved through time-consuming and cost-intensive field surveys and visual orthophoto interpretation. Recent studies show that Earth Observation (EO) data offer new opportunities for fast, reliable and accurate landslide detection and classification, which can contribute to effective landslide monitoring and landslide hazard management. To ensure the fast recognition and classification of landslides at a regional scale, a (semi-)automated object-based landslide detection approach is established for a study site situated in the Huaguoshan catchment, Southern Taiwan. The study site exhibits a high vulnerability to landslides and debris flows, which are predominantly typhoon-induced. Through the integration of optical satellite data (SPOT-5 with 2.5 m GSD), SAR (Synthetic Aperture Radar) data (TerraSAR-X Spotlight with 2.95 m GSD) and digital elevation information (DEM with 5 m GSD), including its derived products (e.g. slope, curvature, flow accumulation), landslides can be examined more efficiently than by relying on single data sources alone. The combination of optical and SAR data in an object-based image analysis (OBIA) domain for landslide detection and classification has not been investigated so far, even though SAR imagery shows valuable properties for landslide detection that differ from those of optical data (e.g. high sensitivity to surface roughness and soil moisture). The main purpose of this study is to recognize and analyze existing landslides by applying object-based image analysis making use of eCognition software. OBIA provides a framework for examining features defined by spectral, spatial, textural, contextual as well as hierarchical properties. Objects are derived through image segmentation and serve as input for the classification process, which relies on transparent rulesets, representing knowledge
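
    A toy ruleset in this spirit might combine a vegetation index, slope and SAR backscatter as below; all thresholds are hypothetical placeholders, not the study's calibrated rules.

      import numpy as np

      def landslide_candidates(ndvi, slope_deg, sar_backscatter=None):
          """Fresh landslides expose bare soil (low NDVI) on steep slopes;
          SAR backscatter can refine the mask via surface roughness."""
          mask = (ndvi < 0.2) & (slope_deg > 15.0)
          if sar_backscatter is not None:
              mask &= sar_backscatter > np.nanpercentile(sar_backscatter, 60)
          return mask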

  18. Revealing text in a complexly rolled silver scroll from Jerash with computed tomography and advanced imaging software

    NASA Astrophysics Data System (ADS)

    Hoffmann Barfod, Gry; Larsen, John Møller; Lichtenberger, Achim; Raja, Rubina

    2015-12-01

    Throughout Antiquity magical amulets written on papyri, lead and silver were used for apotropaic reasons. While papyri often can be unrolled and deciphered, metal scrolls, usually very thin and tightly rolled up, cannot easily be unrolled without damaging the metal. This leaves us with unreadable results due to the damage done or with the decision not to unroll the scroll. The texts vary greatly and tell us about the cultural environment and local as well as individual practices at a variety of locations across the Mediterranean. Here we present the methodology and the results of the digital unfolding of a silver sheet from Jerash in Jordan from the mid-8th century CE. The scroll was inscribed with 17 lines in presumed pseudo-Arabic as well as some magical signs. The successful unfolding shows that it is possible to digitally unfold complexly folded scrolls, but that it requires a combination of the know-how of the software and linguistic knowledge.

  19. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  20. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR (case-based reasoning) as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.
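
    The retrieve-and-adapt core of case-based reasoning can be sketched in a few lines; the toy similarity measure and case base below are invented for illustration and are unrelated to the actual CBR Express internals.

      from difflib import SequenceMatcher

      def retrieve(case_base, problem, k=1):
          """Rank stored cases by textual similarity to the new problem."""
          return sorted(case_base,
                        key=lambda c: SequenceMatcher(None, c["problem"],
                                                      problem).ratio(),
                        reverse=True)[:k]

      cases = [{"problem": "printer offline after driver update",
                "solution": "roll back driver, restart spooler"},
               {"problem": "reservation terminal freezes on login",
                "solution": "clear session cache, re-sync clock"}]
      best = retrieve(cases, "terminal freezes when agent logs in")[0]
      print(best["solution"])   # the stored solution to adapt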

  1. Payload software technology

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.

  2. Revealing text in a complexly rolled silver scroll from Jerash with computed tomography and advanced imaging software

    PubMed Central

    Hoffmann Barfod, Gry; Larsen, John Møller; Raja, Rubina

    2015-01-01

    Throughout Antiquity magical amulets written on papyri, lead and silver were used for apotropaic reasons. While papyri often can be unrolled and deciphered, metal scrolls, usually very thin and tightly rolled up, cannot easily be unrolled without damaging the metal. This leaves us with unreadable results due to the damage done or with the decision not to unroll the scroll. The texts vary greatly and tell us about the cultural environment and local as well as individual practices at a variety of locations across the Mediterranean. Here we present the methodology and the results of the digital unfolding of a silver sheet from Jerash in Jordan from the mid-8th century CE. The scroll was inscribed with 17 lines in presumed pseudo-Arabic as well as some magical signs. The successful unfolding shows that it is possible to digitally unfold complexly folded scrolls, but that it requires a combination of the know-how of the software and linguistic knowledge. PMID:26648504

  3. Software for the Canadian Advanced Nanospace eXperiment-4/5 (CanX-4/-5) Mission

    NASA Astrophysics Data System (ADS)

    Leonard, Matthew Leigh

    The CanX-4 and CanX-5 mission currently under development at the University of Toronto Institute for Aerospace Studies Space Flight Laboratory (UTIAS/SFL) is a challenging formation-flying technology demonstration. Its requirements of sub-metre control accuracy have yet to be realized with nanosatellites. Many large technical challenges must be addressed in order to ensure the success of the CanX-4/5 mission, including the development of software for an intersatellite communication system, the integration and optimization of key formation-flying algorithms on the Payload On-Board Computer, and the development of a Hardware-In-The-Loop simulator for full on-orbit mission simulations. This thesis provides background on the Space Flight Laboratory and its activities and on the CanX-4/5 mission, and highlights the author's contributions to overcoming each of these technical challenges and ensuring the success of the CanX-4 and CanX-5 mission.

  4. Status Report on the Development of Micro-Scheduling Software for the Advanced Outage Control Center Project

    SciTech Connect

    Shawn St. Germain; Kenneth Thomas; Ronald Farris; Jeffrey Joe

    2014-09-01

    The long-term viability of existing nuclear power plants (NPPs) in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1-1.5 million/day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are difficult to coordinate. Finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center project is a research and development (R&D) demonstration activity under the Light Water Reactor Sustainability (LWRS) Program. LWRS is an R&D program that works with industry R&D programs to establish the technical foundations for licensing and managing the long-term, safe, and economical operation of current NPPs. The Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, this INL R&D project is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report describes specific recent efforts to develop a capability called outage Micro-Scheduling. Micro-Scheduling is the ability to allocate and schedule outage support task resources on a sub-hour basis. Micro-Scheduling is the real-time fine-tuning of the outage schedule to react to the actual progress of the primary outage activities to ensure that support task resources are

  5. Optimizing Object-Based Classification in Urban Environments Using Very High Resolution GEOEYE-1 Imagery

    NASA Astrophysics Data System (ADS)

    Aguilar, M. A.; Vicente, R.; Aguilar, F. J.; Fernández, A.; Saldaña, M. M.

    2012-07-01

    The latest breed of very high resolution (VHR) commercial satellites opens new possibilities for cartographic and remote sensing applications. In fact, one of the most common applications of remote sensing images is the extraction of land cover information for digital image base maps by means of classification techniques. When VHR satellite images are used, an object-based classification strategy can potentially improve classification accuracy compared to pixel-based classification. The aim of this work is to assess classification accuracy in urban environments using pansharpened and panchromatic GeoEye-1 orthoimages. Specifically, the influence of the sets of image object (IO) features used on object-based supervised classification accuracy is evaluated for the selected land cover classes. For the classification phase, the nearest neighbour classifier and the eCognition v. 8 software were used with seven sets of IO features, including texture, geometry and principal layer value features. The IOs were attained in eCognition using a multiresolution segmentation approach, a bottom-up region-merging technique starting with one-pixel objects. Four different sets or repetitions of training samples, each representing 10% of every class, were extracted from the IOs, while the remaining objects were used for accuracy validation. A statistical test was carried out in order to strengthen the conclusions. An overall accuracy of 79.4% was attained with the panchromatic, red, blue, green and near infrared (NIR) bands from the panchromatic and pansharpened orthoimages, the brightness computed for the red, blue, green and infrared bands, the Maximum Difference, the mean of a soil-adjusted vegetation index (SAVI), and, finally, the normalized Digital Surface Model or Object Model (nDSM) computed from LiDAR data. For building classification, the nDSM was the most important feature, attaining producer and user
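
    The experimental setup (nearest neighbour classification of image objects from a 10% training sample, repeated four times) can be approximated with a 1-NN classifier as below; the feature matrix X and labels y stand in for the paper's IO feature sets and reference classes.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier

      # X: one row of IO features per image object (spectral means,
      # texture, geometry, nDSM...); y: reference class per object.
      def one_repetition(X, y, seed):
          X_tr, X_te, y_tr, y_te = train_test_split(
              X, y, train_size=0.10, stratify=y, random_state=seed)
          clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)
          return clf.score(X_te, y_te)   # overall accuracy

      # accuracies = [one_repetition(X, y, s) for s in range(4)]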

  6. A novel mask proximity correction software combining accuracy and reduced writing time for the manufacturing of advanced photomasks

    NASA Astrophysics Data System (ADS)

    Schiavone, Patrick; Martin, Luc; Browning, Clyde; Farys, Vincent; Sundermann, Frank; Narukawa, Shogo; Takikawa, Tadahiko; Hayashi, Naoya

    2012-06-01

    The new generations of photomasks bring more and more challenges to the mask manufacturer. Maskshops face two conflicting requirements, namely improving pattern fidelity and reducing, or at least maintaining, acceptable writing time. These requirements are becoming more and more challenging as pattern sizes continuously shrink and data volumes continuously grow. Although classical dose-modulation proximity effect correction is able to provide sufficient process control for mainstream products, a growing body of published and wafer data shows that the mask process is becoming a non-negligible contributor to 28 nm technology yield. We show in this paper that a novel approach to mask proximity effect correction is able to meet the dual challenge of the new generation of masks. Unlike the classical approach, the technique presented in this paper is based on a concurrent optimization of the dose and geometry of the fractured shots. Adding one more parameter makes it possible to provide the best compromise between accuracy and writing time, since energy latitude can be taken into account as well. This solution is implemented in the Inscale software package from Aselta Nanographics. We have assessed the capability of this technology on several levels of a 28 nm technology. On this set, the writing time was reduced by up to 25% without sacrificing accuracy, which at the same time was improved significantly compared to the existing process. The experiments presented in the paper confirm that a versatile proximity effect correction strategy combining dose and geometry modulation helps users trade off between resolution/accuracy and e-beam write time.

  7. A Symphony of Software.

    ERIC Educational Resources Information Center

    Currents, 2002

    2002-01-01

    Offers a descriptive table of databases that help higher education institutions orchestrate advancement operations. Information includes vendor, contact, software, price, database engine/server platform, recommended reporting tools, record capacity, and client type. (EV)

  8. Biological Imaging Software Tools

    PubMed Central

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  9. Cathodoluminescence Spectrum Imaging Software

    2011-04-07

    The software developed for spectrum imaging is applied to the analysis of the spectrum series generated by our cathodoluminescence instrumentation. This software provides advanced processing capabilities such as: reconstruction of photon intensity (resolved in energy) and photon energy maps, extraction of the spectrum from selected areas, a quantitative imaging mode, pixel-to-pixel correlation, spectrum line scans, ASCII output, filling routines, drift correction, etc.
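
    The two central spectrum-imaging operations, energy-resolved photon maps and spectra extracted from selected areas, reduce to simple reductions over a (rows, cols, channels) datacube; a generic NumPy sketch, not the instrument software itself.

      import numpy as np

      def photon_energy_map(cube, energies, e_lo, e_hi):
          """cube: (rows, cols, channels) spectrum image. Integrate the
          intensity over an energy window to map that emission."""
          sel = (energies >= e_lo) & (energies <= e_hi)
          return cube[:, :, sel].sum(axis=2)

      def roi_spectrum(cube, mask):
          """Mean spectrum over a selected area (2-D boolean mask)."""
          return cube[mask].mean(axis=0)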

  10. The Development of Objective-Based Instructional Programs in Career Education.

    ERIC Educational Resources Information Center

    Hambleton, Ronald K.; Stetz, Frank P.

    1979-01-01

    Outlines the specifications for objective-based programs in career education, describes the development of one component (an item and objective bank) of an objective-based woodworking program, and discusses the development of objectives and test items and the utilization of criterion-referenced testing data. A sample of the woodworking objectives…

  11. Keeping it Together: Advanced algorithms and software for magma dynamics (and other coupled multi-physics problems)

    NASA Astrophysics Data System (ADS)

    Spiegelman, M.; Wilson, C. R.

    2011-12-01

    A quantitative theory of magma production and transport is essential for understanding the dynamics of magmatic plate boundaries, intra-plate volcanism and the geochemical evolution of the planet. It also provides one of the most challenging computational problems in solid Earth science, as it requires consistent coupling of fluid and solid mechanics together with the thermodynamics of melting and reactive flows. Considerable work on these problems over the past two decades shows that small changes in assumptions of coupling (e.g. the relationship between melt fraction and solid rheology) can have profound effects on the behavior of these systems, which in turn affects critical computational choices such as discretizations, solvers and preconditioners. Making progress in exploring and understanding this physically rich system requires a computational framework that allows a more flexible, high-level description of multi-physics problems as well as increased flexibility in composing efficient algorithms for the solution of the full non-linear coupled system. Fortunately, recent advances in available computational libraries and algorithms provide a platform for implementing such a framework. We present results from a new model building system that leverages functionality from both the FEniCS project (www.fenicsproject.org) and PETSc libraries (www.mcs.anl.gov/petsc) along with a model-independent options system and GUI, Spud (amcg.ese.ic.ac.uk/Spud). Key features from FEniCS include fully unstructured FEM with a wide range of elements; a high-level language (UFL) and code generation compiler (FFC) for describing the weak forms of residuals; and automatic differentiation for the calculation of exact and approximate Jacobians. The overall strategy is to monitor/calculate residuals and Jacobians for the entire non-linear system of equations within a global non-linear solve based on PETSc's SNES routines. PETSc already provides a wide range of solvers and preconditioners, from
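
    For flavour, a minimal legacy-FEniCS (dolfin) example of the high-level weak-form style the abstract describes, with a simple Poisson problem standing in for the far more complex coupled magma equations.

      from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction,
                          TestFunction, Function, Constant, DirichletBC,
                          dot, grad, dx, solve)

      mesh = UnitSquareMesh(32, 32)
      V = FunctionSpace(mesh, "P", 1)
      u, v = TrialFunction(V), TestFunction(V)
      f = Constant(1.0)
      a = dot(grad(u), grad(v)) * dx   # UFL weak form; FFC generates the code
      L = f * v * dx
      bc = DirichletBC(V, Constant(0.0), "on_boundary")
      u_h = Function(V)
      solve(a == L, u_h, bc)           # linear solve via the PETSc backend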

  12. Exploring the relationship between object realism and object-based attention effects.

    PubMed

    Roque, Nelson; Boot, Walter R

    2015-09-01

    Visual attention prioritizes processing of locations in space, and evidence also suggests that the benefits of attention can be shaped by the presence of objects (object-based attention). However, the prevalence of object-based attention effects has been called into question recently by evidence from a large-sampled study employing classic attention paradigms (Pilz et al., 2012). We conducted two experiments to explore factors that might determine when and if object-based attention effects are observed, focusing on the degree to which the concreteness and realism of objects might contribute to these effects. We adapted the classic attention paradigm first reported by Egly, Driver, and Rafal (1994) by replacing abstract bar stimuli in some conditions with objects that were more concrete and familiar to participants: items of silverware. Furthermore, we varied the realism of these items of silverware, presenting either cartoon versions or photo-realistic versions. Contrary to predictions, increased realism did not increase the size of object-based effects. In fact, no clear object-based effects were observed in either experiment, consistent with previous failures to replicate these effects in similar paradigms. While object-based attention may exist, and may have important influences on how we parse the visual world, these and other findings suggest that the two-object paradigm typically relied upon to study object-based effects may not be the best paradigm to investigate these issues.

  13. Selecting Software.

    ERIC Educational Resources Information Center

    Pereus, Steven C.

    2002-01-01

    Describes a comprehensive computer software selection and evaluation process, including documenting district needs, evaluating software packages, weighing the alternatives, and making the purchase. (PKP)

  14. Estimation of Trees Outside Forests using IRS High Resolution data by Object Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Pujar, G. S.; Reddy, P. M.; Reddy, C. S.; Jha, C. S.; Dadhwal, V. K.

    2014-11-01

    Assessment of trees outside forests (TOF) is widely recognized as a pivotal theme in sustainable natural resource management, due to their role in offering a variety of goods, such as timber, fruits and fodder, as well as services like water, carbon and biodiversity. Forest conservation efforts involving the reduction of deforestation and degradation may have to rely increasingly on the alternatives provided by TOF in catering to economic demands at forest edges. Spatial information systems involving imaging, analysis and monitoring to achieve objectives under protocols like REDD+ require the incorporation of information from areas under forest as well as trees outside forests to aid holistic decisions. In this perspective, automated retrieval of information on the area under trees growing outside forests, using high resolution imaging, is essential to strengthen the measurement and verification of extant carbon pools. Retrieval of this tree cover is demonstrated here using object-based image analysis in a forest edge of the dry deciduous forests of the Eastern Ghats, in the Khammam district of Telangana state, India. IRS high resolution panchromatic 2.5 m data (orthorectified Cartosat-1), used in tandem with 5.8 m multispectral LISS IV data, discern tree crowns and clusters at a detailed scale, and hence a semi-automated approach is attempted to classify TOF from an image pair from a relatively crop- and cloud-free season. The object-based image analysis (OBIA) approach, as implemented in the commercial eCognition suite (ver. 8.9), consists of segmentation at a user-defined scale followed by the application of a wide range of spectral, textural and object-geometry-based parameters for classification. The software offers an innovative blend of raster and vector features that can be juxtaposed flexibly across scales, horizontally or vertically. Segmentation was carried out at multiple scales to discern first the major land covers, such as forest, water and agriculture, followed by that at a finer

  15. Object-based glacier mapping in the Hohe Tauern Mountains of Austria

    NASA Astrophysics Data System (ADS)

    Aubrey Robson, Benjamin; Hölbling, Daniel; Nuth, Christopher; Olaf Dahl, Svein

    2015-04-01

    Up-to-date and frequent glacier outlines are a necessity for many applications within glaciology. While multispectral band ratios are a comparatively robust method for automatically classifying clean ice on a pixel-based level, semi- or fully automated glacier inventories are complicated by spectral similarities between classes such as debris-covered glacier ice and the surrounding bedrock and moraines, or between clean ice and turbid pro-glacial water. Most glacier inventories therefore require a great deal of manual correction. Here, we present a glacier inventory of the Hohe Tauern Mountains in the Central Eastern Alps in Austria. Numerous glaciers, including the Pasterze Glacier, which is the longest glacier in the Eastern Alps, shape this mountainous region. The mapping of glaciers is based on object-based image analysis (OBIA) using both high resolution (HR) satellite imagery from Landsat 8 and a digital elevation model (DEM) derived from Airborne Laser Scanning (ALS) data. We automatically classify clean ice, debris-covered ice and glacial lakes. Image objects are created by applying the multiresolution segmentation algorithm implemented in the eCognition (Trimble) software. The resulting image objects are classified using a combination of various features, whereby a focus was put on the selection of robust features that are ideally applicable for mapping large areas, for example spectral indices such as the Normalized Difference Vegetation Index (NDVI), Normalized Difference Snow and Ice Index (NDSI), Normalised Difference Water Index (NDWI), Land and Water Mask (LWK) and a ratio of the SWIR and NIR spectral bands. The ability of OBIA to incorporate optical and elevation data and to individually address data-specific characteristics helps differentiate debris-covered ice from surrounding features not only by using spectral properties but also based on morphological and topographic parameters, while the inclusion of rulesets relying on contextuality, size
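
    The band-ratio core of such a workflow is compact; a sketch assuming float reflectance bands, with scene-dependent thresholds that are illustrative only.

      import numpy as np

      def clean_ice_mask(green, swir, nir, red, ndsi_thresh=0.4):
          """NDSI flags snow/ice; NDVI rejects vegetated pixels. Inputs
          are float reflectance bands; thresholds are scene-dependent."""
          ndsi = (green - swir) / (green + swir + 1e-6)
          ndvi = (nir - red) / (nir + red + 1e-6)
          return (ndsi > ndsi_thresh) & (ndvi < 0.1)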

  16. A comparison of pixel-based and object-based approaches for land use land cover classification in semi-arid areas, Sudan

    NASA Astrophysics Data System (ADS)

    Adam, H. E.; Csaplovics, E.; Elhaja, M. E.

    2016-06-01

    This paper compares the application of pixel-based and object-based approaches to land use land cover classification in semi-arid areas of Sudan. The second aim is to assess the accuracy of classification for each approach. The study was conducted in the gum arabic belt in North Kordofan State, which is affected by changes in the condition and composition of its vegetation cover. The study used an ASTER L1B registered radiance-at-sensor image acquired on 19 October 2010. The image was radiometrically corrected using ENVI-FLAASH software. A subset with an area of 40,880 ha was created. The image classification (pixel-based and object-based) and accuracy assessment were conducted. A total of 47 GCPs were surveyed and used for accuracy assessment in ERDAS 9.1. The image segmentation process was implemented using Definiens eCognition 7.1 software. Segmentation level 4 with a scale parameter of 25 was selected for classification based on colour and form homogeneity. Land use land cover classes were derived by classification using the nearest neighbor classifier with membership functions (fuzzy logic) for each class. Both classification methods assign 20% of the area to forest dominated by Acacia senegal and 1.50% to residential area, while the bare and farm land, grass and bush land, and mixed woodland classes cover 6.69% and 1.63%, 18.62% and 15.16%, and 53% and 61% of the area for the pixel-based and object-based methods, respectively. The overall accuracies of the classifications produced by the pixel-based and object-based approaches were 72.92% and 54.17%, with Kappa statistics of 0.6259 and 0.3810, respectively. The pixel-based approach thus performed better than the object-based approach in land use land cover classification in the semi-arid land of the gum arabic belt.
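
    Overall accuracy and the Kappa statistic reported above are computed from the confusion matrix as follows (a generic sketch with an invented 2-class example, not the study's data).

      import numpy as np

      def overall_accuracy_and_kappa(cm):
          """cm: square confusion matrix (rows = reference, cols = mapped)."""
          cm = np.asarray(cm, dtype=float)
          n = cm.sum()
          po = np.trace(cm) / n                                   # observed agreement
          pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2   # chance agreement
          return po, (po - pe) / (1 - pe)

      # Invented 2-class example: returns roughly (0.85, 0.61).
      print(overall_accuracy_and_kappa([[40, 5], [4, 11]]))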

  17. System For Retrieving Reusable Software

    NASA Technical Reports Server (NTRS)

    Van Warren, Lloyd; Beckman, Brian C.

    1993-01-01

    Encyclopedia of Software Components (ESC) is information-retrieval system of computer hardware and software providing access to generic reusable software tools and parts. Core of ESC is central tool base, which is repository of reusable software. It receives queries and submissions from user through local browser subsystem and receives authorized updates from maintenance subsystem. Sends retrievals to local browser subsystem and user's submissions to maintenance subsystem. Future versions will provide for advanced media, including voice and video, and will link system to database-management system. Programmers will not only retrieve software, but also modify, execute, and cross-link with other software.

  18. Target-object integration, attention distribution, and object orientation interactively modulate object-based selection.

    PubMed

    Al-Janabi, Shahd; Greenberg, Adam S

    2016-10-01

    The representational basis of attentional selection can be object-based. Various studies have suggested, however, that object-based selection is less robust than spatial selection across experimental paradigms. We sought to examine the manner by which the following factors might explain this variation: Target-Object Integration (targets 'on' vs. part 'of' an object), Attention Distribution (narrow vs. wide), and Object Orientation (horizontal vs. vertical). In Experiment 1, participants discriminated between two targets presented 'on' an object in one session, or presented as a change 'of' an object in another session. There was no spatial cue-thus, attention was initially focused widely-and the objects were horizontal or vertical. We found evidence of object-based selection only when targets constituted a change 'of' an object. Additionally, object orientation modulated the sign of object-based selection: We observed a same-object advantage for horizontal objects, but a same-object cost for vertical objects. In Experiment 2, an informative cue preceded a single target presented 'on' an object or as a change 'of' an object (thus, attention was initially focused narrowly). Unlike in Experiment 1, we found evidence of object-based selection independent of target-object integration. We again found that the sign of selection was modulated by the objects' orientation. This result may reflect a meridian effect, which emerged due to anisotropies in the cortical representations when attention is oriented endogenously. Experiment 3 revealed that object orientation did not modulate object-based selection when attention was oriented exogenously. Our findings suggest that target-object integration, attention distribution, and object orientation modulate object-based selection, but only in combination.

  19. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  1. Using digital photographs and object-based image analysis to estimate percent ground cover in vegetation plots

    USGS Publications Warehouse

    Luscier, J.D.; Thompson, W.L.; Wilson, J.M.; Gorham, B.E.; Dragut, L.D.

    2006-01-01

    Ground vegetation influences habitat selection and provides critical resources for survival and reproduction of animals. Researchers often employ visual methods to estimate ground cover, but these approaches may be prone to observer bias. We therefore evaluated a method using digital photographs of vegetation to objectively quantify percent ground cover of grasses, forbs, shrubs, litter, and bare ground within 90 plots of 2 m2. We carried out object-based image analysis, using a software program called eCognition, to divide photographs into different vegetation classes (based on similarities among neighboring pixels) and estimate percent ground cover for each category. We used the Kappa index of agreement (KIA) to quantify correctly classified, randomly selected segments of all images. Our KIA values indicated strong agreement (>80%) for all vegetation categories, with an average of 90-96% (SE = 5%) of shrub, litter, forb, and grass segments classified correctly. We also created artificial plots with known percentages of each vegetation category to evaluate the accuracy of software predictions. Observed differences between true cover and eCognition estimates for each category ranged from 1 to 4%. This technique provides a repeatable and reliable way to estimate percent ground cover that allows quantification of classification accuracy. © The Ecological Society of America.

  2. The effects of visual search efficiency on object-based attention.

    PubMed

    Greenberg, Adam S; Rosen, Maya; Cutrone, Elizabeth; Behrmann, Marlene

    2015-07-01

    The attentional prioritization hypothesis of object-based attention (Shomstein & Yantis in Perception & Psychophysics, 64, 41-51, 2002) suggests a two-stage selection process comprising an automatic spatial gradient and flexible strategic (prioritization) selection. The combined attentional priorities of these two stages of object-based selection determine the order in which participants will search the display for the presence of a target. The strategic process has often been likened to a prioritized visual search. By modifying the double-rectangle cueing paradigm (Egly, Driver, & Rafal in Journal of Experimental Psychology: General, 123, 161-177, 1994) and placing it in the context of a larger-scale visual search, we examined how the prioritization search is affected by search efficiency. By probing both targets located on the cued object and targets external to the cued object, we found that the attentional priority surrounding a selected object is strongly modulated by search mode. However, the ordering of the prioritization search is unaffected by search mode. The data also provide evidence that standard spatial visual search and object-based prioritization search may rely on distinct mechanisms. These results provide insight into the interactions between the mode of visual search and object-based selection, and help define the modulatory consequences of search efficiency for object-based attention.

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1985-01-01

    Reviews software packages by providing extensive descriptions and discussions of their strengths and weaknesses. Software reviewed include (1) "VISIFROG: Vertebrate Anatomy" (grade seven-adult); (2) "Fraction Bars Computer Program" (grades three to six) and (3) four telecommunications utilities. (JN)

  4. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  5. Sandia National Laboratories Advanced Simulation and Computing (ASC) : appraisal method for the implementation of the ASC software quality engineering practices: Version 1.0.

    SciTech Connect

    Turgeon, Jennifer; Minana, Molly A.

    2008-02-01

    This document provides a guide to the process of conducting software appraisals under the Sandia National Laboratories (SNL) ASC Program. The goal of this document is to describe a common methodology for planning, conducting, and reporting the results of software appraisals, thereby enabling: development of an objective baseline on implementation of the software quality engineering (SQE) practices identified in the ASC Software Quality Plan across the ASC Program; feedback from project teams on SQE opportunities for improvement; identification of strengths and opportunities for improvement for individual project teams; and guidance to the ASC Program on the focus of future SQE activities. Document contents include process descriptions, templates to promote consistent conduct of appraisals, and an explanation of the relationship of this procedure to the SNL ASC software program.

  6. Proprietary software

    NASA Technical Reports Server (NTRS)

    Marnock, M. J.

    1971-01-01

    The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future use of computers and software are discussed, along with the governmental uses of software. The popularity of contractual agreements for sale or lease of computer programs and software services is also summarized.

  7. The pixel rectangle index used in object based building extraction from high resolution images

    NASA Astrophysics Data System (ADS)

    Cui, W. H.; Feng, X.; Qin, K.

    2014-03-01

    An improved high resolution object-based building extraction method based on the Pixel Rectangle Index is presented in this paper. We use minimum spanning tree (MST) optimization to realize object-based high resolution image segmentation. First, we propose a rotation-invariant Pixel Rectangle Index by introducing the principal direction of a homogeneous area. Second, we improve the calculation of edge weights by introducing band weights and the Pixel Rectangle Index. QuickBird high resolution images were used for the building extraction experiments. The results show that this method obtains high extraction accuracy and that the algorithm can be applied efficiently to remote sensing images.
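
    The minimum-spanning-tree backbone of such a segmentation can be sketched with SciPy; the edge weights here use only an intensity difference, whereas the paper additionally folds band weights and the Pixel Rectangle Index into the weight.

      import numpy as np
      from scipy.sparse import coo_matrix
      from scipy.sparse.csgraph import minimum_spanning_tree

      def grid_mst(image):
          """Build a 4-neighbour pixel graph and return its MST; segments
          follow from cutting the heaviest remaining edges."""
          h, w = image.shape
          idx = np.arange(h * w).reshape(h, w)
          vals = image.ravel().astype(float)
          rows, cols, wts = [], [], []
          for a, b in [(idx[:, :-1], idx[:, 1:]),    # horizontal edges
                       (idx[:-1, :], idx[1:, :])]:   # vertical edges
              rows.append(a.ravel())
              cols.append(b.ravel())
              # Small epsilon keeps zero-difference edges in the sparse graph.
              wts.append(np.abs(vals[a.ravel()] - vals[b.ravel()]) + 1e-6)
          g = coo_matrix((np.concatenate(wts),
                          (np.concatenate(rows), np.concatenate(cols))),
                         shape=(h * w, h * w))
          return minimum_spanning_tree(g)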

  8. JTpack90: A parallel, object-based, Fortran 90 linear algebra package

    SciTech Connect

    Turner, J.A.; Kothe, D.B.; Ferrell, R.C.

    1997-03-01

    The authors have developed an object-based linear algebra package, currently with emphasis on sparse Krylov methods, driven primarily by the needs of Telluride, the Los Alamos National Laboratory parallel unstructured-mesh casting simulation tool. Support for a number of sparse storage formats, methods, and preconditioners has been implemented, driven primarily by application needs. The authors describe the object-based Fortran 90 approach, which enhances maintainability, performance, and extensibility; the parallelization approach using a new portable gather/scatter library (PGSLib); and current capabilities and future plans, and they present preliminary performance results on a variety of platforms.
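
    As a language-neutral illustration of the kind of sparse Krylov method such a package provides (JTpack90 itself is object-based Fortran 90), a textbook conjugate gradient solve in Python:

      import numpy as np

      def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
          """Textbook CG for a symmetric positive-definite matrix A."""
          x = np.zeros_like(b, dtype=float)
          r = b - A @ x          # initial residual
          p = r.copy()           # initial search direction
          rs = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs) * p
              rs = rs_new
          return x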

  9. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  10. Using object-based image analysis to guide the selection of field sample locations

    Technology Transfer Automated Retrieval System (TEKTRAN)

    One of the most challenging tasks for resource management and research is designing field sampling schemes to achieve unbiased estimates of ecosystem parameters as efficiently as possible. This study focused on the potential of fine-scale image objects from object-based image analysis (OBIA) to be u...

  11. The Object-Based Simon Effect: Grasping Affordance or Relative Location of the Graspable Part?

    ERIC Educational Resources Information Center

    Cho, Dongbin; Proctor, Robert W.

    2010-01-01

    Reaction time is often shorter when the irrelevant graspable handle of an object corresponds with the location of a keypress response to the relevant attribute than when it does not. This object-based Simon effect has been attributed to an affordance for grasping the handle with the hand to the same side. Because a grasping affordance should…

  12. Activity in human visual and parietal cortex reveals object-based attention in working memory.

    PubMed

    Peters, Benjamin; Kaiser, Jochen; Rahm, Benjamin; Bledowski, Christoph

    2015-02-25

    Visual attention enables observers to select behaviorally relevant information based on spatial locations, features, or objects. Attentional selection is not limited to physically present visual information, but can also operate on internal representations maintained in working memory (WM) in service of higher-order cognition. However, only little is known about whether attention to WM contents follows the same principles as attention to sensory stimuli. To address this question, we investigated in humans whether the typically observed effects of object-based attention in perception are also evident for object-based attentional selection of internal object representations in WM. In full accordance with effects in visual perception, the key behavioral and neuronal characteristics of object-based attention were observed in WM. Specifically, we found that reaction times were shorter when shifting attention to memory positions located on the currently attended object compared with equidistant positions on a different object. Furthermore, functional magnetic resonance imaging and multivariate pattern analysis of visuotopic activity in visual (areas V1-V4) and parietal cortex revealed that directing attention to one position of an object held in WM also enhanced brain activation for other positions on the same object, suggesting that attentional selection in WM activates the entire object. This study demonstrated that all characteristic features of object-based attention are present in WM and thus follows the same principles as in perception.

  13. Is Object-Based Attention Mandatory? Strategic Control over Mode of Attention

    ERIC Educational Resources Information Center

    Yeari, Menahem; Goldsmith, Morris

    2010-01-01

    Is object-based attention mandatory or under strategic control? In an adapted spatial cuing paradigm, participants focused initially on a central arrow cue that was part of a perceptual group (Experiment 1) or a uniformly connected object (Experiment 2), encompassing one of the potential target locations. The cue always pointed to an opposite,…

  14. Object-based attention overrides perceptual load to modulate visual distraction

    PubMed Central

    Cosman, Joshua D.; Vecera, Shaun P.

    2014-01-01

    The ability to ignore task-irrelevant information and overcome distraction is central to our ability to efficiently carry out a number of tasks. One factor shown to strongly influence distraction is the perceptual load of the task being performed; as the perceptual load of task-relevant information processing increases, the likelihood that task-irrelevant information will be processed and interfere with task performance decreases. However, it has also been demonstrated that other attentional factors play an important role in whether or not distracting information affects performance. Specifically, object-based attention can modulate the extent of distractor processing, leaving open the possibility that object-based attention mechanisms may directly modulate the way in which perceptual load affects distractor processing. Here, we show that object-based attention dominates perceptual load to determine the extent of task-irrelevant information processing, with distractors affecting performance only when they are contained within the same object as the task-relevant search display. These results suggest that object-based attention effects play a central role in selective attention regardless of the perceptual load of the task being performed. PMID:22390296

  15. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    NASA Astrophysics Data System (ADS)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  16. A multiple-point spatially weighted k-NN method for object-based classification

    NASA Astrophysics Data System (ADS)

    Tang, Yunwei; Jing, Linhai; Li, Hui; Atkinson, Peter M.

    2016-10-01

    Object-based classification, commonly referred to as object-based image analysis (OBIA), is now widely regarded as able to produce more appealing classification maps, often of greater accuracy, than pixel-based classification, and its application is now widespread. Improvement of OBIA using spatial techniques is therefore of great interest. In this paper, multiple-point statistics (MPS) is proposed for object-based classification enhancement in the form of a new multiple-point k-nearest neighbour (k-NN) classification method (MPk-NN). The proposed method first utilises a training image derived from a pre-classified map to characterise the spatial correlation between multiple points of land cover classes. MPS borrows spatial structures from other parts of the training image and then incorporates this spatial information, in the form of multiple-point probabilities, into the k-NN classifier. Two satellite sensor images with fine spatial resolution were selected to evaluate the new method: an IKONOS image of the Beijing urban area and a WorldView-2 image of the Wolong mountainous area, in China. The images were classified using the MPk-NN method and several alternatives, including the k-NN, the geostatistically weighted k-NN, a Bayesian method, a decision tree classifier (DTC), and a support vector machine classifier (SVM). The new spatial weighting based on MPS was demonstrated to achieve greater classification accuracy than the alternatives, and it is thus recommended as appropriate for object-based classification.
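
    The abstract does not include an implementation; the sketch below illustrates the general idea under the assumption that the multiple-point class probabilities have already been estimated from the training image. The function name and the simple vote-weighting scheme are ours, not the paper's.

        # Sketch: k-NN voting weighted by per-object spatial class
        # probabilities (stand-ins for the MPS-derived probabilities).
        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        def mp_knn_predict(X_train, y_train, X_test, mp_probs, k=5):
            """y_train: integer class labels; mp_probs: (n_test, n_classes)."""
            nn = NearestNeighbors(n_neighbors=k).fit(X_train)
            _, idx = nn.kneighbors(X_test)
            n_classes = mp_probs.shape[1]
            preds = np.empty(len(X_test), dtype=int)
            for i, neighbours in enumerate(idx):
                votes = np.bincount(y_train[neighbours], minlength=n_classes)
                # Weight spectral votes by spatial (multiple-point) probabilities.
                preds[i] = np.argmax(votes * mp_probs[i])
            return preds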

  17. Space-based and object-based visual attention: shared and specific neural domains.

    PubMed

    Fink, G R; Dolan, R J; Halligan, P W; Marshall, J C; Frith, C D

    1997-11-01

    Visual attention can be primarily allocated to either where an object is in space (with little emphasis on the structure of the object itself) or to the structure of the object (with little emphasis on where in space the object is located). Using PET measures of regional cerebral blood flow (rCBF) to index neural activity, we investigated the shared and specific functional anatomy underlying both of these types of visual attention in a controlled non-cueing non-blocked paradigm that involved identical stimuli across the conditions of interest. The interaction of eye movements with these attentional systems was studied by introducing fixation or free vision as an additional factor. Relative to the control condition, object-based and space-based attention showed significant activations of the left and right medial superior parietal cortex and the left lateral inferior parietal cortex, the left prefrontal cortex and the cerebellar vermis. Significant differential activations were observed during object-based attention in the left striate and prestriate cortex. Space-based attention activated the right prefrontal cortex and the right inferior temporal-occipital cortex. Differential neural activity due to free vision or fixation was observed in occipital areas only. Significant interactions of free vision/fixation on activations due to object-based and space-based attention were observed in the right medial superior parietal cortex and left lateral inferior parietal cortex, respectively. The study provides direct evidence for the importance of the parietal cortex in the control of object-based and space-based visual attention. The results show that object-based and space-based attention share common neural mechanisms in the parietal lobes, in addition to task specific mechanisms in early visual processing areas of temporal and occipital cortices. PMID:9397018

  18. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do NOT want the software to do along with what you want it to do, and assuming that things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  19. Software Reviews.

    ERIC Educational Resources Information Center

    Beezer, Robert A.; And Others

    1988-01-01

    Reviews for three software packages are given. Those packages are: Linear Algebra Computer Companion; Probability and Statistics Demonstrations and Tutorials; and Math Utilities: CURVES, SURFS, AND DIFFS. (PK)

  20. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment, such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, and it has had NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process, including the key personnel responsibilities and functions that must be performed for safety-critical software.

  1. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images.

    PubMed

    Peña, José Manuel; Torres-Sánchez, Jorge; de Castro, Ana Isabel; Kelly, Maggi; López-Granados, Francisca

    2013-01-01

    The use of remote imagery captured by unmanned aerial vehicles (UAV) has tremendous potential for designing detailed site-specific weed control treatments in early post-emergence, which was not possible previously with conventional airborne or satellite images. A robust and entirely automatic object-based image analysis (OBIA) procedure was developed on a series of UAV images acquired with a six-band multispectral camera (visible and near-infrared range) with the ultimate objective of generating a weed map of an experimental maize field in Spain. The OBIA procedure combines several contextual, hierarchical and object-based features and consists of three consecutive phases: 1) classification of crop rows by application of a dynamic and auto-adaptive classification approach, 2) discrimination of crops and weeds on the basis of their relative positions with reference to the crop rows, and 3) generation of a weed infestation map in a grid structure. The estimation of weed coverage from the image analysis yielded satisfactory results: the relationship of estimated versus observed weed densities had a coefficient of determination of r2 = 0.89 and a root mean square error of 0.02. A map of three categories of weed coverage was produced with 86% overall accuracy. In the experimental field, the area free of weeds was 23%, and the area with low weed coverage (<5% weeds) was 47%, which indicates a high potential for reducing herbicide applications or other weed control operations. The OBIA procedure computes multiple data and statistics derived from the classification outputs, which permits calculation of herbicide requirements and estimation of the overall cost of weed management operations in advance.
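
    A minimal sketch of the final phase, aggregating a classified weed mask into a grid of coverage percentages; the cell size and the category thresholds below are illustrative choices, not the paper's values.

        # Sketch of phase 3: turn a binary weed mask (1 = weed) into a
        # grid of weed-coverage percentages and coarse categories.
        import numpy as np

        def weed_infestation_grid(weed_mask, cell=50):
            h, w = weed_mask.shape
            rows, cols = h // cell, w // cell
            blocks = weed_mask[:rows * cell, :cols * cell].reshape(rows, cell, cols, cell)
            coverage = blocks.mean(axis=(1, 3)) * 100.0  # percent weed per cell
            # Three illustrative categories: weed-free, low (<5%), higher.
            categories = np.digitize(coverage, bins=[0.01, 5.0])
            return coverage, categories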

  4. Developmental Changes in Mental Rotation: A Dissociation Between Object-Based and Egocentric Transformations

    PubMed Central

    Kaltner, Sandra; Jansen, Petra

    2016-01-01

    The present study was conducted to investigate developmental changes in mental rotation performance. We compared children, adults, and older adults regarding their performance in object-based and egocentric transformations. Both children and older adults showed higher overall reaction times compared to adults. Results were interpreted against the background of impaired working memory capacity in both children and older adults. Since developmental changes in working memory are mediated by age differences in cognitive processing speed, cognitive speed is presumed to be the underlying factor. Regarding the two types of transformations, an advantage of egocentric over object-based human figures was found only in adults, which led us to tentatively propose that children and older adults show deficits in perspective taking compared to adults. PMID:27512525

  5. Feature-based and object-based attention orientation during short-term memory maintenance.

    PubMed

    Ku, Yixuan

    2015-12-01

    Top-down attention biases the short-term memory (STM) processing at multiple stages. Orienting attention during the maintenance period of STM by a retrospective cue (retro-cue) strengthens the representation of the cued item and improves the subsequent STM performance. In a recent article, Backer et al. (Backer KC, Binns MA, Alain C. J Neurosci 35: 1307-1318, 2015) extended these findings from the visual to the auditory domain and combined electroencephalography to dissociate neural mechanisms underlying feature-based and object-based attention orientation. Both event-related potentials and neural oscillations explained the behavioral benefits of retro-cues and favored the theory that feature-based and object-based attention orientation were independent. PMID:26084908

  7. Effect of Pansharpened Image on Some of Pixel Based and Object Based Classification Accuracy

    NASA Astrophysics Data System (ADS)

    Karakus, P.; Karabork, H.

    2016-06-01

    Classification is the most important method for determining the type of crop contained in a region for agricultural planning. There are two types of classification: pixel based and object based. While pixel based classification methods are based on the information in each pixel, the object based method operates on image objects formed by combining information from sets of similar pixels. A multispectral image has higher spectral resolution than a panchromatic image, whereas a panchromatic image has higher spatial resolution. Pan sharpening is a process of merging high spatial resolution panchromatic and high spectral resolution multispectral imagery to create a single high resolution color image. The aim of the study was to compare the potential classification accuracy provided by the pan sharpened image. In this study, a SPOT 5 image dated April 2013 was used; the 5 m panchromatic and 10 m multispectral images were pan sharpened. Four classification methods were investigated: maximum likelihood, decision tree, and support vector machine at the pixel level, and an object based classification method. The SPOT 5 pan sharpened image was used to classify sunflowers and corn at a study site in the Kadirli region of Osmaniye, Turkey. The effects of pan sharpening on the classification results were also examined. Accuracy assessment showed that object based classification resulted in better overall accuracy than the other methods. The results indicate that these classification methods can be used for identifying sunflower and corn and estimating crop areas.
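
    The abstract does not name the pan sharpening algorithm; one common choice is the Brovey transform, sketched below under the assumption that the multispectral bands have already been resampled to the panchromatic grid.

        # Brovey-transform pan-sharpen sketch: scale each multispectral
        # band by the ratio of the panchromatic band to the band mean.
        import numpy as np

        def brovey_pansharpen(ms, pan, eps=1e-6):
            """ms: (bands, H, W) multispectral; pan: (H, W) panchromatic."""
            intensity = ms.mean(axis=0)
            ratio = pan / (intensity + eps)
            return ms * ratio  # broadcasting applies the ratio to every band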

  8. Software Bridge

    NASA Technical Reports Server (NTRS)

    1995-01-01

    I-Bridge is a commercial version of software developed by I-Kinetics under a NASA Small Business Innovation Research (SBIR) contract. The software allows users of Windows applications to gain quick, easy access to databases, programs and files on UNIX servers. Information goes directly onto spreadsheets and other applications; users need not manually locate, transfer and convert data.

  9. Software Reviews.

    ERIC Educational Resources Information Center

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Reviews two software packages for the Macintosh series. "Course Builder 2.0," a courseware authoring system, allows the user to create programs which stand alone and may be used independently in the classroom. "World Builder," an artificial intelligence software package, allows creative thinking, problem-solving, and decision-making. (YP)

  10. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1990-01-01

    Reviews three computer software: (1) "Elastic Lines: The Electronic Geoboard" on elementary geometry; (2) "Wildlife Adventures: Whales" on environmental science; and (3) "What Do You Do with a Broken Calculator?" on computation and problem solving. Summarizes the descriptions, strengths and weaknesses, and applications of each software. (YP)

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen

    1988-01-01

    Presents reviews of six computer software programs for teaching science. Provides the publisher, grade level, cost, and descriptions of software, including: (1) "Recycling Logic"; (2) "Introduction to Biochemistry"; (3) "Food for Thought"; (4) "Watts in a Home"; (5) "Geology in Action"; and (6) "Biomes." All are for Apple series microcomputers.…

  12. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1988

    1988-01-01

    Reviews six software packages for the Apple II family. Programs reviewed include "Science Courseware: Earth Science Series"; "Heat and Light"; "In Search of Space: Introduction to Model Rocketry"; "Drug Education Series: Drugs--Their Effects on You'"; "Uncertainties and Measurement"; and "Software Films: Learning about Science Series," which…

  13. Serial grouping of 2D-image regions with object-based attention in humans

    PubMed Central

    Jeurissen, Danique; Self, Matthew W; Roelfsema, Pieter R

    2016-01-01

    After an initial stage of local analysis within the retina and early visual pathways, the human visual system creates a structured representation of the visual scene by co-selecting image elements that are part of behaviorally relevant objects. The mechanisms underlying this perceptual organization process are only partially understood. We here investigate the time-course of perceptual grouping of two-dimensional image-regions by measuring the reaction times of human participants and report that it is associated with the gradual spread of object-based attention. Attention spreads fastest over large and homogeneous areas and is slowed down at locations that require small-scale processing. We find that the time-course of the object-based selection process is well explained by a 'growth-cone' model, which selects surface elements in an incremental, scale-dependent manner. We discuss how the visual cortical hierarchy can implement this scale-dependent spread of object-based attention, leveraging the different receptive field sizes in distinct cortical areas. DOI: http://dx.doi.org/10.7554/eLife.14320.001 PMID:27291188
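
    The 'growth-cone' model is described here only conceptually. As a crude proxy for the spreading process, the sketch below grows a selection outward from a seed by breadth-first search; the number of steps needed to reach a probe location mimics the reaction-time pattern, though it omits the model's scale dependence.

        # Breadth-first spread of selection over a 2-D image region.
        from collections import deque
        import numpy as np

        def spread_steps(region, seed, probe):
            """region: 2-D bool array; seed, probe: (row, col) inside it."""
            h, w = region.shape
            dist = np.full((h, w), -1, dtype=int)
            dist[seed] = 0
            queue = deque([seed])
            while queue:
                r, c = queue.popleft()
                if (r, c) == probe:
                    return dist[r, c]
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < h and 0 <= nc < w and region[nr, nc] and dist[nr, nc] < 0:
                        dist[nr, nc] = dist[r, c] + 1
                        queue.append((nr, nc))
            return -1  # probe not connected to the seed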

  15. Nonretinotopic perception of orientation: Temporal integration of basic features operates in object-based coordinates.

    PubMed

    Wutz, Andreas; Drewes, Jan; Melcher, David

    2016-08-01

    Early, feed-forward visual processing is organized in a retinotopic reference frame. In contrast, visual feature integration on longer time scales can involve object-based or spatiotopic coordinates. For example, in the Ternus-Pikler (T-P) apparent motion display, object identity is mapped across the object motion path. Here, we report evidence from three experiments supporting nonretinotopic feature integration even for the most paradigmatic example of retinotopically-defined features: orientation. We presented observers with a repeated series of T-P displays in which the perceived rotation of Gabor gratings indicates processing in either retinotopic or object-based coordinates. In Experiment 1, the frequency of perceived retinotopic rotations decreased exponentially for longer interstimulus intervals (ISIs) between T-P display frames, with object-based percepts dominating after about 150-250 ms. In a second experiment, we show that motion and rotation judgments depend on the perception of a moving object during the T-P display ISIs rather than only on temporal factors. In Experiment 3, we cued the observers' attentional state either toward a retinotopic or object motion-based reference frame and then tracked both the observers' eye position and the time course of the perceptual bias while viewing identical T-P display sequences. Overall, we report novel evidence for spatiotemporal integration of even basic visual features such as orientation in nonretinotopic coordinates, in order to support perceptual constancy across self- and object motion. PMID:27494545

  16. Information extraction with object based support vector machines and vegetation indices

    NASA Astrophysics Data System (ADS)

    Ustuner, Mustafa; Abdikan, Saygin; Balik Sanli, Fusun

    2016-07-01

    Information extraction from remote sensing data is important for policy and decision makers, as the extracted information provides base layers for many real-world applications. Classification of remotely sensed data is one of the most common methods of extracting information; however, it is still a challenging issue because several factors affect the accuracy of the classification. The resolution of the imagery, the number and homogeneity of land cover classes, the purity of training data and the characteristics of the adopted classifiers are just some of these challenging factors. Object based image classification has some superiority over pixel based classification for high resolution images, since it uses geometry and structure information besides spectral information. Vegetation indices are also commonly used in the classification process, since they provide additional spectral information for vegetation, forestry and agricultural areas. In this study, the impacts of the Normalized Difference Vegetation Index (NDVI) and Normalized Difference Red Edge Index (NDRE) on the classification accuracy of RapidEye imagery were investigated. Object based Support Vector Machines were implemented for the classification of crop types for the study area located in the Aegean region of Turkey. Results demonstrated that the incorporation of NDRE increased the overall classification accuracy from 79.96% to 86.80%, whereas NDVI decreased it from 79.96% to 78.90%. Moreover, it was shown that object based classification of RapidEye data gives promising results for crop type mapping and analysis.
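
    Both indices are simple band ratios; a minimal sketch follows (the RapidEye band ordering in the usage comment is an assumption for illustration).

        # Vegetation indices used as extra classification features.
        import numpy as np

        def ndvi(nir, red, eps=1e-6):
            return (nir - red) / (nir + red + eps)

        def ndre(nir, red_edge, eps=1e-6):
            return (nir - red_edge) / (nir + red_edge + eps)

        # e.g., for (H, W, 5) RapidEye-like bands (blue, green, red,
        # red edge, NIR), stack the indices with the original bands:
        # features = np.dstack([bands, ndvi(bands[..., 4], bands[..., 2]),
        #                       ndre(bands[..., 4], bands[..., 3])])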

  17. Remote sensing clustering analysis based on object-based interval modeling

    NASA Astrophysics Data System (ADS)

    He, Hui; Liang, Tianheng; Hu, Dan; Yu, Xianchuan

    2016-09-01

    In object-based clustering, image data are segmented into objects (groups of pixels) and then clustered based on the objects' features. This method can be used to automatically classify high-resolution remote sensing images, but requires accurate descriptions of object features. In this paper, we ascertain that an interval-valued data model is appropriate for describing clustering prototype features. With this in mind, we developed an object-based interval modeling method for high-resolution, multiband remote sensing data. We also designed an adaptive interval-valued fuzzy clustering method. We ran experiments utilizing images from the SPOT-5 satellite sensor for the Pearl River Delta region and Beijing. The results indicate that the proposed algorithm considers both the anisotropy of the remote sensing data and the ambiguity of objects. Additionally, we present a new dissimilarity measure for interval vectors, which better separates the interval vectors generated by features of the segmentation units (objects). This approach effectively limits classification errors caused by spectral mixing between classes. Compared with an object-based unsupervised classification method proposed earlier, the proposed algorithm improves the classification accuracy without increasing computational complexity.
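
    The paper's dissimilarity measure is not given in the abstract; a standard stand-in, sketched below, sums the per-band Hausdorff distance between intervals.

        # Dissimilarity between two interval vectors a and b, each a
        # (bands, 2) array of [lower, upper] bounds per band.
        import numpy as np

        def interval_hausdorff(a, b):
            return np.sum(np.maximum(np.abs(a[:, 0] - b[:, 0]),
                                     np.abs(a[:, 1] - b[:, 1])))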

  18. An Advanced Programming Technique for a Cost-Effective Hardware-Independent Realization of Naval Software Systems. Final Technical Report, Part II.

    ERIC Educational Resources Information Center

    Computer Symbolic, Inc., Washington, DC.

    A pseudo assembly language, PAL, was developed and specified for use as the lowest level in a general, multilevel programing system for the realization of cost-effective, hardware-independent Naval software. The language was developed as part of the system called FIRMS (Fast Iterative Recursive Macro System) and is sufficiently general to allow…

  19. Software Smarts

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Under an SBIR (Small Business Innovative Research) contract with Johnson Space Center, Knowledge Based Systems Inc. (KBSI) developed an intelligent software environment for modeling and analyzing mission planning activities, simulating behavior, and, using a unique constraint propagation mechanism, updating plans with each change in mission planning activities. KBSI developed this technology into a commercial product, PROJECTLINK, a two-way bridge between PROSIm, KBSI's process modeling and simulation software, and leading project management software like Microsoft Project and Primavera's SureTrak Project Manager.

  20. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
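
    A minimal example of the kind of Python-based testing described here, using pytest; the function under test and the file name (test_stats.py) are hypothetical. Running `pytest` in the project directory discovers and executes the test_* functions automatically.

        # test_stats.py
        import math

        def rms(values):
            return math.sqrt(sum(v * v for v in values) / len(values))

        def test_rms_of_constant_signal():
            assert rms([2.0, 2.0, 2.0]) == 2.0

        def test_rms_is_nonnegative():
            assert rms([-3.0, 4.0]) >= 0.0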

  1. Software Reviews.

    ERIC Educational Resources Information Center

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; "Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  2. Software Reviews.

    ERIC Educational Resources Information Center

    McGrath, Diane

    1990-01-01

    Reviews two programs: (1) "The Weather Machine" on understanding weather and weather forecasting and (2) "The Mystery of the Hotel Victoria" on problem solving in mathematics. Presents the descriptions, advantages, and weaknesses of the software. (YP)

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1990-01-01

    Reviewed are six computer software packages including "Lunar Greenhouse,""Dyno-Quest,""How Weather Works,""Animal Trackers,""Personal Science Laboratory," and "The Skeletal and Muscular Systems." Availability, functional, and hardware requirements are discussed. (CW)

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are three computer software packages including "Martin Luther King, Jr.: Instant Replay of History,""Weeds to Trees," and "The New Print Shop, School Edition." Discussed are hardware requirements, costs, grade levels, availability, emphasis, strengths, and weaknesses. (CW)

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  6. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1987-01-01

    Reviews seven computer software programs that can be used in science education programs. Describes courseware which deals with muscles and bones, terminology, classifying animals without backbones, molecular structures, drugs, genetics, and shaping the earth's surface. (TW)

  7. Software Reviews.

    ERIC Educational Resources Information Center

    Mathematics and Computer Education, 1988

    1988-01-01

    Presents reviews of six software packages. Includes (1) "Plain Vanilla Statistics"; (2) "MathCAD 2.0"; (3) "GrFx"; (4) "Trigonometry"; (5) "Algebra II"; (6) "Algebra Drill and Practice I, II, and III." (PK)

  8. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Eugene T., Ed.

    1988-01-01

    Presents reviews by classroom teachers of software for teaching science. Includes material on the work of geologists, genetics, earth science, classification of living things, astronomy, endangered species, skeleton, drugs, and heartbeat. Provides information on availability and equipment needed. (RT)

  9. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1987-01-01

    Provides a review of four science software programs. Includes topics such as plate tectonics, laboratory experiment simulations, the human body, and light and temperature. Contains information on ordering and reviewers' comments. (ML)

  10. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1987-01-01

    Provides reviews of six computer software programs designed for use in elementary science education programs. Provides the title, publisher, grade level, and descriptions of courseware on ant farms, drugs, genetics, beachcombing, matter, and test generation. (TW)

  11. A signal and image processing object-based system using CLOS [Common LISP Object System (CLOS)]

    SciTech Connect

    Hernandez, J.E.; Lu, Shin-Yee; Sherwood, R.J.; Clark, G.A.; Lawver, B.S.

    1991-09-01

    This paper presents a LISP based system for signal and image processing. Using an object based approach, the system integrates signal and image processing algorithms, supervised and unsupervised neural network algorithms, and mid-level computer vision capabilities into a cohesive framework. This framework is suitable for prototyping complex algorithms dealing with multiple classes of data. The system, known as VISION, is currently used as a prototyping environment for a wide range of scientific applications internal to LLNL. This paper highlights some of the capabilities of VISION and how they were implemented using the Common LISP Object System, CLOS. 13 refs.

  13. SAR Product Control Software

    NASA Astrophysics Data System (ADS)

    Meadows, P. J.; Hounam, D.; Rye, A. J.; Rosich, B.; Börner, T.; Closa, J.; Schättler, B.; Smith, P. J.; Zink, M.

    2003-03-01

    As SAR instruments and their operating modes become more complex, as new applications place more and more demands on image quality and as our understanding of their imperfections becomes more sophisticated, there is increasing recognition that SAR data quality has to be controlled more completely to keep pace. The SAR product CONtrol software (SARCON) is a comprehensive SAR product control software suite tailored to the latest generation of SAR sensors. SARCON profits from the most up-to-date thinking on SAR image performance derived from other spaceborne and airborne SAR projects and is based on the newest applications. This paper gives an overview of the structure and the features of this new software tool, which is a product of a co-operation between teams at BAE SYSTEMS Advanced Technology Centre and DLR under contract to ESA (ESRIN). Work on SARCON began in 1999 and is continuing.

  14. Parameter-Based Performance Analysis of Object-Based Image Analysis Using Aerial and Quikbird-2 Images

    NASA Astrophysics Data System (ADS)

    Kavzoglu, T.; Yildiz, M.

    2014-09-01

    Opening new possibilities for research, very high resolution (VHR) imagery acquired by recent commercial satellites and aerial systems requires advanced approaches and techniques that can handle large volumes of data with high local variance. Delineation of land use/cover information from VHR images is a hot research topic in remote sensing. In recent years, object-based image analysis (OBIA) has become a popular solution for image analysis tasks as it considers shape, texture and content information associated with the image objects. The most important stage of OBIA is the image segmentation process applied prior to classification. Determination of optimal segmentation parameters is of crucial importance for the performance of the selected classifier. In this study, the effectiveness and applicability of the segmentation method in relation to its parameters was analysed using two VHR images, an aerial photo and a Quickbird-2 image. The multi-resolution segmentation technique was employed with optimal scale, shape and compactness parameters defined after an extensive trial process on the data sets. A nearest neighbour classifier was applied to the segmented images, and accuracy assessment was then performed. Results show that segmentation parameters have a direct effect on classification accuracy, and that low values of scale-shape combinations produce the highest classification accuracies. Also, the compactness parameter was found to have minimal effect on the construction of image objects; hence it can be set to a constant value in image classification.
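
    eCognition's multi-resolution segmentation is proprietary; as an open-source stand-in, the sketch below runs the same kind of trial process over a scale parameter using scikit-image's Felzenszwalb segmentation on a placeholder image.

        from skimage.data import astronaut
        from skimage.segmentation import felzenszwalb

        image = astronaut()  # placeholder RGB image
        for scale in (50, 100, 200, 400):
            segments = felzenszwalb(image, scale=scale, sigma=0.8, min_size=30)
            print(f"scale={scale}: {segments.max() + 1} image objects")
        # In the study, each candidate segmentation would then be classified
        # and scored by accuracy assessment to select the optimal parameters.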

  15. Object-based land-cover classification for metropolitan Phoenix, Arizona, using aerial photography

    NASA Astrophysics Data System (ADS)

    Li, Xiaoxiao; Myint, Soe W.; Zhang, Yujia; Galletti, Christopher; Zhang, Xiaoxiang; Turner, Billie L.

    2014-12-01

    Detailed land-cover mapping is essential for a range of research issues addressed by the sustainability and land system sciences and planning. This study uses an object-based approach to create a 1 m land-cover classification map of the expansive Phoenix metropolitan area through the use of high spatial resolution aerial photography from the National Agricultural Imagery Program. It employs an expert knowledge decision rule set and incorporates a cadastral GIS vector layer as auxiliary data. The classification rule was established on a hierarchical image object network, and the properties of parcels in the vector layer were used to establish land cover types. Image segmentations were initially utilized to separate the aerial photos into parcel-sized objects, and were further used for detailed land type identification within the parcels. Characteristics of image objects from contextual and geometrical aspects were used in the decision rule set to reduce the spectral limitations of the four-band aerial photography. Classification results include 12 land-cover classes and subclasses that may be assessed from the sub-parcel to the landscape scales, facilitating examination of scale dynamics. The proposed object-based classification method provides robust results, uses minimal and readily available ancillary data, and reduces computational time.

  16. Object-based cropland degradation identification: a case study in Uzbekistan

    NASA Astrophysics Data System (ADS)

    Dubovyk, Olena; Menz, Gunter; Conrad, Christopher; Khamzina, Asia

    2012-10-01

    Sustainability of irrigated agriculture-based economies, such as those in Central Asia, is threatened by cropland degradation. Field-based identification of degraded agricultural areas can aid in developing appropriate land rehabilitation and monitoring programs. This paper combined object-based change detection and spectral mixture analysis to develop an approach for identifying parcels of irrigated degraded cropland in Northern Uzbekistan, Central Asia. Linear spectral unmixing, followed by object-based change vector analysis, was applied to multiple Landsat TM images acquired in 1987 and 2009. Considering the spectral dimensionality of Landsat TM, a 4-endmember model (green vegetation, water, dark soil, and bright soil) was set up for the analysis. The spectral unmixing results were valid, as indicated by overall root mean square errors of <2.5% reflectance for all images. The change detection results revealed that about 33% (84,540 ha) of the cropland in the study area was affected by degradation processes to varying degrees. The spatial distribution of degraded fields was mainly associated with abandoned fields and lands with inherently low-fertility soils. The proposed approach could be elaborated for field-based monitoring of cropland degradation in similar landscapes of Central Asia and elsewhere.
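
    A minimal sketch of the linear unmixing step: solve for non-negative endmember fractions per pixel with constrained least squares. The endmember matrix would hold the image-derived spectra of green vegetation, water, dark soil, and bright soil; change vector analysis then compares the 1987 and 2009 fraction vectors per object.

        # Per-pixel linear spectral unmixing via non-negative least squares.
        from scipy.optimize import nnls

        def unmix(pixel, endmembers):
            """pixel: (bands,); endmembers: (bands, n_endmembers)."""
            fractions, residual = nnls(endmembers, pixel)
            total = fractions.sum()
            if total > 0:
                fractions = fractions / total  # enforce sum-to-one
            return fractions, residual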

  17. Object based change detection of Central Asian Tugai vegetation with very high spatial resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Gärtner, Philipp; Förster, Michael; Kurban, Alishir; Kleinschmit, Birgit

    2014-09-01

    Ecological restoration of degraded riparian Tugai forests in north-western China is a key driver to combat desertification in this region. Recent restoration efforts attempt to recover the forest along with its most dominant tree species, Populus euphratica. The present research observed the response of natural vegetation using an object based change detection method on QuickBird (2005) and WorldView-2 (2011) data. We applied a region growing approach to derived Normalized Difference Vegetation Index (NDVI) values in order to identify single P. euphratica trees, delineate tree crown areas and quantify crown diameter changes. Results were compared to 59 reference trees. The findings confirmed positive tree crown growth and suggest a crown diameter increase of 1.14 m on average. On a single tree basis, the diameters of larger crowns were generally underestimated, while small crowns were slightly underestimated in the QuickBird image and overestimated in the WorldView-2 image. The results of the automated tree crown delineation show a moderate relation to the field reference data, with R² = 0.36 for 2005 and R² = 0.48 for 2011. The object based image analysis (OBIA) method proved to be applicable in sparse riparian Tugai forests and showed great suitability for evaluating ecological restoration efforts in an endangered ecosystem.

  18. Object-Based Arctic Sea Ice Feature Extraction through High Spatial Resolution Aerial photos

    NASA Astrophysics Data System (ADS)

    Miao, X.; Xie, H.

    2015-12-01

    High resolution aerial photographs used to detect and classify sea ice features can provide accurate physical parameters to refine, validate, and improve climate models. However, manually delineating sea ice features, such as melt ponds, submerged ice, water, ice/snow, and pressure ridges, is time-consuming and labor-intensive. An object-based classification algorithm is developed to automatically and efficiently extract sea ice features from aerial photographs taken during the Chinese National Arctic Research Expedition in summer 2010 (CHINARE 2010) in the marginal ice zone (MIZ) near the Alaska coast. The algorithm includes four steps: (1) image segmentation groups neighboring pixels into objects based on the similarity of spectral and textural information; (2) a random forest classifier distinguishes four general classes: water, general submerged ice (GSI, including melt ponds and submerged ice), shadow, and ice/snow; (3) polygon neighbor analysis separates melt ponds and submerged ice based on spatial relationships; and (4) pressure ridge features are extracted from shadow based on local illumination geometry. A producer's accuracy of 90.8% and user's accuracy of 91.8% are achieved for melt pond detection, and shadow detection shows a user's accuracy of 88.9% and a producer's accuracy of 91.4%. Finally, pond density, pond fraction, ice floes, mean ice concentration, average ridge height, ridge profile, and ridge frequency are extracted from batch processing of the aerial photos, and their uncertainties are estimated.
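
    A minimal sketch of step (2) on a hypothetical per-object feature table; in the study, the features and labels come from the segmented aerial photos.

        # Random forest distinguishing water, GSI, shadow, and ice/snow.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.normal(size=(400, 6))     # spectral/textural features per object
        y = rng.integers(0, 4, size=400)  # 0=water, 1=GSI, 2=shadow, 3=ice/snow

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")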

  19. Remote sensing and object-based techniques for mapping fine-scale industrial disturbances

    NASA Astrophysics Data System (ADS)

    Powers, Ryan P.; Hermosilla, Txomin; Coops, Nicholas C.; Chen, Gang

    2015-02-01

    Remote sensing provides an important data source for the detection and monitoring of disturbances; however, using this data to recognize fine-spatial resolution industrial disturbances dispersed across extensive areas presents unique challenges (e.g., accurate delineation and identification) and deserves further investigation. In this study, we present and assess a geographic object-based image analysis (GEOBIA) approach with high-spatial resolution imagery (SPOT 5) to map industrial disturbances using the oil sands region of Alberta's northeastern boreal forest as a case study. Key components of this study were (i) the development of additional spectral, texture, and geometrical descriptors for characterizing image-objects (groups of alike pixels) and their contextual properties, and (ii) the introduction of decision trees with boosting to perform the object-based land cover classification. Results indicate that the approach achieved an overall accuracy of 88%, and that all descriptor groups provided relevant information for the classification. Despite challenges remaining (e.g., distinguishing between spectrally similar classes, or placing discrete boundaries), the approach was able to effectively delineate and classify fine-spatial resolution industrial disturbances.
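
    A minimal sketch of computing geometrical descriptors for image-objects with scikit-image's regionprops; the paper's descriptor set (spectral, texture, geometrical, contextual) is much richer, and its classifier is boosted decision trees.

        import numpy as np
        from skimage.measure import label, regionprops

        mask = np.zeros((64, 64), dtype=int)
        mask[10:30, 10:40] = 1  # a toy "disturbance" object
        for region in regionprops(label(mask)):
            # Isoperimetric compactness: 1.0 for a perfect disk.
            compactness = 4 * np.pi * region.area / region.perimeter ** 2
            print(region.area, region.eccentricity, round(compactness, 3))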

  20. An Integrative Object-Based Image Analysis Workflow for Uav Images

    NASA Astrophysics Data System (ADS)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm applied after geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, the simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.
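
    A minimal sketch of the initial over-segmentation step with SLIC superpixels in scikit-image, on a placeholder image standing in for the mosaicked panorama; the resulting regions would seed the Binary Partition Tree.

        from skimage.data import astronaut
        from skimage.segmentation import slic

        image = astronaut()  # placeholder for a mosaicked UAV panorama
        superpixels = slic(image, n_segments=500, compactness=10.0, start_label=1)
        print(f"{superpixels.max()} initial regions for the BPT hierarchy")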

  1. Preliminary Results of Earthquake-Induced Building Damage Detection with Object-Based Image Classification

    NASA Astrophysics Data System (ADS)

    Sabuncu, A.; Uca Avci, Z. D.; Sunar, F.

    2016-06-01

    Earthquakes are the most destructive natural disasters, resulting in massive loss of life, infrastructure damage and financial losses. Earthquake-induced building damage detection is a very important step after an earthquake, since such damage is one of the most critical threats to cities and countries in terms of the extent of the damage, the rate of collapsed buildings, the damage grade near the epicenter, and the types of building damage across all constructions. The Van-Ercis (Turkey) earthquake (Mw = 7.1) occurred on October 23rd, 2011, at 10:41 UTC (13:41 local time), centered at 38.75 N 43.36 E, which places the epicenter about 30 kilometers north of the city of Van. It is recorded that 604 people died and approximately 4000 buildings collapsed or were seriously damaged by the earthquake. In this study, high-resolution satellite images of Van-Ercis, acquired by Quickbird-2 (Digital Globe Inc.) after the earthquake, were used to detect debris areas using an object-based image classification. Two different land surfaces, having homogeneous and heterogeneous land covers, were selected as case study areas. As a first step of the object-based image processing, segmentation was applied with a convenient scale parameter and homogeneity criterion parameters. As a next step, condition based classification was used. In the final step of this preliminary study, the outputs were compared with street-view images and orthophotos for the verification and evaluation of the classification accuracy.

  2. Topic Modelling for Object-Based Classification of Vhr Satellite Images Based on Multiscale Segmentations

    NASA Astrophysics Data System (ADS)

    Shen, Li; Wu, Linmei; Li, Zhipeng

    2016-06-01

    Multiscale segmentation is a key prerequisite step for object-based classification methods. However, it is often not possible to determine a single optimal scale for the image to be classified, because in many cases different geo-objects, and even an identical geo-object, may appear at different scales in one image. In this paper, an object-based classification method based on multiscale segmentation results in the framework of topic modelling is proposed to classify VHR satellite images in an entirely unsupervised fashion. In the topic modelling stage, grayscale histogram distributions for each geo-object class and each segment are learned in an unsupervised manner from multiscale segments. In the classification stage, each segment is allocated a geo-object class label by comparing the similarity between the grayscale histogram distributions of the segment and of each geo-object class. Experimental results show that the proposed method can perform better than traditional methods based on topic modelling.
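
    A minimal sketch of the classification stage: assign each segment to the geo-object class whose histogram is most similar. The chi-square distance is an illustrative choice; in the paper, the distributions come from the topic-model estimates.

        import numpy as np

        def classify_segment(seg_hist, class_hists, eps=1e-12):
            """seg_hist: (bins,); class_hists: (n_classes, bins); normalized."""
            d = 0.5 * np.sum((class_hists - seg_hist) ** 2 /
                             (class_hists + seg_hist + eps), axis=1)
            return int(np.argmin(d))  # index of the most similar class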

  3. Addressing Software Security

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2015-01-01

    Historically, security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.). The threat landscape has since evolved (script kiddies, hackers, Advanced Persistent Threats (APT), nation states, etc.) and the attack surface has expanded as networks have become interconnected. Factors in an organization's security posture include the network layer (routers, firewalls, etc.), computer network defense (IPS/IDS, sensors, continuous monitoring, etc.), Industrial Control Systems (ICS), and software security (COTS, FOSS, custom, etc.).

  4. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures, which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great; CASE TRENDS magazine reported re-engineering to be the primary concern of over four hundred top MIS executives.

  5. Hierarchical Object-based Image Analysis approach for classification of sub-meter multispectral imagery in Tanzania

    NASA Astrophysics Data System (ADS)

    Chung, C.; Nagol, J. R.; Tao, X.; Anand, A.; Dempewolf, J.

    2015-12-01

    Increasing agricultural production while at the same time preserving the environment has become a challenging task. There is a need for new approaches to the use of multi-scale and multi-source remote sensing data, as well as ground based measurements, for mapping and monitoring crop and ecosystem state to support decision making by governmental and non-governmental organizations for sustainable agricultural development. High resolution sub-meter imagery plays an important role in such an integrative framework of landscape monitoring. It helps link the ground based data to more easily available coarser resolution data, facilitating calibration and validation of derived remote sensing products. Here we present a hierarchical Object Based Image Analysis (OBIA) approach to classify sub-meter imagery. The primary reason for choosing OBIA is to accommodate pixel sizes smaller than the object or class of interest. Especially in the non-homogeneous savannah regions of Tanzania this is an important concern, and the traditional pixel based spectral signature approach often fails. Ortho-rectified, calibrated, pan sharpened 0.5 meter resolution data acquired from DigitalGlobe's WorldView-2 satellite sensor were used for this purpose. Multi-scale hierarchical segmentation was performed using a multi-resolution segmentation approach to facilitate the use of texture, neighborhood context, and the relationship between super- and sub-objects for training and classification. eCognition, a commonly used OBIA software program, was used for this purpose. Both decision tree and random forest approaches to classification were tested. The kappa index of agreement for both algorithms surpassed 85%. The results demonstrate that hierarchical OBIA can effectively and accurately discriminate classes even at the LCCS-3 legend level.
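
    A minimal sketch of the kappa computation used in the accuracy assessment; the label vectors below are hypothetical.

        from sklearn.metrics import cohen_kappa_score

        reference = [0, 0, 1, 1, 2, 2, 2, 1]  # ground-truth object labels
        predicted = [0, 0, 1, 1, 2, 1, 2, 1]  # classifier output
        print(f"kappa = {cohen_kappa_score(reference, predicted):.2f}")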

  6. SELFI: an object-based, Bayesian method for faint emission line source detection in MUSE deep field data cubes

    NASA Astrophysics Data System (ADS)

    Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme

    2016-04-01

    We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for the detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin2 field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables, and the Bayesian model leads to a general and robust algorithm in which the parameters are estimated in a fully data-driven way. The detection algorithm was applied to the MUSE observation of the Hubble Deep Field-South. With 27 h total integration time, these observations provide a catalog of 189 sources of various categories with secured redshifts. The algorithm retrieved 91% of the galaxies with only 9% false detections. This method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright, spatially resolved galaxies that cannot be approximated by a Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).

  7. Antiterrorist Software

    NASA Technical Reports Server (NTRS)

    Clark, David A.

    1998-01-01

    In light of the escalation of terrorism, the Department of Defense spearheaded the development of new antiterrorist software for all Government agencies by issuing a Broad Agency Announcement to solicit proposals. This Government-wide competition resulted in a team that includes NASA Lewis Research Center's Computer Services Division, who will develop the graphical user interface (GUI) and test it in their usability lab. The team launched a program entitled Joint Sphere of Security (JSOS), crafted a design architecture, and is testing the interface. This software system has a state-of-the-art, object-oriented architecture, with a main kernel composed of the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS will be used as the software "breadboard" for assembling the components of explosions, such as blast and collapse simulations.

  8. A comparison of the accuracy of pixel based and object based classifications of integrated optical and LiDAR data

    NASA Astrophysics Data System (ADS)

    Gajda, Agnieszka; Wójtowicz-Nowakowska, Anna

    2013-04-01

    Land cover maps are generally produced on the basis of high resolution imagery. Recently, LiDAR (Light Detection and Ranging) data have been brought into use in diverse applications, including land cover mapping. In this study we attempted to assess the accuracy of land cover classification using both high resolution aerial imagery and LiDAR data (airborne laser scanning, ALS), testing two classification approaches: a pixel-based classification and object-oriented image analysis (OBIA). The study was conducted on three test areas (3 km2 each) in the administrative area of Kraków, Poland, along the course of the Vistula River. They represent three different dominant land cover types of the Vistula River valley. Test site 1 had semi-natural vegetation, with riparian forests and shrubs; test site 2 represented a densely built-up area; and test site 3 was an industrial site. Point clouds from ALS and orthophotomaps were both captured in November 2007. Point cloud density was on average 16 pt/m2, and the cloud contained additional information about intensity and encoded RGB values. Orthophotomaps had a spatial resolution of 10 cm. From the point clouds, two raster maps were generated: (1) intensity and (2) normalised Digital Surface Model (nDSM), both with a spatial resolution of 50 cm. To classify the aerial data, a supervised classification approach was selected. Pixel-based classification was carried out in ERDAS Imagine software, using the orthophotomaps together with the intensity and nDSM rasters. Fifteen homogeneous training areas representing each cover class were chosen. Classified pixels were clumped to avoid the salt-and-pepper effect. Object-oriented image classification was carried out in eCognition software, which implements both the optical and ALS data. Elevation layers (intensity, first/last reflection, etc.) were used at the segmentation stage due to
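
    A minimal sketch of the nDSM derivation used above (object heights above ground), assuming DSM and DTM rasters have already been gridded from the ALS point cloud; the file names are hypothetical.

    ```python
    import numpy as np

    dsm = np.load("dsm_50cm.npy")  # surface heights from first returns (m)
    dtm = np.load("dtm_50cm.npy")  # ground heights from last returns (m)

    # Heights of objects above ground; clip small negatives caused by noise.
    ndsm = np.clip(dsm - dtm, 0.0, None)
    ```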

  9. Using object-based geomorphometry for hydro-geomorphological analysis in a Mediterranean research catchment

    NASA Astrophysics Data System (ADS)

    Guida, Domenico; Cuomo, Albina; Palmieri, Vincenzo

    2016-08-01

    The aim of the paper is to apply an object-based geomorphometric procedure to define the runoff contribution areas and support a hydro-geomorphological analysis of a 3 km2 Mediterranean research catchment (southern Italy). Daily and sub-hourly discharge and electrical conductivity data were collected and recorded during a 3-year monitoring activity. Hydro-chemograph analyses carried out on these data revealed a strong seasonal hydrological response in the catchment, differing between stormflow events that occur in wet periods and those in dry periods. This analysis enabled us to define the hydro-chemograph signatures related to increasing flood magnitude, which progressively involves various runoff components (baseflow, subsurface flow and surficial flow) and an increasing contributing area to discharge. Field surveys and water table/discharge measurements carried out during a selected storm event enabled us to identify and map specific runoff source areas with homogeneous geomorphological units previously defined as hydro-geomorphotypes (spring points, diffuse seepage along the main channel, seepage along the riparian corridors, diffuse outflow from hillslope taluses and concentrated sapping from colluvial hollows). Following the procedures previously proposed and used by the authors for object-based geomorphological mapping, a hydro-geomorphologically oriented segmentation and classification was performed with the eCognition (Trimble, Inc.) package. The best agreement with the expert-based geomorphological mapping was obtained with weighted plan curvature at different window sizes. By combining the hydro-chemical analysis and the object-based hydro-geomorphotype map, the variability of the contribution areas was graphically modeled for the selected event, which occurred during the wet season, using the log values of flow accumulation, which better fit the contribution areas. The results allow us to identify the runoff component on the hydro-chemographs for each time step
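
    A simplified sketch of the plan-curvature attribute mentioned above, computed from a DEM with finite differences; this is one common formulation, not necessarily the exact recipe of the eCognition workflow.

    ```python
    import numpy as np

    def plan_curvature(dem, cell_size=1.0):
        """Curvature of the DEM's contour lines (one common formulation)."""
        fy, fx = np.gradient(dem, cell_size)   # first derivatives (rows, cols)
        fxy, fxx = np.gradient(fx, cell_size)  # second derivatives of fx
        fyy, _ = np.gradient(fy, cell_size)
        p = np.maximum(fx**2 + fy**2, 1e-12)   # guard flat cells
        return -(fxx * fy**2 - 2.0 * fxy * fx * fy + fyy * fx**2) / p**1.5
    ```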

  10. Mapping Arctic Ocean Coastline Change With Landsat Archive Data And Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Hulslander, D.

    2010-12-01

    The melting of arctic permafrost is a significant effect of climate change. The combination of rising sea level, longer periods of ice-free conditions in the Arctic Ocean and melting permafrost can greatly accelerate coastline changes in general and arctic coastal erosion in particular. Anderson et al. (2009; Geology News) have measured erosion rates of 15 m per year at sites along the Alaskan Arctic Ocean coastline dominated by ice-cemented peats and silt-rich permafrost. With over 45,000 km of Arctic Ocean coastline, it is important that coastline movement and transgressive oceanic regimes be mapped and tracked with accurate data. Determining historic coastal erosion rates for this region is as important as mapping the current extent of the phenomenon, to create as complete a picture as possible and to locate where rapid erosion is an emergent process. The extent of the area involved, combined with its inaccessibility and inhospitable conditions, makes geologic remote sensing an appropriate tool for characterizing Arctic Ocean coastal erosion. Traditional weaknesses associated with using remote sensing in the geosciences have included a lack of historical data or baseline information, as well as difficulties in the systematization of feature mapping. Using object-based image analysis on Landsat archive data can overcome these issues and may allow for a potential multi-decadal map of Arctic Ocean coastline changes. The Landsat family of sensors (MSS 1-3 and TM/ETM 4, 5, and 7) has been providing imagery as frequently as every 16 days since July 1972. The frequent revisits maximize the chance of getting cloud-free imagery at least once per year in most study areas. Also, Landsat data are well characterized, extensively studied, and freely available from the USGS EROS Data Center Archive, making it an ideal and stable source of data for mapping the Arctic Ocean coastline. Delineating large sections of coastline from imagery by hand digitization would be impractical due to the

  11. Mental rotation performance in soccer players and gymnasts in an object-based mental rotation task.

    PubMed

    Jansen, Petra; Lehmann, Jennifer

    2013-01-01

    In this study, the effect of motor expertise on an object-based mental rotation task was investigated. 60 males and 60 females (40 soccer players, 40 gymnasts, and 40 non-athletes, with equal numbers of males and females in each group) solved a psychometric mental rotation task with both cube and human figures. The results revealed that all participants had a higher mental rotation accuracy for human figures compared to cube figures, that the gender difference was reduced with human figures, and that gymnasts demonstrated a better mental rotation performance than non-athletes. The results are discussed against the background of the existing literature on motor expertise and mental rotation performance, as well as the importance of the testing situation and test construction. PMID:23833695

  12. Building and Programming a Smart Robotic System for Distinguishing Objects Based on their Shape and Colour

    NASA Astrophysics Data System (ADS)

    Sharari, T. M.

    2015-03-01

    This paper presents a robotic system designed for holding and placing objects based on their colour and shape. The presented robot is given a complete set of instructions of positions and orientation angles for each manipulation motion. The main feature in this paper is that the developed robot uses a combination of vision and motion systems for holding and placing the work-objects, mounted on a flat work-plane, based on their shapes and colours. This combination improves the flexibility of manipulation, which may help eliminate the use of some expensive manipulation tasks in a variety of industrial applications. The robotic system presented in this paper is designed as an educational robot that possesses the ability to perform holding-and-placing operations with a limited load. To process the various instructions for holding and placing the work objects, a main control unit, the Manipulation Control Unit (MCU), is used, as well as a slave unit that performs the actual instructions from the MCU.

  13. Mapping potential Blanding's turtle habitat using aerial orthophotographic imagery and object based classification

    NASA Astrophysics Data System (ADS)

    Barker, Rebecca

    Blanding's turtle (Emydoidea blandingii) is a threatened species in southern Quebec that is being inventoried to determine abundance and potential habitat by the Quebec Ministry of Natural Resources and Wildlife. In collaboration with that program and using spring leaf-off aerial orthophotos of Gatineau Park, attributes associated with known habitat criteria were analyzed: wetlands with open water, vegetation mounds for camouflage and thermoregulation, and logs for spring sun-basking. Pixel-based classification to separate wetlands from other land cover types was followed by object-based segmentation and rule-based classification of within-wetland vegetation and logs. Classifications integrated several image characteristics including texture, context, shape, area and spectral attributes. Field data and visual interpretation showed the accuracies of wetland and within-wetland habitat feature classifications to be over 82.5%. The wetland classification results were used to develop a ranked potential habitat suitability map for Blanding's turtle that can be employed in conservation planning and management.

  14. [Software version and medical device software supervision].

    PubMed

    Peng, Liang; Liu, Xiaoyan

    2015-01-01

    The importance of the software version in medical device software supervision does not receive enough attention at present. First, the role of the software version in medical device software supervision is discussed; then the necessity of tracking software versions is analyzed, based on a discussion of common misunderstandings about software versions. Finally, concrete suggestions are proposed on software version naming rules, the supervision of software versions in medical devices, and a software version supervision scheme.

  15. A bio-inspired method and system for visual object-based attention and segmentation

    NASA Astrophysics Data System (ADS)

    Huber, David J.; Khosla, Deepak

    2010-04-01

    This paper describes a method and system of human-like attention and object segmentation in visual scenes that (1) attends to regions in a scene in their rank of saliency in the image, (2) extracts the boundary of an attended proto-object based on feature contours, and (3) can be biased to boost the attention paid to specific features in a scene, such as those of a desired target object in static and video imagery. The purpose of the system is to identify regions of a scene of potential importance and extract the region data for processing by an object recognition and classification algorithm. The attention process can be performed in a default, bottom-up manner or a directed, top-down manner which will assign a preference to certain features over others. One can apply this system to any static scene, whether that is a still photograph or imagery captured from video. We employ algorithms that are motivated by findings in neuroscience, psychology, and cognitive science to construct a system that is novel in its modular and stepwise approach to the problems of attention and region extraction, its application of a flooding algorithm to break apart an image into smaller proto-objects based on feature density, and its ability to join smaller regions of similar features into larger proto-objects. This approach allows many complicated operations to be carried out by the system in a very short time, approaching real-time. A researcher can use this system as a robust front-end to a larger system that includes object recognition and scene understanding modules; it is engineered to function over a broad range of situations and can be applied to any scene with minimal tuning from the user.
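
    A minimal sketch in the spirit of the flooding step described above: splitting an image into proto-object regions by flooding a feature-density map from its local maxima, here via scikit-image's watershed; the saliency map is synthetic and stands in for the system's feature contours.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    rng = np.random.default_rng(1)
    saliency = ndi.gaussian_filter(rng.random((256, 256)), sigma=8)

    # Seed the flood at local maxima of the saliency map.
    coords = peak_local_max(saliency, min_distance=20)
    markers = np.zeros(saliency.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

    # Flooding the negated map grows one region around each salient peak.
    proto_objects = watershed(-saliency, markers)
    ```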

  16. Transmission of object based fine-granular-scalability video over networks

    NASA Astrophysics Data System (ADS)

    Shi, Xu-li; Jin, Zhi-cheng; Teng, Guo-wei; Zhang, Zhao-yang; An, Ping; Xiao, Guang

    2006-05-01

    How to transmit video streams over the Internet and wireless networks is a hot focus of current research on video standards. One of the key methods is FGS (Fine-Granular-Scalability), supported by MPEG-4, which can always adapt to varying network bandwidth, although at some sacrifice of coding efficiency. An object-based video coding algorithm was first included in the MPEG-4 standard, enabling interactive video. However, real-time segmentation of the VOP (video object plane) is difficult, which limits the application of the MPEG-4 standard in interactive video. H.264/AVC is the up-to-date video coding standard, which enhances compression performance and provides a network-friendly video representation. In this paper, we propose a new Object-Based FGS (OBFGS) coding algorithm embedded in H.264/AVC that differs from the one in MPEG-4. After optimization of the algorithms for the H.264 encoder, the FGS first finishes the base-layer coding. Moving VOPs are then extracted using the base-layer information of motion vectors and DCT coefficients. The sparse motion vector field of each P-frame, composed of 4*4, 4*8 and 8*4 blocks in the base layer, is interpolated. The DCT coefficients of each I-frame are calculated using information from spatial intra-prediction. After forward-projecting each P-frame vector to the immediately adjacent I-frame, the method extracts moving VOPs using a recursive 4*4 block classification process. Only the blocks that belong to the moving VOP, at 4*4 block-level accuracy, are coded to produce the enhancement-layer stream. Experimental results show that the proposed system can obtain high quality for the VOPs of interest at the cost of some coding efficiency.
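
    A minimal sketch of densifying a sparse block-level motion-vector field, in the spirit of the interpolation step above; the uniform 4*4 blocks and vector values are illustrative, whereas a real base layer mixes 4*4, 4*8 and 8*4 partitions.

    ```python
    import numpy as np

    def densify(block_mv, block_size=4):
        """Nearest-neighbour upsampling of per-block motion vectors to pixels."""
        # block_mv: (H_blocks, W_blocks, 2) array of (dx, dy) vectors
        return np.repeat(np.repeat(block_mv, block_size, axis=0),
                         block_size, axis=1)

    mv = np.zeros((4, 4, 2))
    mv[1:3, 1:3] = [3.0, -1.0]  # a small moving region
    dense = densify(mv)         # (16, 16, 2) per-pixel motion field
    ```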

  17. Object-based class modelling for multi-scale riparian forest habitat mapping

    NASA Astrophysics Data System (ADS)

    Strasser, Thomas; Lang, Stefan

    2015-05-01

    Object-based class modelling allows for mapping complex, hierarchical habitat systems. The riparian zone, including its forests, represents such a complex ecosystem. Forests within riparian zones are biologically highly productive and characterized by rich biodiversity; they are thus considered of high community interest, with an imperative to be protected and regularly monitored. Satellite earth observation (EO) provides tools for capturing the current state of forest habitats, such as forest composition including the intermixture of non-native tree species. Here we present a semi-automated object-based image analysis (OBIA) approach for the mapping of riparian forests by applying class modelling of habitats based on the European Nature Information System (EUNIS) habitat classifications and the European Habitats Directive (HabDir) Annex 1. A very high resolution (VHR) WorldView-2 satellite image provided the required spatial and spectral detail for a multi-scale image segmentation and rule-base composition to generate a six-level hierarchical representation of riparian forest habitats. Habitats were thereby hierarchically represented within an image object hierarchy as forest stands, stands of homogeneous tree species and single trees represented by sunlit tree crowns. 522 EUNIS level 3 (EUNIS-3) habitat patches with a mean patch size (MPS) of 12,349.64 m2 were modelled from 938 forest stand patches (MPS = 6868.20 m2) and 43,742 tree stand patches (MPS = 140.79 m2). The delineation quality of the modelled EUNIS-3 habitats (focal level) was quantitatively assessed against an expert-based visual interpretation, showing a mean deviation of 11.71%.

  18. Evaluation of satellite-based precipitation estimates in winter season using an object-based approach

    NASA Astrophysics Data System (ADS)

    Li, J.; Hsu, K.; AghaKouchak, A.; Sorooshian, S.

    2012-12-01

    Verification has become an integral component of satellite precipitation algorithms and products. A number of object-based verification methods have been proposed to provide diagnostic information regarding the precipitation products' ability to capture the spatial pattern, intensity, and placement of precipitation. However, most object-based methods are not capable of investigating precipitation objects at the storm scale. In this study, an image processing approach known as watershed segmentation was adopted to detect storm-scale rainfall objects. Then, a fuzzy logic-based technique was utilized to diagnose and analyze storm-scale object attributes, including centroid distance, area ratio, intersection area ratio and orientation angle difference. Three verification metrics (i.e., false alarm ratio, missing ratio and overall membership score) were generated for validation and verification. Three satellite-based precipitation products, PERSIANN, CMORPH and 3B42RT, were evaluated against the NOAA stage IV MPE multi-sensor composite rain analysis at 0.25° by 0.25° on a daily scale in the winter season of 2010 over the contiguous United States. The winter season is dominated by frontal systems, which usually have larger area coverage. All three products and the stage IV observation tend to find large storm objects. With respect to the evaluation attributes, PERSIANN tends to obtain a larger area ratio and consequently has a larger centroid distance to the stage IV observations, while 3B42RT is found to be closer to stage IV in object size. All evaluated products give small orientation angle differences but vary significantly in missing ratio and false alarm ratio. This implies that satellite estimates can fail to detect storms in winter. The overall membership scores are close for all three products, which indicates that all three satellite-based precipitation products perform well for capturing the spatial and geometric characteristics of
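
    A minimal sketch of extracting the storm-object attributes compared above (centroid distance, area ratio, orientation angle difference) from two binary rain masks via scikit-image region properties; the masks are synthetic stand-ins for the satellite and stage IV fields.

    ```python
    import numpy as np
    from skimage.measure import label, regionprops

    def largest_object(mask):
        regions = regionprops(label(mask))
        return max(regions, key=lambda r: r.area)

    rng = np.random.default_rng(2)
    obs = rng.random((100, 100)) > 0.7  # stand-in for the stage IV rain mask
    sat = rng.random((100, 100)) > 0.7  # stand-in for a satellite rain mask

    o, s = largest_object(obs), largest_object(sat)
    centroid_distance = np.hypot(o.centroid[0] - s.centroid[0],
                                 o.centroid[1] - s.centroid[1])
    area_ratio = min(o.area, s.area) / max(o.area, s.area)
    orientation_diff = abs(o.orientation - s.orientation)  # radians
    ```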

  19. Studies on pansharpening and object-based classification of Worldview-2 multispectral image

    NASA Astrophysics Data System (ADS)

    Wyczałek, I.; Wyczałek, E.

    2013-12-01

    The new information contained in the four additional spectral bands of high-resolution images from the WorldView-2 satellite sensor should provide a visible improvement in the quality of analysis of large-scale phenomena occurring at the ground. A selected part of an image of Poznan was analyzed in order to verify these possibilities in relation to the urban environment. It includes a riverside green area and a number of adjacent buildings. Attention was focused on two components of object-oriented analysis: sharpening the image and its classification. In terms of pansharpening, the aim was to obtain a clear picture of terrain objects in detail, which should lead to a correct division of the image into homogeneous segments and a subsequent fine classification. The intention was to make it possible to separate small field objects within the set of classes. The task was carried out using various computer programs that enable the development and analysis of raster data (IDRISI Andes, ESRI ArcGIS 9.3, eCognition Developer 8) and some of our own computational modules. The main scientific objective of this study was to determine how much the information from the new spectral image layers, after pansharpening, affects the quality of object-based classification of land cover in green and built-up areas of the city. The basis for improving the quality of the classification was the above-mentioned ability to use additional data from the new spectral bands of the WorldView-2 image. To assess the quality of the classification, we used a test that examines only the uncertain areas of the picture, that is, those which lie on differently classified types of land cover. The outcome of the assessment confirmed the thesis of a positive, albeit small, impact of the additional spectral channels on the result of object-based classification; pansharpening itself also only slightly improves the quality of the classified image.
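
    A minimal sketch of one simple pansharpening scheme (the Brovey transform), illustrating the kind of fusion step discussed above; the paper compares several software packages and does not prescribe this particular formula, and the arrays below are synthetic.

    ```python
    import numpy as np

    def brovey(ms, pan, eps=1e-6):
        """ms: (bands, H, W) multispectral resampled to pan resolution; pan: (H, W)."""
        intensity = ms.mean(axis=0)
        return ms * (pan / (intensity + eps))  # rescale each band by pan/intensity

    rng = np.random.default_rng(3)
    ms = rng.random((4, 64, 64))  # 4 resampled multispectral bands
    pan = rng.random((64, 64))    # panchromatic band
    sharpened = brovey(ms, pan)
    ```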

  20. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1990-01-01

    Reviewed are six software packages for Apple and/or IBM computers. Included are "Autograph,""The New Game Show,""Science Probe-Earth Science,""Pollution Patrol,""Investigating Plant Growth," and "AIDS: The Investigation." Discussed are the grade level, function, availability, cost, and hardware requirements of each. (CW)

  1. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1989

    1989-01-01

    Reviews of seven software packages are presented including "The Environment I: Habitats and EcoSystems; II Cycles and Interactions"; "Super Sign Maker"; "The Great Knowledge Race: Substance Abuse"; "Exploring Science: Temperature"; "Fast Food Calculator and RD Aide"; "The Human Body: Circulation and Respiration" and "Forces in Liquids and Gases."…

  2. Star Software.

    ERIC Educational Resources Information Center

    Kloza, Brad

    2000-01-01

    Presents a collection of computer software programs designed to spark learning enthusiasm at every grade level and across the curriculum. They include Reader Rabbit's Learn to Read, Spelling Power, Mind Twister Math, Community Construction Kit, Breaking the Code, Encarta Africana 2000, Virtual Serengeti, Operation: Frog (Deluxe), and My First…

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1988

    1988-01-01

    Reviews five software packages for use with school age children. Includes "Science Toolkit Module 2: Earthquake Lab"; "Adaptations and Identification"; "Geoworld"; "Body Systems II Series: The Blood System: A Liquid of Life," all for Apple II, and "Science Courseware: Life Science/Biology" for Apple II and IBM. (CW)

  4. Software Update.

    ERIC Educational Resources Information Center

    Currents, 2000

    2000-01-01

    A chart of 40 alumni-development database systems provides information on vendor/Web site, address, contact/phone, software name, price range, minimum suggested workstation/suggested server, standard reports/reporting tools, minimum/maximum record capacity, and number of installed sites/client type. (DB)

  5. Software Comparison

    NASA Technical Reports Server (NTRS)

    Blanchard, D. C.

    1986-01-01

    Software Comparison Package (SCP) compares similar files. Normally, these are 90-character files produced by CDC UPDATE utility from program libraries that contain FORTRAN source code plus identifier. SCP also used to compare load maps, cross-reference outputs, and UPDATE corrections sets. Helps wherever line-by-line comparison of similarly structured files required.

  6. Software Reviews.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1990

    1990-01-01

    Reviewed are two computer software packages: "Super Solvers Midnight Rescue!" a problem-solving program for IBM PCs; and "Interactive Physics," a simulation program for the Macintosh computer. The functions of the package are discussed including strengths and weaknesses and teaching suggestions. (CW)

  7. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1989-01-01

    Describes three software packages: (1) "MacMendeleev"--database/graphic display for chemistry, grades 10-12, Macintosh; (2) "Geometry One: Foundations"--geometry tutorial, grades 7-12, IBM; (3) "Mathematics Exploration Toolkit"--algebra and calculus tutorial, grades 8-12, IBM. (MVL)

  8. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach being taken by the STB toward providing an economical solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB's new FORTRAN standards, methodology, and the concepts for a software environment.

  9. Software Patents.

    ERIC Educational Resources Information Center

    Burke, Edmund B.

    1994-01-01

    Outlines basic patent law information that pertains to computer software programs. Topics addressed include protection in other countries; how to obtain patents; kinds of patents; duration; classes of patentable subject matter, including machines and processes; patentability searches; experimental use prior to obtaining a patent; and patent…

  10. Software Reviews.

    ERIC Educational Resources Information Center

    Mathematics and Computer Education, 1987

    1987-01-01

    Presented are reviews of several microcomputer software programs. Included are reviews of: (1) Microstat (Zenith); (2) MathCAD (MathSoft); (3) Discrete Mathematics (True Basic); (4) CALCULUS (True Basic); (5) Linear-Kit (John Wiley); and (6) Geometry Sensei (Broderbund). (RH)

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1988-01-01

    Reviews two software packages, "Solutions Unlimited" and "BASIC Data Base System." Provides a description, summary, strengths and weaknesses, availability and costs. Includes reviews of three structured BASIC packages: "True BASIC (2.0)"; "Turbo BASIC (1.0)"; and "QuickBASIC (3.0)." Explains significant features such as graphics, costs,…

  12. Reviews: Software.

    ERIC Educational Resources Information Center

    Mackenzie, Norma N.; And Others

    1988-01-01

    Reviews four computer software packages including: "The Physical Science Series: Sound" which demonstrates making waves, speed of sound, doppler effect, and human hearing; "Andromeda" depicting celestial motions in any direction; "Biology Quiz: Humans" covering chemistry, cells, viruses, and human biology; and "MacStronomy" covering information on…

  13. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)

  14. Software Reviews.

    ERIC Educational Resources Information Center

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  15. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1989-01-01

    Six software packages are described in this review. Included are "Molecules and Atoms: Exploring the Essence of Matter"; "Heart Probe"; "GM Sunraycer"; "Six Puzzles"; "Information Laboratory--Life Science"; and "Science Test Builder." Hardware requirements, prices, and a summary of the abilities of each program are presented. (CW)

  16. Software Reviews.

    ERIC Educational Resources Information Center

    Wulfson, Stephen, Ed.

    1989-01-01

    Presents comments by classroom teachers on software for science teaching including topics on: the size of a molecule, matter, leaves, vitamins and minerals, dinosaurs, and collecting and measuring data. Each is an Apple computer series. Availability and costs are included. (RT)

  17. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1987-01-01

    Reviewed are three computer software programs: the Astronomer (astronomy program for middle school students and older); Hands-on-Statistics: Explorations with a Microcomputer (statistics program for secondary school students and older); and CATGEN (a genetics program for secondary school students and older). Each review provides information on:…

  18. Software Review.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game management. (CW)

  19. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are seven computer software packages for IBM and/or Apple Computers. Included are "Windows on Science: Volume 1--Physical Science"; "Science Probe--Physical Science"; "Wildlife Adventures--Grizzly Bears"; "Science Skills--Development Programs"; "The Clean Machine"; "Rock Doctor"; and "Geology Search." Cost, quality, hardware, and…

  20. Software Reviews.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed are two computer software programs for Apple II computers on weather for upper elementary and middle school grades. "Weather" introduces the major factors (temperature, humidity, wind, and air pressure) affecting weather. "How Weather Works" uses simulation and auto-tutorial formats on sun, wind, fronts, clouds, and storms. (YP)

  1. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1989-01-01

    Reviews three software packages: (1) "The Weather Machine Courseware Kit" for grades 7-12; (2) "Exploring Measurement, Time, and Money--Level I," for primary level mathematics; and (3) "Professor DOS with SmartGuide for DOS" providing an extensive tutorial covering DOS 2.1 to 4.0. Discusses the strengths and weaknesses of each package. (YP)

  2. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1990

    1990-01-01

    Reviewed are six computer software packages including "Invisible Bugs,""Chaos Plus...,""The Botanist's Apprentice,""A Baby is Born," Storyboard Plus-Version 2.0," and "Weather." Hardware requirements, functions, performance, and use in the classroom are discussed. (CW)

  3. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1988

    1988-01-01

    Reviews six software packages for use with school age children ranging from grade 3 to grade 12. Includes "The Microcomputer Based Lab Project: Motion, Sound"; "Genetics"; "Geologic History"; "The Microscope Simulator"; and "Wiz Works" all for Apple II and "Reading for Information: Level II" for IBM. (CW)

  4. Software Reviews.

    ERIC Educational Resources Information Center

    Mackenzie, Norma N.; And Others

    1988-01-01

    Describes computer software for use with various age groups. Topics include activities involving temperature, simulations, earth science, the circulatory system, human body, reading in science, and ecology. Provides information on equipment needed, availability, package contents, and price. Comments of reviews are presented by classroom teachers.…

  5. Segmentation and Classification of Remotely Sensed Images: Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Syed, Abdul Haleem

    Land-use-and-land-cover (LULC) mapping is crucial in precision agriculture, environmental monitoring, disaster response, and military applications. The demand for improved and more accurate LULC maps has led to the emergence of a key methodology known as Geographic Object-Based Image Analysis (GEOBIA). The core idea of the GEOBIA for an object-based classification system (OBC) is to change the unit of analysis from single-pixels to groups-of-pixels called 'objects' through segmentation. While this new paradigm solved problems and improved global accuracy, it also raised new challenges such as the loss of accuracy in categories that are less abundant, but potentially important. Although this trade-off may be acceptable in some domains, the consequences of such an accuracy loss could be potentially fatal in others (for instance, landmine detection). This thesis proposes a method to improve OBC performance by eliminating such accuracy losses. Specifically, we examine the two key players of an OBC system: Hierarchical Segmentation and Supervised Classification. Further, we propose a model to understand the source of accuracy errors in minority categories and provide a method called Scale Fusion to eliminate those errors. This proposed fusion method involves two stages. First, the characteristic scale for each category is estimated through a combination of segmentation and supervised classification. Next, these estimated scales (segmentation maps) are fused into one combined-object-map. Classification performance is evaluated by comparing results of the multi-cut-and-fuse approach (proposed) to the traditional single-cut (SC) scale selection strategy. Testing on four different data sets revealed that our proposed algorithm improves accuracy on minority classes while performing just as well on abundant categories. Another active obstacle, presented by today's remotely sensed images, is the volume of information produced by our modern sensors with high spatial and

  6. Quantification of Impervious Surfaces Along the Wasatch Front, Utah: AN Object-Based Image Analysis Approach to Identifying AN Indicator for Wetland Stress

    NASA Astrophysics Data System (ADS)

    Leydsman-McGinty, E. I.; Ramsey, R. D.; McGinty, C.

    2013-12-01

    The Remote Sensing/GIS Laboratory at Utah State University, in cooperation with the United States Environmental Protection Agency, is quantifying impervious surfaces for three watershed sub-basins in Utah. The primary objective of developing watershed-scale quantifications of impervious surfaces is to provide an indicator of potential impacts to wetlands that occur within the Wasatch Front and along the Great Salt Lake. A geospatial layer of impervious surfaces can assist state agencies involved with Utah's Wetlands Program Plan (WPP) in understanding the impacts of impervious surfaces on wetlands, as well as support them in carrying out goals and actions identified in the WPP. The three watershed sub-basins, Lower Bear-Malad, Lower Weber, and Jordan, span the highly urbanized Wasatch Front and are consistent with focal areas in need of wetland monitoring and assessment as identified in Utah's WPP. Geospatial layers of impervious surface currently exist in the form of national and regional land cover datasets; however, these datasets are too coarse to be utilized in fine-scale analyses. In addition, the pixel-based image processing techniques used to develop these coarse datasets have proven insufficient in smaller scale or detailed studies, particularly when applied to high-resolution satellite imagery or aerial photography. Therefore, object-based image analysis techniques are being implemented to develop the geospatial layer of impervious surfaces. Object-based image analysis techniques employ a combination of both geospatial and image processing methods to extract meaningful information from high-resolution imagery. Spectral, spatial, textural, and contextual information is used to group pixels into image objects and then subsequently used to develop rule sets for image classification. eCognition, an object-based image analysis software program, is being utilized in conjunction with one-meter resolution National Agriculture Imagery Program (NAIP) aerial

  7. Object-Based Classification as an Alternative Approach to the Traditional Pixel-Based Classification to Identify Potential Habitat of the Grasshopper Sparrow

    NASA Astrophysics Data System (ADS)

    Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles

    2008-01-01

    The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a new method that can achieve the same objective based on the segmentation of spectral bands of the image creating homogeneous polygons with regard to spatial or spectral characteristics. The segmentation algorithm does not solely rely on the single pixel value, but also on shape, texture, and pixel spatial continuity. The object-based classification is a knowledge-based process where an interpretation key is developed using ground control points and objects are assigned to specific classes according to threshold values of determined spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied on other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility to use the contextual information associated to objects and the ability to eliminate unsuitable areas in the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues of Grasshopper Sparrow habitat is also provided.
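
    A minimal sketch of the kind of threshold rule set described above, applied to a table of per-object attributes exported after segmentation; the attribute names and threshold values are hypothetical, not those of the actual habitat model.

    ```python
    import pandas as pd

    objects = pd.DataFrame({
        "mean_nir": [0.42, 0.18, 0.35],  # per-object spectral mean
        "texture":  [0.10, 0.55, 0.12],  # per-object texture measure
        "area_ha":  [2.5, 0.4, 6.0],     # object size
    })

    def classify(obj):
        # Grassland-like spectral response, low texture, large enough patch.
        if obj.mean_nir > 0.3 and obj.texture < 0.2 and obj.area_ha > 1.0:
            return "potential_habitat"
        return "unsuitable"

    objects["class"] = objects.apply(classify, axis=1)
    ```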

  8. Object-based classification as an alternative approach to the traditional pixel-based classification to identify potential habitat of the grasshopper sparrow.

    PubMed

    Jobin, Benoît; Labrecque, Sandra; Grenier, Marcelle; Falardeau, Gilles

    2008-01-01

    The traditional method of identifying wildlife habitat distribution over large regions consists of pixel-based classification of satellite images into a suite of habitat classes used to select suitable habitat patches. Object-based classification is a new method that can achieve the same objective based on the segmentation of spectral bands of the image creating homogeneous polygons with regard to spatial or spectral characteristics. The segmentation algorithm does not solely rely on the single pixel value, but also on shape, texture, and pixel spatial continuity. The object-based classification is a knowledge-based process where an interpretation key is developed using ground control points and objects are assigned to specific classes according to threshold values of determined spectral and/or spatial attributes. We developed a model using the eCognition software to identify suitable habitats for the Grasshopper Sparrow, a rare and declining species found in southwestern Québec. The model was developed in a region with known breeding sites and applied on other images covering adjacent regions where potential breeding habitats may be present. We were successful in locating potential habitats in areas where dairy farming prevailed but failed in an adjacent region covered by a distinct Landsat scene and dominated by annual crops. We discuss the added value of this method, such as the possibility to use the contextual information associated to objects and the ability to eliminate unsuitable areas in the segmentation and land cover classification processes, as well as technical and logistical constraints. A series of recommendations on the use of this method and on conservation issues of Grasshopper Sparrow habitat is also provided.

  9. Computer-Aided Software Engineering - An approach to real-time software development

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  10. Object Based Image Analysis Combining High Spatial Resolution Imagery and Laser Point Clouds for Urban Land Cover

    NASA Astrophysics Data System (ADS)

    Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong

    2016-06-01

    With the rapid development of sensor technology, high spatial resolution imagery and airborne Lidar point clouds can now be captured, which makes classification, extraction, evaluation and analysis of a broad range of object features available. High resolution imagery, Lidar datasets and parcel maps can be widely used as information carriers for classification. Refinement of object classification is thereby made possible for urban land cover. The paper presents an approach to object based image analysis (OBIA) combining high spatial resolution imagery and airborne Lidar point clouds. The advanced workflow for urban land cover is designed with four components. Firstly, the colour-infrared TrueOrtho photo and laser point clouds were pre-processed to derive the parcel map of water bodies and the nDSM, respectively. Secondly, image objects are created via multi-resolution image segmentation, integrating the scale parameter and the colour and shape properties with a compactness criterion; the image can thus be subdivided into separate object regions. Thirdly, image object classification is performed on the basis of the segmentation and a rule set in the form of a knowledge-based decision tree. The image objects are classified into six classes: water bodies, low vegetation/grass, tree, low building, high building and road. Finally, in order to assess the validity of the classification results for the six classes, accuracy assessment is performed by comparing randomly distributed reference points on the TrueOrtho imagery with the classification results, forming the confusion matrix and calculating the overall accuracy and Kappa coefficient. The study area focuses on the test site Vaihingen/Enz, with a patch of test data from the benchmark of the ISPRS WG III/4 test project. The classification results show high overall accuracy for most types of urban land cover: overall accuracy is 89.5% and the Kappa coefficient equals 0.865. The OBIA approach provides an effective and convenient way to combine high
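
    A minimal sketch of the accuracy assessment described above: building a confusion matrix from reference and classified labels, then computing overall accuracy and the Kappa coefficient with plain NumPy; the label vectors are illustrative.

    ```python
    import numpy as np

    def confusion_matrix(reference, predicted, n_classes):
        cm = np.zeros((n_classes, n_classes), dtype=int)
        for r, p in zip(reference, predicted):
            cm[r, p] += 1
        return cm

    reference = np.array([0, 1, 2, 2, 3, 4, 5, 1, 0, 2])
    predicted = np.array([0, 1, 2, 1, 3, 4, 5, 1, 0, 2])
    cm = confusion_matrix(reference, predicted, n_classes=6)

    n = cm.sum()
    overall_accuracy = np.trace(cm) / n
    # Chance agreement from the row/column marginals.
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    kappa = (overall_accuracy - expected) / (1 - expected)
    ```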

  11. The EOSDIS software challenge

    NASA Astrophysics Data System (ADS)

    Jaworski, Allan

    1993-08-01

    The Earth Observing System (EOS) Data and Information System (EOSDIS) will serve as a major resource for the earth science community, supporting both command and control of complex instruments onboard the EOS spacecraft and the archiving, distribution, and analysis of data. The scale of EOSDIS and the volume of multidisciplinary research to be conducted using EOSDIS resources will produce unparalleled needs for technology transparency, data integration, and system interoperability. The scale of this effort far exceeds that of any previous scientific data system in its breadth and in its operational and performance needs. Modern hardware technology can meet the EOSDIS technical challenge. Multiprocessing speeds of many gigaflops are being realized by modern computers. Online storage disk, optical disk, and videocassette libraries with storage capacities of many terabytes are now commercially available. Radio frequency and fiber optics communications networks with gigabit rates are demonstrable today. It remains, of course, to perform the system engineering to establish the requirements, architectures, and designs that will implement the EOSDIS systems. Software technology, however, has not enjoyed the price/performance advances of hardware. Although we have learned to engineer hardware systems with several orders of magnitude greater complexity and performance than those built in the 1960s, we have not made comparable progress in dramatically reducing the cost of software development. This lack of progress may significantly reduce our capability to achieve economically the types of highly interoperable, responsive, integrated, and productive environments needed by the earth science community. This paper describes some of the EOSDIS software requirements and current activities in the software community which are applicable to meeting the EOSDIS challenge. Some of these areas include intelligent user interfaces, software reuse libraries, and domain engineering

  12. Space Station Software Issues

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor); Beskenis, S. (Editor)

    1985-01-01

    Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.

  13. Semi-Automated Classification of Gray Scale Aerial Photographs using Geographic Object Based Image Analysis (GEOBIA) Technique

    NASA Astrophysics Data System (ADS)

    Harb Rabia, Ahmed; Terribile, Fabio

    2013-04-01

    Aerial photography is an important source of high resolution remotely sensed data. Before 1970, aerial photographs were the only remote sensing data source for land use and land cover classification. Using these old aerial photographs improves the final output of land use and land cover change detection. However, classic techniques of aerial photograph classification, like manual interpretation or on-screen digitization, require great experience, long processing times and vast effort. A new technique needs to be developed in order to reduce processing time and effort and to give better results. Geographic object based image analysis (GEOBIA) is a newly developed area of Geographic Information Science and remote sensing in which automatic segmentation of images into objects of similar spectral, temporal and spatial characteristics is undertaken. Unlike pixel-based techniques, GEOBIA deals with object properties such as texture, square fit, roundness and many other properties that can improve classification results. The GEOBIA technique can be divided into two main steps: segmentation and classification. The segmentation process groups adjacent pixels into objects of similar spectral and spatial characteristics. The classification process assigns classes to the generated objects based on the characteristics of the individual objects. This study aimed to use the GEOBIA technique to develop a novel approach for land use and land cover classification of aerial photographs that saves time and effort and gives improved results. Aerial photographs from 1954 of Valle Telesina in Italy were used in this study. Images were rectified and georeferenced in ArcMap using topographic maps. Images were then processed in eCognition software to generate the land use and land cover map of 1954. A decision tree rule set was developed in eCognition to classify the images, and finally nine classes of general land use and land cover in the study area were recognized (forest, tree stripes, agricultural

  14. Automated digital image analysis of islet cell mass using Nikon's inverted eclipse Ti microscope and software to improve engraftment may help to advance the therapeutic efficacy and accessibility of islet transplantation across centers.

    PubMed

    Gmyr, Valery; Bonner, Caroline; Lukowiak, Bruno; Pawlowski, Valerie; Dellaleau, Nathalie; Belaich, Sandrine; Aluka, Isanga; Moermann, Ericka; Thevenet, Julien; Ezzouaoui, Rimed; Queniat, Gurvan; Pattou, Francois; Kerr-Conte, Julie

    2015-01-01

    Reliable assessment of islet viability, mass, and purity must be met prior to transplanting an islet preparation into patients with type 1 diabetes. The standard method for quantifying human islet preparations is direct microscopic analysis of dithizone-stained islet samples, but this technique may be susceptible to inter-/intraobserver variability, which may induce false positive/negative islet counts. Here we describe a simple, reliable, automated digital image analysis (ADIA) technique for accurately quantifying islets into total islet number, islet equivalent number (IEQ), and islet purity before islet transplantation. Islets were isolated and purified from n = 42 human pancreata according to the automated method of Ricordi et al. For each preparation, three islet samples were stained with dithizone and expressed as IEQ number. Islets were analyzed manually by microscopy or automatically quantified using Nikon's inverted Eclipse Ti microscope with built-in NIS-Elements Advanced Research (AR) software. The ADIA method significantly enhanced the number of islet preparations eligible for engraftment compared to the standard manual method (p < 0.001). Comparisons of the individual methods showed good correlations between mean values of IEQ number (r(2) = 0.91) and total islet number (r(2) = 0.88), which increased to r(2) = 0.93 when islet surface area was estimated comparatively with IEQ number. The ADIA method showed very high intraobserver reproducibility compared to the standard manual method (p < 0.001). However, islet purity was routinely estimated as significantly higher with the manual method than with the ADIA method (p < 0.001). The ADIA method also detected small islets between 10 and 50 µm in size. Automated digital image analysis utilizing the Nikon Instruments software is an unbiased, simple, and reliable teaching tool to comprehensively assess the individual size of each islet cell preparation prior to transplantation. Implementation of this
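
    A minimal sketch of the IEQ convention referred to above: one islet equivalent is the volume of a standard islet 150 µm in diameter, so each measured islet contributes (d/150)^3; the diameters below are illustrative, and production pipelines typically bin diameters into ranges.

    ```python
    import numpy as np

    diameters_um = np.array([60.0, 120.0, 150.0, 210.0, 45.0])

    # Volume of each islet relative to a 150 um reference islet.
    ieq_per_islet = (diameters_um / 150.0) ** 3
    total_ieq = ieq_per_islet.sum()
    print(f"{len(diameters_um)} islets = {total_ieq:.2f} IEQ")
    ```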

  15. Three Dimensional Reconstruction of Large Cultural Heritage Objects Based on UAV Video and TLS Data

    NASA Astrophysics Data System (ADS)

    Xu, Z.; Wu, T. H.; Shen, Y.; Wu, L.

    2016-06-01

    This paper investigates the synergetic use of an unmanned aerial vehicle (UAV) and a terrestrial laser scanner (TLS) in the 3D reconstruction of cultural heritage objects. Rather than capturing still images, the UAV, which carries a consumer digital camera, is used to collect dynamic videos to overcome its limited endurance capacity. A set of 3D point clouds is then generated from the video image sequences using the automated structure-from-motion (SfM) and patch-based multi-view stereo (PMVS) methods. The TLS is used to collect information that is beyond the reach of the UAV imaging, e.g., parts of the building facades. A coarse-to-fine method is introduced to integrate the two sets of point clouds, from UAV image reconstruction and TLS scanning, for a complete 3D reconstruction. For increased reliability, a variant of the ICP algorithm is introduced that uses local terrain-invariant regions in the combined registration. The experimental study was conducted on the Tulou cultural heritage buildings in Fujian province, China, focused on one of the Tulou clusters built several hundred years ago. Results show a digital 3D model of the Tulou cluster with complete coverage and textural information. This paper demonstrates the usability of the proposed method for efficient 3D reconstruction of heritage objects based on UAV video and TLS data.
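
    A minimal sketch of a textbook point-to-point ICP iteration, the family of algorithms the paper adapts for aligning the UAV photogrammetric and TLS point clouds; this is not the paper's terrain-invariant variant, and the inputs are assumed to be (N, 3) NumPy arrays.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def icp(source, target, iterations=20):
        """Rigidly align source to target by iterative closest points."""
        src = source.copy()
        tree = cKDTree(target)
        for _ in range(iterations):
            _, idx = tree.query(src)       # nearest target point per source point
            matched = target[idx]
            # Best-fit rigid transform via SVD (Kabsch algorithm).
            cs, ct = src.mean(axis=0), matched.mean(axis=0)
            H = (src - cs).T @ (matched - ct)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:       # avoid reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = ct - R @ cs
            src = src @ R.T + t
        return src
    ```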

  16. Object-based approach to national land cover mapping using HJ satellite imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Xiaosong; Yuan, Quanzhi; Liu, Yu

    2014-01-01

    To meet the need for carbon storage estimates in ecosystems for a national carbon strategy, we introduce a consistent database of China land cover. The Chinese Huan Jing (HJ) satellite has proven efficient in the cloud-free acquisition of seasonal image series in a monsoon region and in vegetation identification for mesoscale land cover mapping. Thirty-eight classes of level II land cover are generated based on the Land Cover Classification System of the United Nations Food and Agriculture Organization, which follows a standard and quantitative definition. Twenty-four layers of derived spectral, environmental, and spatial features compose the classification database. An object-based approach characterizing additional nonspectral features is applied throughout the mapping, and multiscale segmentations are applied so that object boundaries match real-world conditions. This method fully employs spatial information, in addition to spectral characteristics, to improve classification accuracy. A hierarchical classification algorithm is employed, following step-by-step procedures that effectively control classification quality. The algorithm is divided into the dual structures of universal and local trees. Consistent universal trees suitable for most regions are applied first, followed by local trees that depend on the specific features of nine climate stratifications. The independent validation indicates that the overall accuracy reaches 86%.

  17. A supervised method for object-based 3D building change detection on aerial stereo images

    NASA Astrophysics Data System (ADS)

    Qin, R.; Gruen, A.

    2014-08-01

    There is a great demand for studying the changes of buildings over time. The current trend in building change detection combines the orthophoto and the DSM (Digital Surface Model). Pixel-based change detection methods are very sensitive to the quality of the images and DSMs, while object-based methods are more robust to these problems. In this paper, we propose a supervised method for building change detection. After a segment-based SVM (Support Vector Machine) classification with features extracted from the orthophoto and DSM, we focus on detecting building changes between different periods by measuring their height and texture differences, as well as their shapes. A decision tree analysis is used to assess the probability of change for each building segment, and a traffic-light system is used to indicate the status "change", "non-change" and "uncertain change" for building segments. The proposed method is applied to scanned aerial photos of the city of Zurich from 2002 and 2007, and the results demonstrate that our method is able to achieve high detection accuracy.
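
    A minimal sketch of the segment-level SVM classification step described above, with synthetic features standing in for those extracted from the orthophoto and DSM (e.g., mean colour, height statistics, texture).

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(4)
    X = rng.normal(size=(300, 8))     # 300 segments, 8 features each
    y = rng.integers(0, 2, size=300)  # 1 = building, 0 = non-building

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, y)
    segment_labels = clf.predict(X)   # per-segment building/non-building labels
    ```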

  18. An object-based approach for areal rainfall estimation and validation of atmospheric models

    NASA Astrophysics Data System (ADS)

    Trömel, Silke; Simmer, Clemens

    2012-02-01

    Integral radar volume descriptors (IRVDs) are introduced and discussed as a pathway to an object-based characterization of precipitation systems. A set of IRVD values characterizes the temporal development of a precipitation system, which constitutes the object. The IRVDs are based on the temporal evolution of the three-dimensional distribution of radar reflectivities produced by the objects. In a first step, a set of candidate descriptors is postulated that characterize a precipitating system observable by a scanning radar, e.g., the mean echo-top height or the temporal change of the bright-band depth of a raining system. In a second step, a statistical analysis identifies those descriptors that carry the most significant information about the surface precipitation yield of the system; these are the IRVDs whose values describe the objects. IRVDs are derived both from pseudo-radar observations retrieved from a weather forecast model and from real radar observations. Since different sets of IRVDs also suggest different precipitation generation mechanisms acting in the model and in reality, the IRVD concept is proposed as a more process-oriented approach to model validation. Finally, the potential of IRVDs to improve estimates of radar-derived precipitation system yields when used on top of Z-R relations is demonstrated.
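
    Since the descriptors are applied "on top of Z-R relations", a worked example of the underlying conversion may help. The snippet below inverts the classic Marshall-Palmer law Z = 200 R^1.6; the IRVD-based correction itself is specific to the paper and is not shown.

      import numpy as np

      def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
          """Invert a Z-R power law Z = a * R**b (Marshall-Palmer defaults);
          dbz is reflectivity in dBZ, the result is rain rate R in mm/h."""
          z = 10.0 ** (np.asarray(dbz) / 10.0)   # dBZ -> linear reflectivity
          return (z / a) ** (1.0 / b)

      print(rain_rate_from_dbz(40.0))   # a 40 dBZ echo gives roughly 11.5 mm/h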

  19. An object-based classification method for automatic detection of lunar impact craters from topographic data

    NASA Astrophysics Data System (ADS)

    Vamshi, Gasiganti T.; Martha, Tapas R.; Vinod Kumar, K.

    2016-05-01

    Identification of impact craters is a primary requirement for studying past geological processes such as impact history. Craters are also used as proxies for measuring the relative ages of various planetary or satellite bodies and help in understanding the evolution of planetary surfaces. In this paper, we present a new method using the object-based image analysis (OBIA) technique to detect impact craters of a wide range of sizes from topographic data. Multiresolution image segmentation of digital terrain models (DTMs) available from NASA's LRO mission was carried out to create objects. Subsequently, objects were classified into impact craters using shape and morphometric criteria, resulting in 95% detection accuracy. The methodology, developed in a training area in parts of Mare Imbrium in the form of a knowledge-based ruleset, detected impact craters with 90% accuracy when applied to another area. The minimum and maximum sizes (diameters) of impact craters detected in parts of Mare Imbrium by our method are 29 m and 1.5 km, respectively. Diameters of automatically detected impact craters show good correlation (R2 > 0.85) with those of manually detected impact craters.
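
    The shape criterion at the heart of such a ruleset can be sketched in a few lines. The fragment below filters labelled depressions by circularity (4*pi*A/P^2) using scikit-image; the thresholds are hypothetical stand-ins for the paper's knowledge-based rules.

      import numpy as np
      from skimage import measure

      def crater_candidates(depression_mask, min_circ=0.7, min_area_px=20):
          """Keep near-circular segmented depressions as crater candidates."""
          labels = measure.label(depression_mask)
          craters = []
          for region in measure.regionprops(labels):
              if region.area < min_area_px or region.perimeter == 0:
                  continue
              circ = 4.0 * np.pi * region.area / region.perimeter ** 2
              if circ >= min_circ:                 # 1.0 = perfect disc
                  craters.append((region.label, region.equivalent_diameter))
          return craters   # (label, diameter in pixels) pairs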

  20. Object-based image analysis for the impact of sewage pollution in Malad Creek, Mumbai, India.

    PubMed

    Shirke, Shivani; Pinto, Shannon M; Kushwaha, Vikash K; Mardikar, Trupti; Vijay, Ritesh

    2016-02-01

    Today, object-based image analysis provides an option for integrating spatial information beyond conventional pixel-based classification of high-resolution imagery. Because it has rarely been applied to pollution assessment, an attempt has been made here to assess the spatial extent of sewage pollution in Malad Creek, Mumbai, India. Based on multiresolution segmentation of an IRS P6 (LISS IV) image and the Normalized Difference Turbidity Index (NDTI), the various water quality regions in the creek were classified. The existing literature implies that the reflectance of turbid water is similar to that of bare soil, which gives positive NDTI values. In contrast, negative NDTI values are observed in the present study due to the presence of organic matter, which absorbs light and imparts turbidity; this is supported by the significant correlation between NDTI and turbidity. A strong relationship is observed between turbidity and water quality parameters, implying an impact of organic matter through discharges of sewage into the creek. Based on the classified regions and the water quality parameters, the extent of pollution was ranked as high, moderate, low and least. The methodology developed in the present study was successfully applied to an IKONOS image of the same study area for a different time frame. The approach will help in the impact assessment of sewage pollution and its spatial extent in other water bodies.
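
    For readers unfamiliar with the index, NDTI is commonly computed from the red and green bands, as in the short sketch below (a general formulation, assumed rather than taken from this paper).

      import numpy as np

      def ndti(red, green, eps=1e-12):
          """Normalized Difference Turbidity Index on reflectance arrays,
          commonly defined as (Red - Green) / (Red + Green)."""
          red, green = np.asarray(red, float), np.asarray(green, float)
          return (red - green) / (red + green + eps)

      # Clear water reflects more green than red, driving NDTI negative;
      # suspended sediment usually pushes it positive, whereas the organic
      # matter in this study kept values negative despite high turbidity.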

  1. A Framework for Geographic Object-Based Image Analysis (GEOBIA) based on geographic ontology

    NASA Astrophysics Data System (ADS)

    Gu, H. Y.; Li, H. T.; Yan, L.; Lu, X. J.

    2015-06-01

    GEOBIA (Geographic Object-Based Image Analysis) is not only a hot topic in current remote sensing and geographical research; it is believed to be a paradigm in remote sensing and GIScience. However, the lack of a systematic approach designed to conceptualize and formalize class definitions makes GEOBIA a highly subjective method that is difficult to reproduce. This paper aims to put forward a framework for GEOBIA based on geographic ontology theory that can faithfully implement the "geographic entities - image objects - geographic objects" correspondence. It consists of three steps: first, geographic entities are described by a geographic ontology; second, a semantic network model is built based on OWL (Web Ontology Language); finally, geographic objects are classified with decision rules or other classifiers. A case study of a farmland ontology was conducted to illustrate the framework. The strength of this framework is that it provides interpretation strategies and a global, objective and universal framework for GEOBIA, which avoids inconsistencies caused by different experts' experience and provides an objective model for image analysis.
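
    A fragment of what such an OWL-encoded ontology might look like, written with the rdflib library, is sketched below; the namespace, class names and comment are invented for illustration and do not come from the paper.

      from rdflib import Graph, Namespace, RDF, RDFS, Literal
      from rdflib.namespace import OWL

      EX = Namespace("http://example.org/geobia#")   # hypothetical namespace
      g = Graph()
      g.bind("ex", EX)
      g.bind("owl", OWL)

      # Farmland as a subclass of a generic geographic object.
      g.add((EX.GeographicObject, RDF.type, OWL.Class))
      g.add((EX.Farmland, RDF.type, OWL.Class))
      g.add((EX.Farmland, RDFS.subClassOf, EX.GeographicObject))
      g.add((EX.Farmland, RDFS.comment,
             Literal("Cultivated land: high NDVI, regular parcel shape")))

      # A feature attribute that decision rules can test against.
      g.add((EX.hasMeanNDVI, RDF.type, OWL.DatatypeProperty))
      g.add((EX.hasMeanNDVI, RDFS.domain, EX.GeographicObject))

      print(g.serialize(format="turtle"))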

  2. Web of Objects Based Ambient Assisted Living Framework for Emergency Psychiatric State Prediction

    PubMed Central

    Alam, Md Golam Rabiul; Abedin, Sarder Fakhrul; Al Ameen, Moshaddique; Hong, Choong Seon

    2016-01-01

    Ambient assisted living can facilitate optimum health and wellness by aiding physical, mental and social well-being. In this paper, patients’ psychiatric symptoms are collected through lightweight biosensors and web-based psychiatric screening scales in a smart home environment and then analyzed through machine learning algorithms to provide ambient intelligence in a psychiatric emergency. The psychiatric states are modeled through a Hidden Markov Model (HMM), and the model parameters are estimated using a Viterbi path counting and scalable Stochastic Variational Inference (SVI)-based training algorithm. The most likely psychiatric state sequence of the corresponding observation sequence is determined, and an emergency psychiatric state is predicted through the proposed algorithm. Moreover, to enable personalized psychiatric emergency care, a web of objects-based service framework is proposed for a smart-home environment. In this framework, the biosensor observations and the psychiatric rating scales are objectified and virtualized in the web space. Then, the web of objects of sensor observations and psychiatric rating scores are used to assess the dweller’s mental health status and to predict an emergency psychiatric state. The proposed psychiatric state prediction algorithm reported 83.03 percent prediction accuracy in an empirical performance study. PMID:27608023
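
    The state-sequence step rests on the standard Viterbi recursion, sketched below in log space with NumPy. The three-state matrices are toy values for illustration; in the paper the parameters come from the Viterbi-path-counting and SVI training, not from hand-set numbers.

      import numpy as np

      def viterbi(obs, pi, A, B):
          """Most likely hidden-state path: pi (S,), A (S,S), B (S,O)."""
          S, T = len(pi), len(obs)
          logd = np.log(pi) + np.log(B[:, obs[0]])
          back = np.zeros((T, S), dtype=int)
          for t in range(1, T):
              scores = logd[:, None] + np.log(A)   # prev state x next state
              back[t] = scores.argmax(axis=0)
              logd = scores.max(axis=0) + np.log(B[:, obs[t]])
          path = [int(logd.argmax())]
          for t in range(T - 1, 0, -1):
              path.append(int(back[t][path[-1]]))
          return path[::-1]

      # Toy example: stable / at-risk / emergency states, 3 score bins.
      pi = np.array([0.80, 0.15, 0.05])
      A = np.array([[0.90, 0.08, 0.02],
                    [0.20, 0.70, 0.10],
                    [0.05, 0.25, 0.70]])
      B = np.array([[0.7, 0.2, 0.1],
                    [0.3, 0.5, 0.2],
                    [0.1, 0.3, 0.6]])
      print(viterbi([0, 1, 2, 2], pi, A, B))   # most probable state path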

  4. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    NASA Astrophysics Data System (ADS)

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.

  5. A building extraction approach for Airborne Laser Scanner data utilizing the Object Based Image Analysis paradigm

    NASA Astrophysics Data System (ADS)

    Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas

    2016-10-01

    In the past two decades, Object-Based Image Analysis (OBIA) has established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image sources such as Airborne Laser Scanner (ALS) point clouds. ALS data are represented in the form of a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We rasterized the ALS data into a height raster to generate a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. Using class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). To demonstrate adaptation-free transferability to another data set, the algorithm was applied "as is" to the ISPRS benchmarking data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy above 80%). The very high performance within the ISPRS benchmark without any modification of the algorithm or adaptation of parameters is particularly noteworthy.
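
    The height-based core of such a workflow is easy to sketch once DSM and DEM rasters exist. In the fragment below, the 2.5 m height threshold and 50-cell minimum size are invented illustrative values, and a real workflow would additionally separate trees from buildings using the multi-return point statistics mentioned above.

      import numpy as np
      from scipy import ndimage

      def building_candidates(dsm, dem, min_height=2.5, min_cells=50):
          """Boolean raster of elevated regions large enough to be roofs."""
          ndsm = dsm - dem                   # height above terrain (nDSM)
          mask = ndsm > min_height           # elevated objects: roofs, trees
          labels, n = ndimage.label(mask)    # connected elevated regions
          sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
          keep_ids = np.flatnonzero(sizes >= min_cells) + 1
          return np.isin(labels, keep_ids)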

  6. An object-based assessment of the High Resolution Rapid Refresh Model over the western US

    NASA Astrophysics Data System (ADS)

    Bytheway, J. L.; Kummerow, C. D.

    2015-12-01

    The High Resolution Rapid Refresh (HRRR) model is an hourly-updated, radar-initialized, 3-km storm-resolving model run operationally in the US since September 2014, with the main goal of improving forecasts of convective precipitation. Here, we identify precipitating objects in the forecast and observed fields in a manner similar to the Method for Object-Based Diagnostic Evaluation (MODE). The identified objects are used to perform an assessment of HRRR model forecasts of warm-season convective precipitation over the complex terrain of the western US (west of 105°W). The constellation of satellites comprising the Global Precipitation Measurement (GPM) mission provides observations of precipitating objects in the domain and can be used to assess not only the quality of the rain/no-rain forecast, but also the model's representation of the microphysical and other environmental properties governing precipitation behavior (e.g., latent heating, vertically integrated liquid, etc.). Special attention will be paid to vertical profiles of liquid and ice water content that can be inferred from the observed reflectivity profiles of the GPM Dual-frequency Precipitation Radar (DPR) and other imaging and sounding instruments. Potential relationships between these properties and the success or failure of the model forecast will be discussed.
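
    Object identification in the MODE spirit (smooth, threshold, label) can be sketched compactly; the smoothing radius and rain-rate threshold below are illustrative choices, not the study's configuration.

      import numpy as np
      from scipy import ndimage

      def precip_objects(rain_field, radius=5, threshold=2.0):
          """Label contiguous precipitation objects in a 2-D rain-rate grid."""
          smooth = ndimage.uniform_filter(rain_field, size=radius)
          labels, n = ndimage.label(smooth >= threshold)
          objects = []
          for i in range(1, n + 1):
              sel = labels == i
              objects.append({"area_px": int(sel.sum()),
                              "centroid": ndimage.center_of_mass(sel),
                              "max_rain": float(rain_field[sel].max())})
          return objects   # forecast and observed lists are then matched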

  7. Object-based spatial attention when objects have sufficient depth cues.

    PubMed

    Takeya, Ryuji; Kasai, Tetsuko

    2015-01-01

    Attention directed to a part of an object tends to obligatorily spread over all of the spatial regions that belong to the object, which may be critical for rapid object-recognition in cluttered visual scenes. Previous studies have generally used simple rectangles as objects and have shown that attention spreading is reflected by amplitude modulation in the posterior N1 component (150-200 ms poststimulus) of event-related potentials, while other interpretations (i.e., rectangular holes) may arise implicitly in early visual processing stages. By using modified Kanizsa-type stimuli that provided less ambiguity of depth ordering, the present study examined early event-related potential spatial-attention effects for connected and separated objects, both of which were perceived in front of (Experiment 1) and in back of (Experiment 2) the surroundings. Typical P1 (100-140 ms) and N1 (150-220 ms) attention effects of ERP in response to unilateral probes were observed in both experiments. Importantly, the P1 attention effect was decreased for connected objects compared to separated objects only in Experiment 1, and the typical object-based modulations of N1 were not observed in either experiment. These results suggest that spatial attention spreads over a figural object at earlier stages of processing than previously indicated, in three-dimensional visual scenes with multiple depth cues. PMID:26382007

  8. Object-based forest classification to facilitate landscape-scale conservation in the Mississippi Alluvial Valley

    USGS Publications Warehouse

    Mitchell, Michael; Wilson, R. Randy; Twedt, Daniel J.; Mini, Anne E.; James, J. Dale

    2016-01-01

    The Mississippi Alluvial Valley is a floodplain along the southern extent of the Mississippi River extending from southern Missouri to the Gulf of Mexico. This area once encompassed nearly 10 million ha of floodplain forests, most of which has been converted to agriculture over the past two centuries. Conservation programs in this region revolve around protection of existing forest and reforestation of converted lands. Therefore, an accurate and up-to-date classification of forest cover is essential for conservation planning, including efforts that prioritize areas for conservation activities. We used object-based image analysis with Random Forest classification to quickly and accurately classify forest cover. We used Landsat band, band ratio, and band index statistics to identify and define similar objects as our training sets instead of selecting individual training points. This provided a single rule-set that was used to classify each of the 11 Landsat 5 Thematic Mapper scenes that encompass the Mississippi Alluvial Valley. We classified 3,307,910±85,344 ha (32% of this region) as forest. Our overall classification accuracy was 96.9%, with a Kappa statistic of 0.96. Because this method of forest classification is rapid and accurate, assessments of forest cover can be regularly updated, and progress toward forest habitat goals identified in conservation plans can be periodically evaluated.
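
    The classification step maps naturally onto a standard Random Forest, as in the scikit-learn sketch below; the feature matrices are random placeholders standing in for the per-object Landsat band, ratio, and index statistics.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      X_train = rng.random((500, 10))       # placeholder object features
      y_train = rng.integers(0, 2, 500)     # 1 = forest, 0 = non-forest

      rf = RandomForestClassifier(n_estimators=500, oob_score=True,
                                  random_state=0)
      rf.fit(X_train, y_train)
      print("OOB accuracy:", rf.oob_score_)  # built-in error estimate

      X_scene = rng.random((2000, 10))      # objects from a full scene
      forest_prob = rf.predict_proba(X_scene)[:, 1]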

  9. Geographic Object-Based Image Analysis – Towards a new paradigm

    PubMed Central

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been, and still is, sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not GEOBIA constitutes a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects and ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and on an analysis of peer-reviewed scientific literature, we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  10. Modeling and query the uncertainty of network constrained moving objects based on RFID data

    NASA Astrophysics Data System (ADS)

    Han, Liang; Xie, Kunqing; Ma, Xiujun; Song, Guojie

    2007-06-01

    The management of network-constrained moving objects is increasingly practical, especially in intelligent transportation systems. In the past, the location information of moving objects on a network was collected by GPS, which is costly and suffers from frequent-update and privacy problems. RFID (Radio Frequency IDentification) devices are used more and more widely to collect location information. They are cheaper, require fewer updates, and intrude less on privacy. They detect the id of an object and the time when the moving object passes a node of the network. They do not detect the object's exact movement inside an edge, which leads to a problem of uncertainty. How to model and query the uncertainty of network-constrained moving objects based on RFID data has therefore become a research issue. In this paper, a model is proposed to describe the uncertainty of network-constrained moving objects. A two-level index is presented to provide efficient access to the network and the movement data. The processing of imprecise time-slice queries and spatio-temporal range queries is studied. The processing includes four steps: spatial filtering, spatial refinement, temporal filtering, and probability calculation. Finally, experiments are conducted on simulated data, in which the performance of the index is studied. The precision and recall of the result set are defined, and how the query arguments affect the precision and recall of the result set is discussed.
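
    The quality measures mentioned at the end reduce to simple set arithmetic over query results, e.g.:

      def precision_recall(returned, relevant):
          """returned: ids reported by the uncertain query; relevant: ids
          that truly satisfy it in the simulated ground truth."""
          hits = returned & relevant
          precision = len(hits) / len(returned) if returned else 1.0
          recall = len(hits) / len(relevant) if relevant else 1.0
          return precision, recall

      print(precision_recall({1, 2, 3, 4}, {2, 3, 5}))   # (0.5, 0.666...)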

  11. Object-based Encoding in Visual Working Memory: Evidence from Memory-driven Attentional Capture.

    PubMed

    Gao, Zaifeng; Yu, Shixian; Zhu, Chengfeng; Shui, Rende; Weng, Xuchu; Li, Peng; Shen, Mowei

    2016-01-01

    Visual working memory (VWM) adopts a specific manner of object-based encoding (OBE) to extract perceptual information: whenever one feature dimension is selected for entry into VWM, the others are also extracted. Most studies revealing OBE have probed an 'irrelevant-change distracting effect', in which changes of irrelevant features dramatically affect performance on the target feature. However, the presence of irrelevant-feature change may itself affect participants' processing manner, leading to a false-positive result. The current study conducted a strict examination of OBE in VWM by probing whether irrelevant features guide the deployment of attention in visual search. Participants memorized an object's colour yet ignored its shape and concurrently performed a visual-search task. They searched for a target line among distractor lines, each embedded within a different object. One object in the search display could match the shape, colour, or both dimensions of the memory item, but this object never contained the target line. Relative to a neutral baseline, where there was no match between the memory and search displays, search time was significantly prolonged in all match conditions, regardless of whether the memory item was displayed for 100 or 1000 ms. These results suggest that the task-irrelevant shape was extracted into VWM, supporting OBE in VWM. PMID:26956084

  13. Use of Remote Imagery and Object-based Image Methods to Count Plants in an Open-field Container Nursery

    NASA Astrophysics Data System (ADS)

    Leiva Lopez, Josue Nahun

    In general, the nursery industry lacks an automated inventory control system. Object-based image analysis (OBIA) software and aerial images could be used to count plants in nurseries. The objectives of this research were: 1) to evaluate the effect of unmanned aerial vehicle (UAV) flight altitude and the canopy separation of container-grown plants on count accuracy using aerial images, and 2) to evaluate the effect of plant canopy shape, presence of flowers, and plant status (living or dead) on the counting accuracy of container-grown plants using remote sensing images. Images were analyzed using Feature Analyst™ (FA) and an algorithm trained using MATLAB®. Total count error, false positives and unidentified plants were recorded from output images using FA; only total count error was reported for the MATLAB algorithm. For objective 1, images were taken at 6, 12 and 22 m above the ground using a UAV. Plants were placed on black fabric and gravel, and spaced as follows: 5 cm between canopy edges, canopy edges touching, and 5 cm of canopy edge overlap. In general, when both methods were considered, total count error was smaller [ranging from -5 (undercount) to 4 (overcount)] when plants were fully separated, with the exception of images taken at 22 m. FA showed a smaller total count error (-2) than MATLAB (-5) for plants placed on black fabric compared with those placed on gravel. For objective 2, the plan was to continue using the UAV; however, due to the unexpected disruption of GPS-based navigation by heightened solar flare activity in 2013, a boom lift that could provide images on a more reliable basis was used. When images obtained using the boom lift were analyzed using FA, there was no difference between the variables measured when an algorithm trained with an image displaying regular or irregular plant canopy shape was applied to images displaying both plant canopy shapes, even though the canopy shape of 'Sea Green' juniper is less compact than 'Plumosa Compacta

  14. How a face may affect object-based attention: evidence from adults and 8-month-old infants

    PubMed Central

    Valenza, Eloisa; Franchin, Laura; Bulf, Hermann

    2013-01-01

    Object-based attention operates on perceptual objects, raising the possibility that the costs and benefits humans pay to move attention between objects might be affected by the nature of the stimuli. The current study reports two experiments with adults and 8-month-old infants investigating whether object-based attention is affected by the type of stimulus (faces vs. non-face stimuli). Using the well-known cueing task developed by Egly et al. (1994) to study the object-based component of attention, in Experiment 1 adult participants were presented with two upright, inverted or scrambled faces, and an eye-tracker measured their saccadic latencies to find a target that could appear on the same object that was just cued or on the other, uncued object. Data showed that an object-based effect (a smaller cost of shifting attention within rather than between objects) occurred only with scrambled faces, not with upright or inverted faces. In Experiment 2 the same task was performed with 8-month-old infants, using upright and inverted faces. Data revealed that an object-based effect emerged only for inverted faces, not for upright faces. Overall, these findings suggest that object-based attention is modulated by the type of stimulus and by the experience the viewer has acquired with different objects. PMID:24723860

  15. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  16. Space Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Xontech, Inc.'s software package, XonVu, simulates the missions of Voyager 1 at Jupiter and Saturn, Voyager 2 at Jupiter, Saturn, Uranus and Neptune, and Giotto in close encounter with Comet Halley. With the program, the user can generate scenes of the planets, moons, stars or Halley's nucleus and tail as seen by Giotto, all graphically reproduced with high accuracy in wireframe representation. Program can be used on a wide range of computers, including PCs. User friendly and interactive, with many options, XonVu can be used by a space novice or a professional astronomer. With a companion user's manual, it sells for $79.

  17. Simulation Software

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Various NASA Small Business Innovation Research grants from Marshall Space Flight Center, Langley Research Center and Ames Research Center were used to develop the 'kernel' of COMCO's modeling and simulation software, the PHLEX finite element code. NASA needed it to model designs of flight vehicles; one of many customized commercial applications is UNISIM, a PHLEX-based code for analyzing underground flows in oil reservoirs for Texaco, Inc. COMCO's products simulate a computational mechanics problem, estimate the solution's error and produce the optimal hp-adapted mesh for the accuracy the user chooses. The system is also used as a research or training tool in universities and in mechanical design in industrial corporations.

  18. Seminar Software

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Society for Computer Simulation International is a professional technical society that distributes information on methodology techniques and uses of computer simulation. The society uses NETS, a NASA-developed program, to assist seminar participants in learning to use neural networks for computer simulation. NETS is a software system modeled after the human brain; it is designed to help scientists exploring artificial intelligence to solve pattern matching problems. Examples from NETS are presented to seminar participants, who can then manipulate, alter or enhance them for their own applications.

  19. Neural activity associated with self, other, and object-based counterfactual thinking.

    PubMed

    De Brigard, Felipe; Spreng, R. Nathan; Mitchell, Jason P.; Schacter, Daniel L.

    2015-04-01

    Previous research has shown that autobiographical episodic counterfactual thinking-i.e., mental simulations about alternative ways in which one's life experiences could have occurred-engages the brain's default network (DN). However, it remains unknown whether or not the DN is also engaged during impersonal counterfactual thoughts, specifically those involving other people or objects. The current study compares brain activity during counterfactual simulations involving the self, others and objects. In addition, counterfactual thoughts involving others were manipulated in terms of similarity and familiarity with the simulated characters. The results indicate greater involvement of DN during person-based (i.e., self and other) as opposed to object-based counterfactual simulations. However, the involvement of different regions of the DN during other-based counterfactual simulations was modulated by how close and/or similar the simulated character was perceived to be by the participant. Simulations involving unfamiliar characters preferentially recruited dorsomedial prefrontal cortex. Simulations involving unfamiliar similar characters, characters with whom participants identified personality traits, recruited lateral temporal gyrus. Finally, our results also revealed differential coupling of right hippocampus with lateral prefrontal and temporal cortex during counterfactual simulations involving familiar similar others, but with left transverse temporal gyrus and medial frontal and inferior temporal gyri during counterfactual simulations involving either oneself or unfamiliar dissimilar others. These results suggest that different brain mechanisms are involved in the simulation of personal and impersonal counterfactual thoughts, and that the extent to which regions associated with autobiographical memory are recruited during the simulation of counterfactuals involving others depends on the perceived similarity and familiarity with the simulated individuals.

  20. An object-based image analysis of pinyon and juniper woodlands treated to reduce fuels.

    PubMed

    Hulet, April; Roundy, Bruce A; Petersen, Steven L; Jensen, Ryan R; Bunting, Stephen C

    2014-03-01

    Mechanical and prescribed fire treatments are commonly used to reduce fuel loads and maintain or restore sagebrush steppe rangelands across the Great Basin where pinyon (Pinus) and juniper (Juniperus) trees are encroaching and infilling. Geospatial technologies, particularly remote sensing, could potentially be used in these ecosystems to (1) evaluate the longevity of fuel reduction treatments, (2) provide data for planning and designing future fuel-reduction treatments, and (3) assess the spatial distribution of horizontal fuel structure following fuel-reduction treatments. High-spatial resolution color-infrared imagery (0.06-m pixels) was acquired for pinyon and juniper woodland plots where fuels were reduced by either prescribed fire, tree cutting, or mastication at five sites in Oregon, California, Nevada, and Utah. Imagery was taken with a Vexcel UltraCam X digital camera in June 2009. Within each treatment plot, ground cover was measured as part of the Sagebrush Steppe Treatment Evaluation Project. Trimble eCognition Developer was used to classify land cover classes using object-based image analysis (OBIA) techniques. Differences between cover estimates using OBIA and ground-measurements were not consistently higher or lower for any land cover class and when evaluated for individual sites, were within ±5 % of each other. The overall accuracy and the K hat statistic for classified thematic maps for each treatment were: prescribed burn 85 % and 0.81; cut and fell 82 % and 0.77, and mastication 84 % and 0.80. Although cover assessments from OBIA differed somewhat from ground measurements, they are sufficiently accurate to evaluate treatment success and for supporting a broad range of management concerns.

  1. Modelling of cooperating robotized systems with the use of object-based approach

    NASA Astrophysics Data System (ADS)

    Foit, K.; Gwiazda, A.; Banas, W.; Sekala, A.; Hryniewicz, P.

    2015-11-01

    Today's robotized manufacturing systems are characterized by high efficiency. The emphasis is placed mainly on the simultaneous work of machines. This can manifest in many ways, the most spectacular being the cooperation of several robots working on the same part. Moreover, dual-arm robots that can mimic the manipulative skills of human hands have recently come into use. As a result, it is often necessary not only to maintain sufficient precision, but also to ensure the coordination and proper sequence of movements of the individual robots' arms. The successful completion of such a task depends on the individual robot control systems and their respective programs, but also on well-functioning communication between robot controllers. A major problem for cooperating robots is the possibility of collision between particular links of the robots' kinematic chains. This is not a simple case, because manufacturers of robotic systems do not disclose the details of their control algorithms, so such situations are hard to predict. Another problem with the cooperation of robots is how to inform the other units about the start or completion of a part of the task, so that the other robots can take further actions. This paper focuses on communication between cooperating robotic units, assuming that every robot is represented by an object-based model. This requires developing a communication protocol that the objects can use to collect information about their environment. The approach presented in the paper is not limited to robots and could be used more widely, for example in modelling a complete workcell or production line.
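
    One way to picture such a protocol is a shared message bus between object-based robot models, as in the toy sketch below; the message fields and robot names are invented for illustration and are not taken from the paper.

      from dataclasses import dataclass
      from queue import Queue

      @dataclass
      class TaskMessage:
          sender: str
          event: str       # "task_started" | "task_completed" | ...
          task_id: int

      class RobotModel:
          def __init__(self, name, bus):
              self.name, self.bus = name, bus
          def finish_task(self, task_id):
              # ... motion execution would happen here ...
              self.bus.put(TaskMessage(self.name, "task_completed", task_id))

      bus = Queue()
      RobotModel("R1", bus).finish_task(7)
      msg = bus.get()
      if msg.event == "task_completed":   # R2 may now start its dependent step
          print(f"{msg.sender} finished task {msg.task_id}")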

  2. Extraction of Benthic Cover Information from Video Tows and Photographs Using Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Estomata, M. T. L.; Blanco, A. C.; Nadaoka, K.; Tomoling, E. C. M.

    2012-07-01

    Mapping benthic cover in deep waters comprises a very small proportion of studies in this field of research. The majority of benthic cover mapping makes use of satellite images, and classification is usually carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method to underwater photos but used different classification methods, such as neural networks and rapid classification via downsampling. In this study, an attempt was made to use accurate bathymetric data obtained with a multi-beam echo sounder (MBES) as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies corrections to the data gathered by the MBES, the accuracy of the depth data was compromised. Nevertheless, even without accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types beyond coral and sand, such as rubble and fish. Through the use of rule sets on area (less than or equal to 700 pixels for fish, and between 700 and 10,000 pixels for rubble), as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps with higher overall accuracy, 93.78±0.85%, compared to pixel-based methods, which had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
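
    The area rules quoted above translate directly into code; only the texture cut-off below is a placeholder, while the pixel ranges are the study's own.

      def classify_benthic_object(area_px, std_dev, texture_cutoff=15.0):
          """Area/texture ruleset: <=700 px -> fish, 700-10,000 px -> rubble."""
          if area_px <= 700 and std_dev > texture_cutoff:
              return "fish"
          if 700 < area_px <= 10000 and std_dev > texture_cutoff:
              return "rubble"
          return "coral_or_sand"   # left to spectral rules in the full ruleset

      print(classify_benthic_object(450, 22.0))   # -> fish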

  4. Coregistration refinement of hyperspectral images and DSM: An object-based approach using spectral information

    NASA Astrophysics Data System (ADS)

    Avbelj, Janja; Iwaszczuk, Dorota; Müller, Rupert; Reinartz, Peter; Stilla, Uwe

    2015-02-01

    For image fusion in remote sensing applications, the georeferencing accuracy achievable from position, attitude, and camera calibration measurements can be insufficient. Thus, image processing techniques should be employed for precise coregistration of images. In this article a method for multimodal object-based coregistration refinement between hyperspectral images (HSI) and digital surface models (DSM) is presented. The method is divided into three parts: object outline detection in HSI and DSM, matching, and determination of transformation parameters. The novelty of our proposed coregistration refinement method is the use of material properties and height information of urban objects from HSI and DSM, respectively. We refer to urban objects as objects which are typical of urban environments, and we focus on buildings by describing them with 2D outlines. Furthermore, the geometric accuracy of these detected building outlines is taken into account in the matching step and in the determination of transformation parameters. Hence, a stochastic model is introduced to compute optimal transformation parameters. The feasibility of the method is shown by testing it on two aerial HSI of different spatial and spectral resolution and two DSM of different spatial resolution. The evaluation is carried out by comparing the accuracies of the transformation parameters to reference parameters determined by considering object outlines at much higher resolution, and also by computing the correctness and quality rate of the extracted outlines before and after coregistration refinement. Results indicate that using outlines of objects instead of only line segments is advantageous for coregistration of HSI and DSM. The extraction of building outlines, in comparison to line cue extraction, provides a larger number of assigned lines between the images and is more robust to outliers, i.e. false matches.

  5. Mapping Eroded Areas on Mountain Grassland with Terrestrial Photogrammetry and Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Mayr, Andreas; Rutzinger, Martin; Bremer, Magnus; Geitner, Clemens

    2016-06-01

    In the Alps, as well as in other mountain regions, steep grassland is frequently affected by shallow erosion. Often small landslides or snow movements displace the vegetation together with soil and/or unconsolidated material. This results in patches of bare earth surface within the grass-covered slope. Close-range and remote sensing techniques are promising for both mapping and monitoring these eroded areas. This is essential for a better geomorphological process understanding, to assess past and recent developments, and to plan mitigation measures. Recent developments in image matching techniques make it feasible to produce high resolution orthophotos and digital elevation models from terrestrial oblique images. In this paper we propose to delineate the boundary of eroded areas for selected scenes of a study area, using close-range photogrammetric data. Striving for an efficient, objective and reproducible workflow for this task, we developed an approach for automated classification of the scenes into the classes grass and eroded. We propose an object-based image analysis (OBIA) workflow which consists of image segmentation and automated threshold selection for classification using the Excess Green Vegetation Index (ExG). The automated workflow was tested with ten different scenes. Compared to a manual classification, grass and eroded areas are classified with an overall accuracy between 90.7% and 95.5%, depending on the scene. The method proved to be insensitive to differences in the illumination of the scenes and the greenness of the grass. The proposed workflow reduces user interaction and is transferable to other study areas. We conclude that close-range photogrammetry is a valuable low-cost tool for mapping this type of eroded area in the field with a high level of detail and quality. In future, the output will be used as ground truth for area-wide mapping of eroded areas in coarser resolution aerial orthophotos acquired at the same time.
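
    The classification core (ExG plus an automatic threshold) fits in a few lines. Otsu's method is used below as a stand-in for the paper's threshold selection, so treat this as a sketch of the idea rather than the exact workflow.

      import numpy as np
      from skimage.filters import threshold_otsu

      def eroded_area_mask(rgb):
          """Grass vs. eroded soil from an orthophoto via Excess Green."""
          rgb = rgb.astype(float)
          total = rgb.sum(axis=2) + 1e-12
          r, g, b = (rgb[..., i] / total for i in range(3))
          exg = 2 * g - r - b              # ExG on chromatic coordinates
          t = threshold_otsu(exg)          # automated threshold selection
          return exg < t                   # low greenness -> eroded surface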

  6. High resolution mapping of development in the wildland-urban interface using object based image extraction

    USGS Publications Warehouse

    Caggiano, Michael D.; Tinkham, Wade T.; Hoffman, Chad; Cheng, Antony S.; Hawbaker, Todd J.

    2016-01-01

    The wildland-urban interface (WUI), the area where human development encroaches on undeveloped land, is expanding throughout the western United States, resulting in increased wildfire risk to homes and communities. Although census-based mapping efforts have provided insights into the pattern of development and expansion of the WUI at regional and national scales, these approaches do not provide sufficient detail for fine-scale fire and emergency management planning, which requires maps of individual building locations. Although fine-scale maps of the WUI have been developed, they are often limited in their spatial extent, have unknown accuracies and biases, and are costly to update over time. In this paper we assess a semi-automated Object-Based Image Analysis (OBIA) approach that utilizes 4-band multispectral National Agriculture Imagery Program (NAIP) imagery for the detection of individual buildings within the WUI. We evaluate this approach by comparing the accuracy and overall quality of extracted buildings to a building-footprint control dataset. In addition, we assess the effects of buffer distance, topographic conditions, and building characteristics on the accuracy and quality of building extraction. The overall accuracy and quality of our approach was positively related to buffer distance, with accuracies ranging from 50 to 95% for buffer distances from 0 to 100 m. Our results also indicate that building detection was sensitive to building size, with smaller outbuildings (footprints less than 75 m2) having detection rates below 80% and larger residential buildings having detection rates above 90%. These findings demonstrate that this approach can successfully identify buildings in the WUI in diverse landscapes, achieving high accuracies at buffer distances appropriate for most fire management applications while overcoming the cost and time constraints associated with traditional approaches. This study is unique in that it evaluates the ability of an OBIA
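
    Buffer-based scoring of extracted buildings against reference footprints can be sketched with shapely; the 50 m default below is an arbitrary example value within the 0-100 m range studied.

      from shapely.geometry import Point

      def detection_stats(detected_pts, reference_footprints, buffer_m=50.0):
          """Count a detection as correct if it lies within buffer_m of any
          reference footprint (geometries in a projected, metric CRS)."""
          zones = [fp.buffer(buffer_m) for fp in reference_footprints]
          tp = sum(any(z.contains(Point(p)) for z in zones)
                   for p in detected_pts)
          return {"true_pos": tp,
                  "false_pos": len(detected_pts) - tp,
                  "precision": tp / len(detected_pts) if detected_pts else 1.0}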

  7. A Region-Based Multi-Scale Approach for Object-Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Kavzoglu, T.; Yildiz Erdemir, M.; Tonbul, H.

    2016-06-01

    Within the last two decades, object-based image analysis (OBIA), which considers objects (i.e. groups of pixels) instead of pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating an optimal scale parameter, which depends on image resolution, image object size and the characteristics of the study area, is crucially important for increasing classification accuracy. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest-neighbour classifier was applied in all segmentation experiments, and equal numbers of pixels were randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). Comparison of region-based and image-based segmentation was carried out on the classified images; region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
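
    The LV-RoC diagnostic itself is a one-liner once the local variance series exists; the sketch below follows the published ESP formulation (percentage rate of change of LV between consecutive scales) and flags its local maxima as candidate scales.

      import numpy as np

      def lv_roc(scales, local_variance):
          """local_variance[i]: mean object standard deviation at scales[i]."""
          lv = np.asarray(local_variance, float)
          roc = 100.0 * (lv[1:] - lv[:-1]) / lv[:-1]   # percent change
          peaks = [scales[i + 1] for i in range(1, len(roc))
                   if roc[i] > roc[i - 1]
                   and (i + 1 == len(roc) or roc[i] > roc[i + 1])]
          return roc, peaks   # peaks: fine/moderate/coarse scale candidates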

  8. Object-Based Change Detection Using High-Resolution Remotely Sensed Data and GIS

    NASA Astrophysics Data System (ADS)

    Sofina, N.; Ehlers, M.

    2012-08-01

    High resolution remotely sensed images provide current, detailed, and accurate information for large areas of the Earth's surface which can be used for change detection analyses. Conventional methods of image processing permit detection of changes by comparing remotely sensed multitemporal images. However, for a successful analysis it is desirable to take images from the same sensor, acquired at the same time of season, at the same time of day, and - for electro-optical sensors - in cloudless conditions. Thus, a change detection analysis can be problematic, especially for sudden catastrophic events. A promising alternative is the use of vector-based maps containing information about the original urban layout, which can be related to a single image obtained after the catastrophe. The paper describes a methodology for an object-based search for destroyed buildings in the wake of a natural or man-made catastrophe (e.g., earthquakes, flooding, civil war). The analysis is based on remotely sensed and vector GIS data. It includes three main steps: (i) generation of features describing the state of buildings; (ii) classification of building conditions; and (iii) data import into a GIS. One of the proposed features is the newly developed 'Detected Part of Contour' (DPC). Additionally, several features based on the analysis of textural information corresponding to the investigated vector objects are calculated. The method is applied to remotely sensed images of areas that have been subjected to an earthquake. The results show the high reliability of the DPC feature as an indicator of change.
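
    A simplified reading of the DPC feature is the share of a building's rasterized outline that coincides with detected image edges; the Canny detector and the two-pixel tolerance below are our stand-ins for whatever edge evidence the original method uses.

      import numpy as np
      from scipy import ndimage
      from skimage import feature

      def detected_part_of_contour(gray_image, contour_mask, tol_px=2):
          """Fraction of the vector outline supported by image edges."""
          edges = feature.canny(gray_image)
          edges = ndimage.binary_dilation(edges, iterations=tol_px)
          n_contour = contour_mask.sum()
          if n_contour == 0:
              return 0.0
          return float((edges & contour_mask).sum()) / float(n_contour)

      # Intact buildings keep most of their outline (DPC near 1);
      # collapsed buildings lose it (DPC near 0).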

  9. An Object-Based Method for Estimation of River Discharge from Remotely-Sensed Imagery

    NASA Astrophysics Data System (ADS)

    Burgett, D. A.; Blesius, L.; Davis, J. D.

    2010-12-01

    High resolution satellite and aerial imagery of fluvial systems contains much information about planform river channel features. However, not much is known about how these forms are quantitatively related to river channel processes, specifically discharge. This research explores methods for remote image-based river discharge estimation through object-based image analysis (OBIA) and GIS techniques. Previous efforts in image-based discharge estimation have relied primarily on manual delineation of river features and the input of reach-averaged values of these features into statistically based models for estimation. In addition to analyzing OBIA techniques for channel feature delineation and measurement, this approach investigates techniques of discharge estimation model design, validation, and correction along a reach, utilizing variation in “standard” channel features (e.g. water surface width) along with less tangible channel feature metrics derived from OBIA. Rather than predefine the channel unit of analysis, this work also considers the accuracy of model parameters derived from a range of channel scales, from longer reach-averaged to cross-sectional. High resolution (1 m) color-infrared orthoimagery from the 2005 and 2009 National Agriculture Imagery Program (NAIP) covering 50 river reaches (ranging in discharge from approximately 13 m3s-1 to 856 m3s-1) was utilized for this analysis. These reaches, all near United States Geological Survey (USGS) river gages in California, USA, were split randomly and evenly into 25 reaches each for model design and validation. This approach allows better isolation of error resulting from user bias in channel feature measurement, and moves toward a more precise, standardized system of measurement for remotely observable channel form.
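
    A statistically based width-discharge model of the kind described is typically a hydraulic-geometry power law fitted in log space; the numbers below are synthetic placeholders for the 25 model-design reaches.

      import numpy as np

      W = np.array([22.0, 35.0, 48.0, 80.0, 120.0])   # widths (m), synthetic
      Q = np.array([13.0, 40.0, 95.0, 310.0, 856.0])  # discharges (m3/s)

      b, log_a = np.polyfit(np.log(W), np.log(Q), 1)  # fit Q = a * W**b
      a = np.exp(log_a)
      print(f"Q ~= {a:.3f} * W^{b:.2f}")

      def estimate_discharge(width_m):
          """Apply the fitted law to widths measured from imagery via OBIA."""
          return a * width_m ** b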

  11. Expert System Software Assistant for Payload Operations

    NASA Technical Reports Server (NTRS)

    Rogers, Mark N.

    1997-01-01

    The broad objective of this expert system software based application was to demonstrate the enhancements and cost savings that can be achieved through expert system software utilization in a spacecraft ground control center. Spacelab provided a valuable proving ground for this advanced software technology; a technology that will be exploited and expanded for future ISS operations. Our specific focus was on demonstrating payload cadre command and control efficiency improvements through the use of "smart" software which monitors flight telemetry, provides enhanced schematic-based data visualization, and performs advanced engineering data analysis.

  12. Software system safety

    NASA Technical Reports Server (NTRS)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  13. Object-based Image Classification of Arctic Sea Ice and Melt Ponds through Aerial Photos

    NASA Astrophysics Data System (ADS)

    Miao, X.; Xie, H.; Li, Z.; Lei, R.

    2013-12-01

    The last six years have marked the lowest Arctic summer sea ice extents in the modern era, with a new record summer minimum (3.4 million km2) set on 13 September 2012. It has been predicted that the Arctic could be free of summer ice within the next 25-30 years. The loss of Arctic summer ice could have serious consequences, such as higher water temperatures due to the positive albedo feedback, more powerful and frequent storms, rising sea levels, diminished habitats for polar animals, and more pollution due to fossil fuel exploitation and/or increased traffic through the Northwest/Northeast Passage. In these processes, melt ponds play an important role in Earth's radiation balance since they strongly absorb solar radiation rather than reflecting it as snow and ice do. Therefore, it is necessary to develop the ability to predict sea ice and melt pond extents and their space-time evolution, which is pivotal for preparing for the variation and uncertainty of future environmental, political, economic, and military needs. Considerable effort has gone into Arctic sea ice modeling to simulate sea ice processes. However, these sea ice models were initiated and developed based on limited field surveys, aircraft, or satellite image data. Therefore, it is necessary to collect high resolution sea ice aerial photos in a systematic way to tune, validate, and improve models. Currently many sea ice aerial photos are available, such as those from the Chinese Arctic Exploration missions (CHINARE 2008, 2010, 2012), SHEBA 1998, and HOTRAX 2005. However, manual delineation of sea ice and melt ponds from these images is time-consuming and labor-intensive. In this study, we use an object-based remote sensing classification scheme to extract sea ice and melt ponds efficiently from 1,727 aerial photos taken during CHINARE 2010. The algorithm includes three major steps as follows. (1) Image segmentation groups the neighboring pixels into objects according to the similarity of spectral and texture
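
    The segment-then-classify workflow described above can be sketched with open-source tools. The following Python fragment uses scikit-image's felzenszwalb segmentation followed by simple rule-based labeling; the segmentation parameters and color thresholds are assumptions for illustration, not the study's algorithm.

      import numpy as np
      from skimage.segmentation import felzenszwalb

      def classify_sea_ice(rgb):
          """Toy OBIA pass: segment an aerial photo (RGB, values in [0, 1])
          into objects, then label each object by its mean color."""
          # Step 1: group neighboring pixels into objects by similarity.
          segments = felzenszwalb(rgb, scale=100, sigma=0.8, min_size=50)
          labels = {}
          for seg_id in np.unique(segments):
              r, g, b = rgb[segments == seg_id].mean(axis=0)
              # Step 2: rule-based object labels (hypothetical thresholds).
              if r > 0.8 and g > 0.8 and b > 0.8:
                  labels[seg_id] = "ice"         # bright, near-white objects
              elif b > r and b > 0.4:
                  labels[seg_id] = "melt pond"   # blue-dominated objects
              else:
                  labels[seg_id] = "open water"  # dark objects
          return segments, labels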

  14. Archetyping: A software generation and management methodology

    NASA Technical Reports Server (NTRS)

    Rothman, Hugh B.; Przybylinski, Stanley M.

    1988-01-01

    Many knowledge based software generation methods have been proposed to improve software quality and programmer productivity. Several government and industry initiatives have focused on software reusability as one solution to these problems. DARTS (trademark), a General Dynamics proprietary symbolic processing technology, provides a unique solution to the reuse problem: archetyping. Archetyping is the embedding of high order language statements in text files. An advanced macroprocessor uses the text files to generate new versions of complex software systems. A DARTS program, the Software Generation and Configuration Management (SGCM) System, automates the archetyping process and maintenance cycle. The DARTS technology is briefly discussed, archetyping is described, and the SGCM system is presented in detail.
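
    DARTS itself is proprietary, but the core idea of archetyping, expanding parameterized statements embedded in text files into concrete source versions, can be sketched with a generic macroprocessor. The Python fragment below uses string.Template as a stand-in; the template fields and generated C code are purely illustrative.

      from string import Template

      # An "archetype": source text with embedded parameters that a
      # macroprocessor expands into concrete versions of the software.
      # string.Template stands in for the proprietary DARTS macroprocessor.
      archetype = Template("""\
      void ${name}_filter(double *buf, int n) {
          for (int i = 0; i < n; i++)
              buf[i] *= ${gain};
      }
      """)

      # Generate two configuration-specific versions from one archetype.
      for cfg in ({"name": "radar", "gain": "0.5"},
                  {"name": "sonar", "gain": "2.0"}):
          print(archetype.substitute(cfg))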

  15. A Mathematics Software Database Update.

    ERIC Educational Resources Information Center

    Cunningham, R. S.; Smith, David A.

    1987-01-01

    Contains an update of an earlier listing of software for mathematics instruction at the college level. Topics are: advanced mathematics, algebra, calculus, differential equations, discrete mathematics, equation solving, general mathematics, geometry, linear and matrix algebra, logic, statistics and probability, and trigonometry. (PK)

  16. A Multidimensional Software Engineering Course

    ERIC Educational Resources Information Center

    Barzilay, O.; Hazzan, O.; Yehudai, A.

    2009-01-01

    Software engineering (SE) is a multidimensional field that involves activities in various areas and disciplines, such as computer science, project management, and system engineering. Though modern SE curricula include designated courses that address these various subjects, an advanced summary course that synthesizes them is still missing. Such a…

  17. Geologic Sequestration Software Suite

    2013-11-04

    GS3 is the bundling of the Geological Sequestration Software Suite domain tools with the Velo wiki user interface, rich client interface, and data store. Velo is an application domain independent collaborative user environment for modeling and simulation. Velo has a web browser based wiki interface integrated with a sophisticated content management system supporting data and knowledge management required for large-scale scientific modeling projects. GS3 adds tools and capability specifically in the area of modeling subsurface reservoirs for the purpose of carbon sequestration. Velo is a core software framework to create scientific domain user environments. Velo is not tied to a specific domain although it provides novel capability needed by many application areas. A well-defined Velo integration layer allows custom applications such as GS3 to leverage the core Velo components to reduce development cost/time and ultimately provide a more capable software product. Compared with previous efforts like ECCE and SALSSA, Velo is a major advancement being a web browser based interface, having a more comprehensive data management architecture, and having intrinsic support for collaboration through the wiki. GS3 adds specific domain tools for looking at site data, developing conceptual and numerical models, building simulation input files, launching and monitoring the progress of those simulations and being able to look at and interpret simulation output.

  18. Geologic Sequestration Software Suite

    SciTech Connect

    Black, Gary; Bonneville, Alain; Sivaramakrishnan, Chandrika; Purohit, Sumit; White, Signe; Lansing, Carina; Gosink, Luke; Guillen, Zoe; Moeglein, William; Gorton, Ian

    2013-11-04

    GS3 is the bundling of the Geological Sequestration Software Suite domain tools with the Velo wiki user interface, rich client interface, and data store. Velo is an application domain independent collaborative user environment for modeling and simulation. Velo has a web browser based wiki interface integrated with a sophisticated content management system supporting data and knowledge management required for large-scale scientific modeling projects. GS3 adds tools and capability specifically in the area of modeling subsurface reservoirs for the purpose of carbon sequestration. Velo is a core software framework to create scientific domain user environments. Velo is not tied to a specific domain although it provides novel capability needed by many application areas. A well-defined Velo integration layer allows custom applications such as GS3 to leverage the core Velo components to reduce development cost/time and ultimately provide a more capable software product. Compared with previous efforts like ECCE and SALSSA, Velo is a major advancement being a web browser based interface, having a more comprehensive data management architecture, and having intrinsic support for collaboration through the wiki. GS3 adds specific domain tools for looking at site data, developing conceptual and numerical models, building simulation input files, launching and monitoring the progress of those simulations and being able to look at and interpret simulation output.

  19. Moving object-based depth map estimation using relabeling and hybrid matching

    NASA Astrophysics Data System (ADS)

    Kim, Tae-Woo; Shin, Jitae; Oh, Byung Tae

    2014-03-01

    Schemes to enhance human visual perception in three-dimensional (3-D) video applications with depth map data are proposed. Depth estimation is an important part of free viewpoint television and 3-DTV because the accuracy of depth information directly affects the synthesized video quality at an intermediate viewpoint. However, generating an accurate depth map is a complex computational process that makes real-time implementation challenging. In order to obtain accurate depth information with low complexity, a depth map relabeling algorithm and a hybrid matching algorithm are proposed for the depth estimation step. These depth map acquisition techniques are based on human perception, which is more sensitive to moving objects than to a static background, and they account for the importance of appropriate processing of object boundaries. Experimental results demonstrate that the proposed schemes provide a synthesized view with both higher subjective visual quality and better objective quality in terms of peak signal-to-noise ratio than legacy depth estimation reference software.

  20. Automated Digital Image Analysis of islet cell mass using Nikon's inverted Eclipse Ti microscope and software to improve engraftment may help to advance the therapeutic efficacy and accessibility of islet transplantation across centers.

    PubMed

    Gmyr, Valery; Bonner, Caroline; Lukowiak, Bruno; Pawlowski, Valerie; Dellaleau, Nathalie; Belaich, Sandrine; Aluka, Isanga; Moermann, Ericka; Thevenet, Julien; Ezzouaoui, Rimed; Queniat, Gurvan; Pattou, Francois; Kerr-Conte, Julie

    2013-04-29

    Reliable assessment of islet viability, mass, and purity must be met prior to transplanting an islet preparation into patients with type 1 diabetes. The standard method for quantifying human islet preparations is by direct microscopic analysis of dithizone-stained islet samples, but this technique may be susceptible to inter-/intra-observer variability, which may induce false positive/negative islet counts. Here we describe a simple, reliable, automated digital image analysis (ADIA) technique for accurately quantifying islets into total islet number, islet equivalent number (IEQ), and islet purity before islet transplantation. Islets were isolated and purified from n=42 human pancreata according to the automated method of Ricordi et al. For each preparation, three islet samples were stained with dithizone and expressed as IEQ number. Islets were analyzed manually by microscopy or automatically quantified using Nikon's inverted Eclipse Ti microscope with built-in NIS-Elements Advanced Research (AR) software. The ADIA method significantly enhanced the number of islet preparations eligible for engraftment compared to the standard manual method (P<0.001). Comparisons of individual methods showed good correlations between mean values of IEQ number (r2≤0.91) and total islet number (r2=0.88), which increased to r2=0.93 when islet surface area was estimated comparatively with IEQ number. The ADIA method showed very high intra-observer reproducibility compared to the standard manual method (P<0.001). However, islet purity was routinely estimated as significantly higher with the manual method than with the ADIA method (P<0.001). The ADIA method also detected small islets between 10-50 μm in size. Automated digital image analysis utilizing the Nikon Instruments (Nikon) software is an unbiased, simple, and reliable teaching tool to comprehensively assess the individual size of each islet cell preparation prior to transplantation. Implementation of
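
    The IEQ metric above normalizes total islet volume to a standard islet of 150 μm diameter. A minimal sketch of the common volume-based conversion follows; the exact computation performed by the NIS-Elements software may differ.

      def islet_equivalents(diameters_um):
          """Convert measured islet diameters (micrometers) into islet
          equivalents (IEQ) by normalizing islet volume to a standard
          150-um islet. Volume scales with the cube of the diameter, so
          each islet contributes (d / 150)**3 IEQ. A common convention;
          the software's exact method may differ."""
          return sum((d / 150.0) ** 3 for d in diameters_um)

      # Three 150-um islets contribute 3.0 IEQ; one 300-um islet alone
      # contributes 8.0 IEQ, reflecting its eightfold volume.
      print(islet_equivalents([150, 150, 150]))  # 3.0
      print(islet_equivalents([300]))            # 8.0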

  1. Choosing Software for Children.

    ERIC Educational Resources Information Center

    Spencer, Mima

    This Digest points out characteristics of quality computer software for children, describes different kinds of software, and suggests ways to get software for preview. The need to consider the purpose for which the software is to be used and the degree to which the software meets its stated goals is noted. Desirable software characteristics and…

  2. Antenna Controller Replacement Software

    NASA Technical Reports Server (NTRS)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; Wert, Michael; Leung, Patrick

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that present a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
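
    The conical-scan step described above admits a compact numerical core: the received power, sampled around a circle about the boresight, is demodulated into a pointing-error vector. The Python sketch below illustrates one such update; the first-harmonic fit and loop gain are assumptions for illustration, not the flight ACR algorithm.

      import math

      def conscan_correction(samples, radius_deg, gain=0.5):
          """One conical-scan update. samples: list of (scan_angle_rad,
          received_power) pairs taken around a circle of radius_deg about
          the boresight. Returns (d_az, d_el) pointing corrections in
          degrees. Fits the first harmonic of the power modulation, whose
          phase points toward the target; illustrative only."""
          n = len(samples)
          # Correlate received power with cos/sin of the scan angle.
          c = sum(p * math.cos(a) for a, p in samples) * 2.0 / n
          s = sum(p * math.sin(a) for a, p in samples) * 2.0 / n
          mean_p = sum(p for _, p in samples) / n
          # Error vector normalized by mean power, scaled by loop gain.
          return gain * radius_deg * c / mean_p, gain * radius_deg * s / mean_p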

  3. Online Assistants in Children's Hypermedia Software

    ERIC Educational Resources Information Center

    Garcia, Penny Ann

    2002-01-01

    The classroom teacher's comfort and familiarity with computers and software influences student-computer use in the classroom. Teachers remain mired in repetitive introduction of basic software mechanics and rarely progress with students to advanced concepts or complex applications. An Online Assistant (OLA) was developed to accompany the…

  4. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  5. Application of Object Based Image Analysis (OBIA) in detecting and quantifying forest loss caused by artisanal gold mining activities in Upper Mazaruni River Basin, Guyana

    NASA Astrophysics Data System (ADS)

    Mengisteab, B. S.; Blesius, L.; Hennessy, L.

    2014-12-01

    Artisanal gold mining in Guyana is mostly done in forest covered areas, causing forest loss and land degradation. Data from the Guyana Geology and Mining Commission show a 2,074% increase between 1986 and 2009. Our analysis of Landsat data between 1986 and 2013 for a part of the Upper Mazaruni area shows an increase from 34.2 to 278.4 hectares, which amounts to more than 800%. While the frequent coverage of Landsat data is useful for multitemporal studies, its lower resolution may not be adequate for accurate detection of mining sites. Therefore, RapidEye imagery from 2011 at a resolution of 5 m was used to detect gold mining activity and to compare the results with the Landsat study. Processing was conducted in eCognition, an object-based image analysis (OBIA) software package. OBIA is an image processing technique that has proven advantageous over traditional pixel-based techniques, primarily through its ability to combine spatial and spectral information. The satellite image was subjected to segmentation at multiple scales and classified using fuzzy sets of membership functions. The classification explicitly incorporated the different scales in order to accommodate different sizes of real-world objects, and spatial relationships were utilized to establish connections between related objects. For example, the presence or absence of water in pits, or the existence of sediments in the river, may serve as additional indicators of mining sites beyond the spectral components. Preliminary results show that the OBIA approach was able to successfully detect and quantify small-scale mining activities in the basin, and that the Landsat data gave an acceptable estimate of mining sites over time. Keywords: Object-Based Image Analysis, Gold Mining, Remote Sensing, Guyana
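
    Fuzzy membership classification of the kind eCognition applies can be sketched compactly: each class combines membership functions over object features. In the Python fragment below, the features and thresholds for a hypothetical "mining pit" class are illustrative assumptions, not the study's rule set.

      def trapezoid(x, a, b, c, d):
          """Trapezoidal fuzzy membership: 0 below a, rising to 1 at b,
          staying 1 through c, falling to 0 at d."""
          if x <= a or x >= d:
              return 0.0
          if b <= x <= c:
              return 1.0
          return (x - a) / (b - a) if x < b else (d - x) / (d - c)

      def mining_pit_membership(obj):
          """Combine fuzzy memberships over object features, in the spirit
          of eCognition class descriptions; thresholds are hypothetical."""
          bright = trapezoid(obj["mean_red"], 0.2, 0.4, 0.8, 0.9)   # exposed soil
          low_ndvi = trapezoid(obj["ndvi"], -1.0, -1.0, 0.1, 0.3)   # little vegetation
          near_water = trapezoid(obj["dist_to_water_m"], -1, -1, 50, 200)
          # Fuzzy AND via the minimum of the individual memberships.
          return min(bright, low_ndvi, near_water)

      print(mining_pit_membership({"mean_red": 0.6, "ndvi": 0.05,
                                   "dist_to_water_m": 30}))  # -> 1.0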

  6. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  7. Software Model Of Software-Development Process

    NASA Technical Reports Server (NTRS)

    Lin, Chi Y.; Synott, Debra J.; Levary, Reuven R.

    1990-01-01

    Collection of computer programs constitutes software tool for simulation of medium- to large-scale software-development projects. Necessary to include easily identifiable and more-readily quantifiable characteristics like costs, times, and numbers of errors. Mathematical model incorporating these and other factors of dynamics of software-development process implemented in the Software Life Cycle Simulator (SLICS) computer program. Simulates dynamics of software-development process. In combination with input and output expert software systems and knowledge-based management software system, develops information for use in managing large software-development project. Intended to aid managers in planning, managing, and controlling software-development processes by reducing uncertainties in budgets, required personnel, and schedules.

  8. Hierarchical object-based classification of ultra-high-resolution digital mapping camera (DMC) imagery for rangeland mapping and assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Ultra high resolution digital aerial photography has great potential to complement or replace ground measurements of vegetation cover for rangeland monitoring and assessment. We investigated object-based image analysis (OBIA) techniques for classifying vegetation in southwestern U.S. arid rangelands...

  9. Object-Based Visual Attention in 8-Month-Old Infants: Evidence from an Eye-Tracking Study

    ERIC Educational Resources Information Center

    Bulf, Hermann; Valenza, Eloisa

    2013-01-01

    Visual attention is one of the infant's primary tools for gathering relevant information from the environment for further processing and learning. The space-based component of visual attention in infants has been widely investigated; however, the object-based component of visual attention has received scarce interest. This scarcity is…

  10. Ober: A Program for Objective Based Evaluation in Reading at the Newport-Mesa Unified School District.

    ERIC Educational Resources Information Center

    Otto, Robert C.

    OBER is the acronym coined by the Newport-Mesa Unified School District to describe its utilization of the System for Objective Based Evaluation-Reading (SOBE-R) developed by the Center for the Study of Evaluation at UCLA. OBER consists of three major components that are designed to facilitate the development and evaluation of the reading programs…

  11. Object-based Evaluation of Satellite Precipitation Retrievals: A Case Study of the Summer Season over CONUS

    NASA Astrophysics Data System (ADS)

    Li, J.; Xu, P.

    2015-12-01

    Satellite precipitation retrievals that have high spatial and temporal resolutions are suitable for various applications, such as hydrologic modeling and watershed management. Many validation studies have been conducted to understand the strengths and limitations of these satellite precipitation retrievals. In this study, an object-based validation approach is adopted to evaluate several satellite precipitation retrievals, focusing on the spatial and geometric patterns of precipitation. This object-based validation approach identifies precipitation objects using an image processing technique referred to as the watershed transform. Several object attributes are diagnosed and analyzed based on distance measurements. Three object-based verification scores are summarized to determine the overall performances of satellite precipitation retrievals. The Integrated Multi-satellitE Retrievals for GPM (IMERG) and Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) were evaluated using the object-based approach. The NOAA Stage IV MPE multi-sensor composite rain analysis was utilized as the ground observation. The comparative assessments were conducted at 0.25° by 0.25° on a daily scale for the summer season of 2014 over the continental United States (CONUS). The results suggest that IMERG reproduces the spatial pattern of local-scale precipitation areas found in the Stage IV observations. In addition, IMERG depicts the sizes and locations of precipitation areas more accurately when compared against Stage IV.
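
    The watershed transform used to identify precipitation objects is available in standard image-processing libraries. The Python sketch below shows one plausible implementation of that step; the rain threshold, smoothing, and peak spacing are illustrative choices, not the study's settings.

      import numpy as np
      from scipy import ndimage
      from skimage.feature import peak_local_max
      from skimage.segmentation import watershed

      def precipitation_objects(rain, threshold=1.0):
          """Identify precipitation objects in a 2-D rain field (mm/day)
          with the watershed transform; parameters are illustrative."""
          wet = rain > threshold                     # raining pixels
          smooth = ndimage.gaussian_filter(rain, 1)  # suppress noise maxima
          # Local rain maxima seed the basins (one object per seed).
          peaks = peak_local_max(smooth, min_distance=3, labels=wet.astype(int))
          markers = np.zeros(rain.shape, dtype=int)
          markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
          # Flood the inverted field so basins grow out from rain maxima.
          return watershed(-smooth, markers, mask=wet)  # 0 = dry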

  12. Same Old Story: The Problem of Object-Based Thinking as a Basis for Teaching Distant Places

    ERIC Educational Resources Information Center

    Martin, Fran

    2013-01-01

    The English Geography National Curriculum encourages primary teachers to focus on similarities and differences when teaching distant places. The issues this raises are particularly acute when teaching geography in the context of the Global South. In this article I argue that comparisons based on object-based thinking can lead to views of the…

  13. RELAP-7 Software Verification and Validation Plan

    SciTech Connect

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  14. An object-based image analysis approach for aquaculture ponds precise mapping and monitoring: a case study of Tam Giang-Cau Hai Lagoon, Vietnam.

    PubMed

    Virdis, Salvatore Gonario Pasquale

    2014-01-01

    Monitoring and mapping shrimp farms, including their impact on land cover and land use, is critical to the sustainable management and planning of coastal zones. In this work, a methodology was proposed to set up a cost-effective and reproducible procedure that made use of satellite remote sensing, an object-based classification approach, and open-source software for mapping aquaculture areas with high planimetric and thematic accuracy between 2005 and 2008. The analysis focused on two characteristic areas of interest of the Tam Giang-Cau Hai Lagoon (in central Vietnam), which have farming systems similar to other coastal aquaculture worldwide: the first was primarily characterised by what are locally referred to as "low tide" shrimp ponds, which are partially submerged areas; the second by earthed shrimp ponds, locally referred to as "high tide" ponds, which are non-submerged areas on the lagoon coast. The approach was based on region-growing segmentation of high- and very high-resolution panchromatic images, SPOT5 and Worldview-1, and the unsupervised clustering classifier ISOSEG embedded in the SPRING non-commercial software. The results, whose accuracy was tested with a field-based aquaculture inventory, showed that in favourable situations (high tide shrimp ponds) the classification provided high rates of accuracy (>95 %) through a fully automatic object-based classification. In unfavourable situations (low tide shrimp ponds), the performance degraded due to the low contrast between the water and the pond embankments. In these situations, the automatic results were improved by manual delineation of the embankments. As expected, Worldview-1 showed better thematic accuracy, and precise maps have been realised at a scale of up to 1:2,000. However, SPOT5 provided comparable results in terms of the number of correctly classified ponds, but less accurate results in terms of the precision of mapped features. The procedure also demonstrated high degrees of reproducibility

  15. Software to Manage the Unmanageable

    NASA Technical Reports Server (NTRS)

    2005-01-01

    In 1995, NASA's Jet Propulsion Laboratory (JPL) contracted Redmond, Washington-based Lucidoc Corporation to design a technology infrastructure to automate the intersection between policy management and operations management with advanced software that automates document workflow, document status, and uniformity of document layout. JPL had very specific parameters for the software. It expected to store and catalog over 8,000 technical and procedural documents integrated with hundreds of processes. The project ended in 2000, but NASA still uses the resulting highly secure document management system, and Lucidoc has managed to help other organizations, large and small, with integrating document flow and operations management to ensure a compliance-ready culture.

  16. NASA Software Engineering Benchmarking Study

    NASA Technical Reports Server (NTRS)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5

  17. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  18. Software attribute visualization for high integrity software

    SciTech Connect

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  19. Advanced Concept Modeling

    NASA Technical Reports Server (NTRS)

    Chaput, Armand; Johns, Zachary; Hodges, Todd; Selfridge, Justin; Bevirt, Joeben; Ahuja, Vivek

    2015-01-01

    Advanced Concepts Modeling software validation, analysis, and design. This was a National Institute of Aerospace contract with many components. Efforts ranged from software development and validation for structures and aerodynamics, through flight control development and aeropropulsive analysis, to UAV piloting services.

  20. Global Software Engineering: A Software Process Approach

    NASA Astrophysics Data System (ADS)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  1. Software Vulnerability Taxonomy Consolidation

    SciTech Connect

    Polepeddi, Sriram S.

    2004-12-07

    In today's environment, computers and networks are increasingly exposed to a number of software vulnerabilities. Information about these vulnerabilities is collected and disseminated via various large publicly available databases such as BugTraq, OSVDB and ICAT. Individually, none of these databases covers all aspects of a vulnerability, and they lack a standard format among them, making it difficult for end-users to easily compare various vulnerabilities. A central database of vulnerabilities has not been available until today for a number of reasons, such as the non-uniform methods by which current vulnerability database providers receive information, disagreement over which features of a particular vulnerability are important and how best to present them, and the non-utility of the information presented in many databases. The goal of this software vulnerability taxonomy consolidation project is to address the need for a universally accepted vulnerability taxonomy that classifies vulnerabilities in an unambiguous manner. A consolidated vulnerability database (CVDB) was implemented that coalesces and organizes vulnerability data from disparate data sources. Based on the work done in this paper, there is strong evidence that a consolidated taxonomy encompassing and organizing all relevant data can be achieved. However, three primary obstacles remain: the lack of a common "primary key" for referencing, unstructured and free-form descriptions of necessary vulnerability data, and the lack of data on all aspects of a vulnerability. This work has only considered data that can be unambiguously extracted from various data sources by straightforward parsers. It is felt that even with the use of more advanced information mining tools, which can wade through the sea of unstructured vulnerability data, this current integration methodology would still provide repeatable, unambiguous, and exhaustive results. Though the goal of coalescing all available data, which would be of use to
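
    The coalescing step, mapping records from differently structured sources onto one consolidated schema, can be sketched as a simple field-renaming parser. The source names and field names in the Python fragment below are hypothetical, not the actual BugTraq, OSVDB, or ICAT schemas.

      # Toy normalizer in the spirit of the CVDB: map records from sources
      # with different schemas onto one common layout. Field names are
      # hypothetical placeholders.
      FIELD_MAPS = {
          "sourceA": {"bug_id": "id", "descr": "description", "sev": "severity"},
          "sourceB": {"vuln_no": "id", "summary": "description", "risk": "severity"},
      }

      def normalize(record, source):
          """Rename a source record's fields to the consolidated schema,
          keeping track of where the record came from."""
          mapping = FIELD_MAPS[source]
          out = {common: record[raw]
                 for raw, common in mapping.items() if raw in record}
          out["origin"] = source
          return out

      print(normalize({"bug_id": 101, "descr": "buffer overflow", "sev": "high"},
                      "sourceA"))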

  2. Report: Scientific Software.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1985-01-01

    Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)

  3. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  4. Change detection of built-up land: A framework of combining pixel-based detection and object-based recognition

    NASA Astrophysics Data System (ADS)

    Xiao, Pengfeng; Zhang, Xueliang; Wang, Dongguang; Yuan, Min; Feng, Xuezhi; Kelly, Maggi

    2016-09-01

    This study proposed a new framework that combines pixel-level change detection and object-level recognition to detect changes of built-up land from high-spatial-resolution remote sensing images. First, an adaptive differencing method was designed to detect changes at the pixel level based on both spectral and textural features. Next, the changed pixels were subjected to a set of morphological operations to improve the completeness and to generate changed objects, achieving the transition of change detection from the pixel level to the object level. The changed objects were further recognised through the difference of the morphological building index between the two phases, which indicates changed objects on built-up land. The transformation from changed pixels to changed objects makes the proposed framework distinct from both pixel-based and object-based change detection methods. Compared with pixel-based methods, the proposed framework can improve the change detection capability through the transformation and successive recognition of objects. Compared with object-based methods, the proposed framework avoids the issue of multitemporal segmentation and can generate changed objects directly from changed pixels. The experimental results show the effectiveness of the transformation from changed pixels to changed objects and of the successive object-based recognition in improving the detection accuracy, which justifies the application potential of the proposed change detection framework.
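
    The pixel-to-object transition at the heart of the framework, differencing followed by morphological cleanup and connected-component grouping, can be sketched briefly. The Python fragment below illustrates that transition only (it omits the morphological building index recognition step), and its thresholds are assumptions.

      import numpy as np
      from scipy import ndimage

      def changed_objects(img_t1, img_t2, diff_threshold=0.2, min_pixels=20):
          """Pixel-to-object change detection in the spirit of the proposed
          framework: difference at the pixel level, clean up with
          morphology, then group changed pixels into objects."""
          # Pixel level: absolute difference between the two dates.
          changed = np.abs(img_t2 - img_t1) > diff_threshold
          # Closing then opening improves completeness and removes
          # isolated noise pixels.
          changed = ndimage.binary_closing(changed, iterations=2)
          changed = ndimage.binary_opening(changed, iterations=1)
          # Object level: connected components become candidate objects.
          labels, n = ndimage.label(changed)
          sizes = ndimage.sum(changed, labels, range(1, n + 1))
          for obj_id, size in enumerate(sizes, start=1):
              if size < min_pixels:  # drop tiny, likely spurious objects
                  labels[labels == obj_id] = 0
          return labels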

  5. GENI: Grid Hardware and Software

    SciTech Connect

    2012-01-09

    GENI Project: The 15 projects in ARPA-E’s GENI program, short for “Green Electricity Network Integration,” aim to modernize the way electricity is transmitted in the U.S. through advances in hardware and software for the electric grid. These advances will improve the efficiency and reliability of electricity transmission, increase the amount of renewable energy the grid can utilize, and provide energy suppliers and consumers with greater control over their power flows in order to better manage peak power demand and cost.

  6. Exploring the Process of Adult Computer Software Training Using Andragogy, Situated Cognition, and a Minimalist Approach

    ERIC Educational Resources Information Center

    Hurt, Andrew C.

    2007-01-01

    With technology advances, computer software becomes increasingly difficult to learn. Adults often rely on software training to keep abreast of these changes. Instructor-led software training is frequently used to teach adults new software skills; however there is limited research regarding the best practices in adult computer software training.…

  7. MemAxes Visualization Software

    SciTech Connect

    2014-08-28

    Hardware advancements such as Intel's PEBS and AMD's IBS, as well as software developments such as the perf_event API in Linux, have made it possible to acquire memory access samples annotated with performance information. MemAxes is a visualization and analysis tool for memory access sample data. By mapping the samples to their associated code, variables, node topology, and application dataset, MemAxes provides intuitive views of the data.

  8. Software Configuration Management Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.

  9. Fault-tolerant software for the FTMP

    NASA Technical Reports Server (NTRS)

    Hecht, H.; Hecht, M.

    1984-01-01

    The work reported here provides protection against software failures in the task dispatcher of the FTMP, a particularly critical portion of the system software. Faults in other system modules and application programs can be handled by similar techniques but are not covered in this effort. The goals of the work reported here are: (1) to develop provisions in the software design that will detect and mitigate software failures in the dispatcher portion of the FTMP Executive, and (2) to propose the implementation of specific software reliability measures in other parts of the system. Beyond the specific support to the FTMP project, the work reported here represents a considerable advance in the practical application of the recovery block methodology for fault tolerant software design.
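
    The recovery block methodology mentioned above has a well-known shape: run a primary routine, validate its result with an acceptance test, and fall back to an alternate routine on failure. A minimal Python sketch of the pattern follows; it is an illustration of the general methodology, not the FTMP dispatcher code.

      import math

      def recovery_block(primary, alternate, acceptance_test, *args):
          """Recovery-block pattern: run the primary routine, check its
          result with an acceptance test, and fall back to the alternate
          on failure."""
          try:
              result = primary(*args)
              if acceptance_test(result):
                  return result
          except Exception:
              pass  # a crash in the primary counts as a failed test
          result = alternate(*args)  # simpler, more trusted variant
          if not acceptance_test(result):
              raise RuntimeError("both variants failed the acceptance test")
          return result

      # Example: a fast routine checked by a sanity test, with a more
      # conservative fallback.
      fast = lambda x: 1.0 / math.sqrt(x)
      safe = lambda x: math.pow(x, -0.5)
      ok = lambda r: r > 0 and math.isfinite(r)
      print(recovery_block(fast, safe, ok, 4.0))  # 0.5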

  10. Proceedings, Conference on the Computing Environment for Mathematical Software

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Recent advances in software and hardware technology which make it economical to create computing environments appropriate for specialized applications are addressed. Topics included software tools, FORTRAN standards activity, and features of languages, operating systems, and hardware that are important for the development, testing, and maintenance of mathematical software.

  11. Computer Software for Displaying Map Projections and Comparing Distortions.

    ERIC Educational Resources Information Center

    Wikle, Thomas

    1991-01-01

    Discusses software that educators can use to teach about distortions associated with alternative map projections. Examines the Projection, MicroCAM, World, and Atlas-GIS software programs. Suggests using the software in either introductory or more advanced courses dealing with map design or thematic cartography. Explains how to obtain the…

  12. Software productivity improvement through software engineering technology

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  13. Software distribution using xnetlib

    SciTech Connect

    Dongarra, J.J. |; Rowan, T.H.; Wade, R.C.

    1993-06-01

    Xnetlib is a new tool for software distribution. Whereas its predecessor netlib uses e-mail as the user interface to its large collection of public-domain mathematical software, xnetlib uses an X Window interface and socket-based communication. Xnetlib makes it easy to search through a large distributed collection of software and to retrieve requested software in seconds.

  14. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  15. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  16. Responsibility for unreliable software

    SciTech Connect

    Wahl, N.J.

    1994-12-31

    Unreliable software exposes software developers and distributors to legal risks. Under certain circumstances, the developer and distributor of unreliable software can be sued. To avoid lawsuits, software developers should do the following: determine what the risks are, understand the extent of the risks, and identify ways of avoiding the risks and lessening their consequences. Liability issues associated with unreliable software are explored in this article.

  17. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of the defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  18. Advance Directives and Do Not Resuscitate Orders

    MedlinePlus

    ... a form. Call a lawyer. Use a computer software package for legal documents. Advance directives and living ... you write by yourself or with a computer software package should follow your state laws. You may ...

  19. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  20. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multicore, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approx. 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.
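
    The architecture described, independent threaded blocks joined by pipe-backed buffers and assembled into a flow graph, can be sketched in a few lines. The Python fragment below mimics that structure with os.pipe and threads; it is a toy two-stage graph for illustration, not the NASA software.

      import os
      import threading

      def block(fn, fd_in, fd_out, chunk=4096):
          """One threaded processing block: read bytes from an input pipe,
          transform them with fn, write to an output pipe. Mirrors the
          idea of independent threaded blocks joined by POSIX pipes."""
          def run():
              while True:
                  data = os.read(fd_in, chunk)
                  if not data:
                      break
                  os.write(fd_out, fn(data))
              os.close(fd_out)  # propagate end-of-stream downstream
          t = threading.Thread(target=run)
          t.start()
          return t

      # Assemble a two-stage flow graph: source -> invert bits -> hex dump.
      r1, w1 = os.pipe()  # source feeds stage 1
      r2, w2 = os.pipe()  # stage 1 feeds stage 2
      r3, w3 = os.pipe()  # stage 2 feeds the sink

      t1 = block(lambda b: bytes(x ^ 0xFF for x in b), r1, w2)
      t2 = block(lambda b: b.hex().upper().encode(), r2, w3)

      os.write(w1, b"\x00\x0f")
      os.close(w1)
      t1.join(); t2.join()
      print(os.read(r3, 4096).decode())  # FFF0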

  1. Software Defined Radio with Parallelized Software Architecture

    NASA Technical Reports Server (NTRS)

    Heckler, Greg

    2013-01-01

    This software implements software-defined radio processing over multi-core, multi-CPU systems in a way that maximizes the use of CPU resources in the system. The software treats each processing step in either a communications or navigation modulator or demodulator system as an independent, threaded block. Each threaded block is defined with a programmable number of input or output buffers; these buffers are implemented using POSIX pipes. In addition, each threaded block is assigned a unique thread upon block installation. A modulator or demodulator system is built by assembly of the threaded blocks into a flow graph, which assembles the processing blocks to accomplish the desired signal processing. This software architecture allows the software to scale effortlessly between single-CPU/single-core computers and multi-CPU/multi-core computers without recompilation. NASA spaceflight and ground communications systems currently rely exclusively on ASICs or FPGAs. This software allows low- and medium-bandwidth (100 bps to approx. 50 Mbps) software defined radios to be designed and implemented solely in C/C++ software, while lowering development costs and facilitating reuse and extensibility.

  2. a New Framework for Object-Based Image Analysis Based on Segmentation Scale Space and Random Forest Classifier

    NASA Astrophysics Data System (ADS)

    Hadavand, A.; Saadatseresht, M.; Homayouni, S.

    2015-12-01

    In this paper a new object-based framework is developed to automate scale selection in image segmentation. The quality of image objects has an important impact on further analyses. Due to the strong dependency of segmentation results on the scale parameter, choosing the best value of this parameter for each class becomes a main challenge in object-based image analysis. We propose a new framework which employs a pixel-based land cover map to estimate the initial scale dedicated to each class. These scales are used to build a segmentation scale space (SSS), a hierarchy of image objects. Optimization of the SSS with respect to the NDVI and DSM values in each super object is used to get the best scale in local regions of the image scene. The optimized SSS segmentations are finally classified to produce the final land cover map. Very high resolution aerial imagery and a digital surface model provided by the ISPRS 2D semantic labelling dataset are used in our experiments. The result of our proposed method is comparable to that of the ESP tool, a well-known method for estimating the segmentation scale, and marginally improved the overall accuracy of classification from 79% to 80%.
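
    Two steps of the framework lend themselves to a short sketch: choosing the scale whose objects are most homogeneous (scored here by within-object NDVI standard deviation) and classifying per-object features with a random forest. All inputs and feature values in the Python fragment below are hypothetical placeholders, not the paper's data or exact criterion.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def best_scale(segmentations, ndvi):
          """Pick the segmentation scale whose objects are most homogeneous
          in NDVI (lowest mean within-object standard deviation), echoing
          the optimization of the segmentation scale space. segmentations
          is a dict {scale: label_image}; ndvi is an NDVI raster."""
          def mean_within_std(labels):
              return np.mean([ndvi[labels == k].std()
                              for k in np.unique(labels)])
          return min(segmentations, key=lambda s: mean_within_std(segmentations[s]))

      # Final step: a random forest over per-object features (e.g. mean
      # NDVI, mean DSM height); the training data are placeholders.
      X_train = np.array([[0.7, 2.0], [0.1, 9.5], [0.6, 1.5], [0.2, 12.0]])
      y_train = np.array(["vegetation", "building", "vegetation", "building"])
      clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
      print(clf.predict([[0.65, 1.8]]))  # -> ['vegetation']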

  3. Object-based task-level control: A hierarchical control architecture for remote operation of space robots

    NASA Technical Reports Server (NTRS)

    Stevens, H. D.; Miles, E. S.; Rock, S. J.; Cannon, R. H.

    1994-01-01

    Expanding man's presence in space requires capable, dexterous robots that can be controlled from the Earth. Traditional 'hand-in-glove' control paradigms require the human operator to directly control virtually every aspect of the robot's operation. While the human provides excellent judgment and perception, human interaction is limited by low bandwidth, delayed communications. These delays make 'hand-in-glove' operation from Earth impractical. In order to alleviate many of the problems inherent to remote operation, Stanford University's Aerospace Robotics Laboratory (ARL) has developed the Object-Based Task-Level Control architecture. Object-Based Task-Level Control (OBTLC) removes the burden of teleoperation from the human operator and enables execution of tasks not possible with current techniques. OBTLC is a hierarchical approach to control where the human operator is able to specify high-level, object-related tasks through an intuitive graphical user interface. Infrequent task-level commands replace constant joystick operations, eliminating communications bandwidth and time delay problems. The details of robot control and task execution are handled entirely by the robot and computer control system. The ARL has implemented the OBTLC architecture on a set of Free-Flying Space Robots. The capability of the OBTLC architecture has been demonstrated by controlling the ARL Free-Flying Space Robots from NASA Ames Research Center.

  4. Object-based change detection in rapid urbanization regions with remotely sensed observations: a case study of Shenzhen, China

    NASA Astrophysics Data System (ADS)

    He, Lihuang; Dong, Guihua; Wang, Wei-Min; Yang, Lijun; Liang, Hong

    2013-10-01

    China, the most populous country on Earth, has experienced rapid urbanization, which is one of the main causes of many environmental and ecological problems. Therefore, the monitoring of rapid urbanization regions and the environment is of critical importance for their sustainable development. In this study, object-based classification is employed to detect land cover change in Shenzhen, which is located in South China and has been urbanized rapidly over the recent three decades. First, four Landsat TM images, acquired in 1990, 2000 and 2010, are selected from the image database. Atmospheric corrections are conducted on these images with the improved dark-object subtraction technique and surface meteorological observations. Geometric correction is processed with ground control points derived from topographic maps. Second, a region growing multi-resolution segmentation and a soft nearest neighbour classifier are used to perform the object-based classification. After analyzing the fractions of the different classes over the time series, we conclude that the comparison of derived land cover classes with socio-economic statistics demonstrates a strong positive correlation between built-up classes and urban population as well as gross GDP and the GDPs of the secondary and tertiary industries. Two different mechanisms of urbanization, namely new land development and redevelopment, are revealed. Consequently, we found that the districts of Shenzhen were urbanized through different mechanisms.

  5. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  6. Achieving strategic surety for high consequence software

    SciTech Connect

    Pollock, G.M.

    1996-09-01

    A strategic surety roadmap for high consequence software systems under the High Integrity Software (HIS) Program at Sandia National Laboratories guides research in identifying methodologies to improve software surety. Selected research tracks within this roadmap are identified and described, detailing current technology and outlining advancements to be pursued over the coming decade to reach HIS goals. The tracks discussed herein focus on Correctness by Design and System Immunology™. Specific projects are discussed with greater detail given on projects involving Correct Specification via Visualization, Synthesis, & Analysis; Visualization of Abstract Objects; and Correct Implementation of Components.

  7. Separation of Concerns in Teaching Software Engineering

    NASA Astrophysics Data System (ADS)

    Alsmadi, Izzat M.; Dieri, Mahmoud

    Software engineering is one of the most recently evolving subjects in research and education. Instructors and books covering this field of study lack a common ground on what subjects should be covered in introductory or advanced courses in the area. In this paper, a proposed ontology for software engineering education is formulated. This ontology divides software engineering projects and study into different perspectives: projects, products, people, process, and tools. Deeper levels of abstraction of those fields can be described at levels that depend on the type or level of the course taught.

  8. Engine Structures Modeling Software System (ESMOSS)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Engine Structures Modeling Software System (ESMOSS) is the development of a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components, and substructures which can be transferred to finite element analysis programs such as NASTRAN. The NASA Lewis Engine Structures Program is concerned with the development of technology for the rational structural design and analysis of advanced gas turbine engines with emphasis on advanced structural analysis, structural dynamics, structural aspects of aeroelasticity, and life prediction. Fundamental and common to all of these developments is the need for geometric and analytical model descriptions at various engine assembly levels which are generated using ESMOSS.

  9. Software Engineering Improvement Plan

    NASA Technical Reports Server (NTRS)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  10. Commercial Data Mining Software

    NASA Astrophysics Data System (ADS)

    Zhang, Qingyu; Segall, Richard S.

    This chapter discusses selected commercial software for data mining, supercomputing data mining, text mining, and web mining. The selected software packages are compared by their features and also applied to available data sets. The software for data mining are SAS Enterprise Miner, Megaputer PolyAnalyst 5.0, PASW (formerly SPSS Clementine), IBM Intelligent Miner, and BioDiscovery GeneSight. The software for supercomputing are Avizo by Visualization Science Group and JMP Genomics from SAS Institute. The software for text mining are SAS Text Miner and Megaputer PolyAnalyst 5.0. The software for web mining are Megaputer PolyAnalyst and SPSS Clementine. Background on related literature and software is presented. Screen shots of each of the selected software packages are presented, as are conclusions and future directions.

  11. Guidelines for software inspections

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Quality control inspections, software problem-finding procedures that provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology, are discussed. The many side benefits include education, documentation, training, and scheduling.

  12. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  13. The Problem of Software.

    ERIC Educational Resources Information Center

    Alexander, Wilma Jean

    1982-01-01

    Explains how schools can purchase computer software. Lists are presented of (1) sources of published evaluations of selected software, (2) publications which contain names and sources of programs, and (3) magazines providing program listings appropriate for classroom use. (CT)

  14. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  15. The Legacy of Space Shuttle Flight Software

    NASA Technical Reports Server (NTRS)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  16. Evaluating software testing strategies

    NASA Technical Reports Server (NTRS)

    Selby, R. W., Jr.; Basili, V. R.; Page, J.; Mcgarry, F. E.

    1984-01-01

    The strategies of code reading, functional testing, and structural testing are compared in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. The major results are the following: (1) Code readers detected more faults than did those using the other techniques, while functional testers detected more faults than did structural testers; (2) Code readers had a higher fault detection rate than did those using the other methods, while there was no difference between functional testers and structural testers; (3) Subjects testing the abstract data type detected the most faults and had the highest fault detection rate, while individuals testing the database maintainer found the fewest faults and spent the most effort testing; (4) Subjects of intermediate and junior expertise were not different in number or percentage of faults found, fault detection rate, or fault detection effort; (5) Subjects of advanced expertise found a greater number of faults than did the others, found a greater percentage of faults than did just those of junior expertise, and were not different from the others in either fault detection rate or effort; and (6) Code readers and functional testers both detected more omission faults and more control faults than did structural testers, while code readers detected more interface faults than did those using the other methods.

  17. DSS command software update

    NASA Technical Reports Server (NTRS)

    Stinnett, W. G.

    1980-01-01

    The modifications, additions, and testing results for a version of the Deep Space Station command software, generated for support of the Voyager Saturn encounter, are discussed. The software update requirements included efforts to: (1) recode portions of the software to permit recovery of approximately 2000 words of memory; (2) correct five Voyager Ground Data System liens; (3) provide the capability to automatically turn off the command processor assembly local printer during periods of low activity; and (4) correct anomalies existing in the software.

  18. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  19. Generating land cover boundaries from remotely sensed data using object-based image analysis: overview and epidemiological application

    PubMed Central

    Maxwell, Susan K.

    2010-01-01

    Satellite imagery and aerial photography represent a vast resource to significantly enhance environmental mapping and modeling applications for use in understanding spatio-temporal relationships between environment and health. Deriving boundaries of land cover objects, such as trees, buildings, and crop fields, from image data has traditionally been performed manually using a very time consuming process of hand digitizing. Boundary detection algorithms are increasingly being applied using object-based image analysis (OBIA) technology to automate the process. The purpose of this paper is to present an overview and demonstrate the application of OBIA for delineating land cover features at multiple scales using a high resolution aerial photograph (1 m) and a medium resolution Landsat image (30 m) time series in the context of a pesticide spray drift exposure application. PMID:21135917

  20. Object-Based Classification of Wetlands Using Optical and SAR Data with a Compound Kernel in Support Vector Machine (SVM)

    NASA Astrophysics Data System (ADS)

    Salehi, B.; Mahdavi, S.; Brisco, B.; Huang, W.

    2015-12-01

    Both Synthetic Aperture Radar (SAR) and optical imagery play a pivotal role in many applications, so it is desirable to fuse the two independent sources of data congruously. Many fusion methods, however, fail to consider the different nature of SAR and optical data. Moreover, it is not straightforward to adjust the contribution of the two data sources with respect to the application. Support Vector Machine (SVM) is one classification method which allows the two kinds of images to be combined while respecting their different natures. It is particularly useful when object-based classification is used, in which case features extracted from SAR and optical images can be treated differently. This paper aims to develop an object-based classification method using both optical and SAR data which treats the two data sources independently. For the implementation of the method, a RapidEye and a RADARSAT-2 quad-polarimetric image over the Avalon Peninsula in Newfoundland, Canada, will be used for wetland classification. The RapidEye image will be segmented using the multiresolution algorithm in eCognition™. Because of speckle, segmentation of SAR images does not give robust results, so the segmentation result from the RapidEye image is superimposed on the RADARSAT-2 image. Useful SAR and optical features are then extracted. To integrate the features extracted from the optical and SAR data, a compound kernel in SVM is applied for classification. This kernel is a combination of two kernels with different weights, each of which operates on the features of one of the data sources. Using a compound kernel can outperform using the same kernel for both images. The proposed method has two main advantages. First, the different nature of optical and SAR images, which results from dissimilar dynamic range, resolution, etc., is considered. Second, as the two data sources are combined with different weights, it is possible to adjust the role of each data source for varying applications.
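
    A minimal sketch of the compound-kernel idea, assuming per-object optical and SAR feature matrices and synthetic labels: two RBF kernels are combined with a weight w and passed to an SVM as a precomputed Gram matrix. Since a nonnegative weighted sum of positive semidefinite kernels is itself a valid kernel, the combination is legitimate for any w in [0, 1]; the weights and gammas below are illustrative.

      import numpy as np
      from sklearn.metrics.pairwise import rbf_kernel
      from sklearn.svm import SVC

      def compound_kernel(A_opt, B_opt, A_sar, B_sar, w=0.6, g_opt=0.1, g_sar=0.1):
          """Weighted sum of one RBF kernel per data source."""
          return (w * rbf_kernel(A_opt, B_opt, gamma=g_opt)
                  + (1.0 - w) * rbf_kernel(A_sar, B_sar, gamma=g_sar))

      rng = np.random.default_rng(0)                   # synthetic stand-in data
      X_opt, X_sar = rng.normal(size=(40, 5)), rng.normal(size=(40, 3))
      y = rng.integers(0, 2, size=40)

      K_train = compound_kernel(X_opt, X_opt, X_sar, X_sar)
      clf = SVC(kernel="precomputed").fit(K_train, y)  # Gram matrix supplied directly
      pred = clf.predict(K_train)                      # for new objects: K(new, train)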

  1. Interaction between object-based attention and pertinence values shapes the attentional priority map of a multielement display.

    PubMed

    Gillebert, Celine R; Petersen, Anders; Van Meel, Chayenne; Müller, Tanja; McIntyre, Alexandra; Wagemans, Johan; Humphreys, Glyn W

    2016-06-01

    Previous studies have shown that the perceptual organization of the visual scene constrains the deployment of attention. Here we investigated how the organization of multiple elements into larger configurations alters their attentional weight, depending on the "pertinence" or behavioral importance of the elements' features. We assessed object-based effects on distinct aspects of the attentional priority map: top-down control, reflecting the tendency to encode targets rather than distracters, and the spatial distribution of attention weights across the visual scene, reflecting the tendency to report elements belonging to the same rather than different objects. In 2 experiments participants had to report the letters in briefly presented displays containing 8 letters and digits, in which pairs of characters could be connected with a line. Quantitative estimates of top-down control were obtained using Bundesen's Theory of Visual Attention (1990). The spatial distribution of attention weights was assessed using the "paired response index" (PRI), indicating responses for within-object pairs of letters. In Experiment 1, grouping along the task-relevant dimension (targets with targets and distracters with distracters) increased top-down control and enhanced the PRI; in contrast, task-irrelevant grouping (targets with distracters) did not affect performance. In Experiment 2, we disentangled the effect of target-target and distracter-distracter grouping: Pairwise grouping of distracters enhanced top-down control whereas pairwise grouping of targets changed the PRI. We conclude that object-based perceptual representations interact with pertinence values (of the elements' features and location) in the computation of attention weights, thereby creating a widespread pattern of attentional facilitation across the visual scene. (PsycINFO Database Record) PMID:26752732

  3. An object-based multisensoral approach for the derivation of urban land use structures in the city of Rostock, Germany

    NASA Astrophysics Data System (ADS)

    Lindner, Martin; Hese, Sören; Berger, Christian; Schmullius, Christiane

    2011-11-01

    The present work is part of the Enviland-2 research project, which investigates the synergy between radar and optical satellite data for ENVIronment and LAND use applications. The urban work package of Enviland aims at the combined analysis of RapidEye and TerraSAR-X data for the parameterization of different urban land use structures. This study focuses on the development of a transferable, object-based rule set for the derivation of urban land use structures at block level. The database consists of RapidEye and TerraSAR-X imagery, as well as height information from a LiDAR nDSM (normalized Digital Surface Model) and object boundaries from ATKIS (Official Topographic Cartographic Information System) vector data for a study area in the city of Rostock, Germany. The classification of various land cover units forms the basis of the analysis. To this end, an object-based land cover classification is implemented that uses feature-level fusion to combine the information of all available input data. Besides spectral values, shape and context features are also employed to characterize and extract specific land cover objects as indicators for the prevalent land use. The different land use structures are then determined by typical combinations and constellations of the extracted land use indicators and land cover proportions. Accuracy assessment is done by utilizing the available ATKIS information. From this analysis the land use structure classes residential, industrial/commercial, other built-up, allotments, sports facility, forest, grassland, other green spaces, squares/parking areas, and water are distinguished with an overall accuracy of 63.2%.

  4. The Software Architecture of the Upgraded ESA DRAMA Software Suite

    NASA Astrophysics Data System (ADS)

    Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger

    2013-08-01

    In the beginnings of man's space flight activities there was a belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to rapid growth of the space debris environment, which strongly influences current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite was developed to support the planning of space missions in compliance with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and to increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper will conclude with an outlook on
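
    The model-view-presenter split mentioned above can be reduced to a toy sketch (in Python here, though the suite's framework is Java-based): the presenter is the only path between a passive view and the processing back-end, so either side can be replaced or tested in isolation. The class and method names are illustrative, not DRAMA's API.

      class Model:
          """Stand-in for a processing back-end call (not DRAMA's API)."""
          def run_analysis(self, params):
              return {"risk": 0.01 * params["altitude_km"]}

      class ConsoleView:
          """A passive view: collects inputs and displays results, no logic."""
          def get_inputs(self):
              return {"altitude_km": 800}
          def show(self, result):
              print("assessed risk:", result["risk"])

      class Presenter:
          """The single mediator between view and back-end."""
          def __init__(self, model, view):
              self.model, self.view = model, view
          def on_run_clicked(self):
              self.view.show(self.model.run_analysis(self.view.get_inputs()))

      Presenter(Model(), ConsoleView()).on_run_clicked()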

  5. Methodology for automating software systems. Task 1 of the foundations for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1989-01-01

    The early stages of a research program designed to establish an experimental research platform for software engineering are described. Major emphasis is placed on Computer Assisted Software Engineering (CASE). The Poor Man's CASE Tool is based on the Apple Macintosh system, employing available software including Focal Point II, Hypercard, XRefText, and Macproject. These programs are functional in themselves, but through advanced linking are available for operation from within the tool being developed. The research platform is intended to merge software engineering technology with artificial intelligence (AI). In the first prototype of the PMCT, however, the sections of AI are not included. CASE tools assist the software engineer in planning goals, routes to those goals, and ways to measure progress. The method described allows software to be synthesized instead of being written or built.

  6. Astronomical Software Directory Service

    NASA Technical Reports Server (NTRS)

    Hanisch, R. J.; Payne, H.; Hayes, J.

    1998-01-01

    This is the final report on the development of the Astronomical Software Directory Service (ASDS), a distributable, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching.

  7. Software Shopper. Revised.

    ERIC Educational Resources Information Center

    Davis, Sandra Hart, Comp.

    This annotated index describes and illustrates a wide selection of public domain instructional software that may be useful in the education of deaf students and provides educators with a way to order the listed programs. The software programs are designed for use on Apple computers and their compatibles. The software descriptions are presented in…

  8. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  9. Java for flight software

    NASA Technical Reports Server (NTRS)

    Benowitz, E.; Niessner, A.

    2003-01-01

    This work involves developing representative mission-critical spacecraft software using the Real-Time Specification for Java (RTSJ). It currently leverages actual flight software from NASA's Deep Space 1 (DS1) mission, which flew in 1998.

  10. Artificial intelligence and expert systems in-flight software testing

    NASA Technical Reports Server (NTRS)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  11. Bioinformatics software resources.

    PubMed

    Gilbert, Don

    2004-09-01

    This review looks at internet archives, repositories and lists for obtaining popular and useful biology and bioinformatics software. Resources include collections of free software, services for the collaborative development of new programs, software news media and catalogues of links to bioinformatics software and web tools. Problems with such resources arise from needs for continued curator effort to collect and update these, combined with less than optimal community support, funding and collaboration. Despite some problems, the available software repositories provide needed public access to many tools that are a foundation for analyses in bioscience research efforts.

  12. Tracker 300 Software

    SciTech Connect

    Wysor, R. Wes

    2006-01-12

    The Tracker300 software is downloaded to an off-the-shelf product called the RCM3400/RCM3410, made by Rabbit Semiconductor. The software is a closed-loop controller which computes the sun's position and provides stability compensation. Using the RCM3400/RCM3410 module, the software stores and retrieves parameters from the onboard flash. The software also allows for communication with a host: it allows parameters to be downloaded or uploaded, shows the status of the controller, provides real-time feedback, and sends command acknowledgements. The software also captures the GPS response and ensures the internal clock is set correctly.

  13. CHP Installed Capacity Optimizer Software

    2004-11-30

    The CHP Installed Capacity Optimizer is a Microsoft Excel spreadsheet application that determines the most economic amount of capacity of distributed generation and thermal utilization equipment (e.g., absorption chillers) to install for any user-defined set of load and cost data. Installing the optimum amount of capacity is critical to the life-cycle economic viability of a distributed generation/cooling heat and power (CHP) application. Using advanced optimization algorithms, the software accesses the loads, utility tariffs, equipment costs, etc., and provides to the user the most economic amount of system capacity to install.
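
    The optimization the tool performs might be sketched as a brute-force search over candidate capacities for the minimum life-cycle cost, given a load profile and prices. All figures below are illustrative assumptions, not the tool's data or algorithm.

      import numpy as np

      hourly_load_kw = 500 + 300 * np.sin(np.linspace(0, 2 * np.pi, 8760))  # toy profile
      capital_per_kw = 1200.0                    # $/kW installed (assumed)
      grid_price, chp_price = 0.12, 0.07         # $/kWh (assumed)
      years = 10

      def life_cycle_cost(capacity_kw):
          onsite = np.minimum(hourly_load_kw, capacity_kw)  # CHP serves what it can
          grid = hourly_load_kw - onsite                    # remainder from the grid
          annual = onsite.sum() * chp_price + grid.sum() * grid_price
          return capacity_kw * capital_per_kw + years * annual

      candidates = np.arange(0, 1001, 50)                   # capacities to evaluate, kW
      best = min(candidates, key=life_cycle_cost)
      print("most economic capacity:", best, "kW")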

  14. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance discussed, and the necessity for enhancements to the current processes shall be highlighted. PMID:17238324

  15. Mapping Arctic Coastline Change With Object-Based Image Analysis of Temporally and Geographically Distributed Landsat Archive Data

    NASA Astrophysics Data System (ADS)

    Hulslander, D.

    2011-12-01

    As a global phenomenon, climate change produces global effects. However, many of these effects are more intense in coastal and high-latitude regions. Current longer periods of ice-free conditions, in combination with a rising sea level and thawing permafrost, can result in accelerated Arctic Ocean coastline change and erosion. Areas dominantly composed of ice-cemented peats and silt-rich permafrost have proven to be especially susceptible to rapid erosion. Anderson et al. (2009; Geology News) have measured erosion rates of 15 m per year at sites along the Alaskan Arctic Ocean coast. The continental scope of these changes, as well as the remote and inhospitable nature of the study area, make geologic remote sensing techniques particularly well suited for studying coastal erosion along the 45,000 km of Arctic Ocean coastline. While it is valuable to determine current patterns of erosion, it is equally important to map historic rates in order to determine if coastal erosion is accelerating, if it is in a new behavioral regime, if there are areas of emergent erosion patterns, or if what is currently measured is only a single instance in a complex and constantly shifting pattern of an overall balance of erosion and deposition at high latitudes. Even in relatively stable conditions, coastline processes are dynamic and complex, making it especially important to ensure the best possible accuracy in a study of this kind. Remote sensing solutions in the earth sciences have often run into obstacles concerning a lack of historic data and baselines as well as issues in the systemization of accurate feature mapping. Using object-based image analysis techniques on Landsat archive data allows for the possibility of a multi-decadal map of Arctic Ocean coastline changes. Landsat data (from sensors MSS 1-3 and TM/ETM 4, 5, and 7) provide imagery as frequently as every 16 days since July 1972, are well-calibrated both radiometrically and geometrically, and are freely available from

  16. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  17. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mbaya, Timmy; Menghoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft and UAVs, depend more and more on highly complex, safety-critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We focus on the approach to developing reliable and robust health models for the combined software and sensor systems.
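
    A minimal numerical analogue of the on-board reasoning: a two-node Bayesian network relating a latent fault to an observed monitor alarm, evaluated by direct application of Bayes' rule. The probabilities are illustrative assumptions; real health models would involve many interdependent sensor and software nodes.

      p_fault = 0.01                      # prior probability of a health fault (assumed)
      p_alarm_given_fault = 0.95          # monitor fires when a fault is present (assumed)
      p_alarm_given_ok = 0.05             # false-alarm rate (assumed)

      def posterior_fault(alarm: bool) -> float:
          """P(fault | observation) by Bayes' rule over the two-node network."""
          like_f = p_alarm_given_fault if alarm else 1 - p_alarm_given_fault
          like_ok = p_alarm_given_ok if alarm else 1 - p_alarm_given_ok
          joint_f = like_f * p_fault
          return joint_f / (joint_f + like_ok * (1 - p_fault))

      print(posterior_fault(alarm=True))  # belief in a fault after one alarm (~0.16)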

  18. OUTSHORE Maturity Model: Assistance for Software Offshore Outsourcing Decisions

    NASA Astrophysics Data System (ADS)

    Mäkiö, Juho; Betz, Stafanie; Oberweis, Andreas

    Offshore outsourcing software development (OOSD) is increasingly being used by the software industry. OOSD is a specific variant of geographically distributed software development (GDSD). Compared to the traditional, in-house mode of software development, GDSD is riskier and puts the attainment of the expected results at risk. Although the failure of an offshore outsourcing software project may be caused by a variety of factors, one major complication is geographical distance. Consequently we argue that risk avoidance in outshore software development should be undertaken well in advance of the development launch. This could be done by testing the offshore outsourcing relevance of each software project and then of the offshore outsourcing company involved. With this in mind we have developed the OUTSHORE Maturity Model (OMM).

  19. Simple solution to the medical instrumentation software problem

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to 'Good Manufacturing Practices' (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of and compliance with a software process which ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides a very broad range of functionality, from embedded real-time to management information systems, as required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  20. Improving Students Pronunciation through Accent Reduction Software

    ERIC Educational Resources Information Center

    Seferoğlu, Gölge

    2005-01-01

    This study aimed to find out whether integrating accent reduction software in advanced English language classes at the university level would result in improvements in students' pronunciation at the segmental and suprasegmental levels. The study made use of a quasi-experimental research design. Two classes at the Department of Foreign Language…

  1. A Markov random field approach for topology-preserving registration: application to object-based tomographic image interpolation.

    PubMed

    Cordero-Grande, Lucilio; Vegas-Sánchez-Ferrero, Gonzalo; Casaseca-de-la-Higuera, Pablo; Alberola-López, Carlos

    2012-04-01

    This paper proposes a topology-preserving multiresolution elastic registration method based on a discrete Markov random field of deformations and a block-matching procedure. The method is applied to the object-based interpolation of tomographic slices. For that purpose, the fidelity of a given deformation to the data is established by a block-matching strategy based on intensity- and gradient-related features, the smoothness of the transformation is favored by an appropriate prior on the field, and the deformation is guaranteed to maintain the topology by imposing some hard constraints on the local configurations of the field. The resulting deformation is defined as the maximum a posteriori configuration. Additionally, the relative influence of the fidelity and smoothness terms is weighted by the unsupervised estimation of the field parameters. In order to obtain an unbiased interpolation result, the registration is performed both in the forward and backward directions, and the resulting transformations are combined by using the local information content of the deformation. The method is applied to magnetic resonance and computed tomography acquisitions of the brain and the torso. Quantitative comparisons offer an overall improvement in performance with respect to related works in the literature. Additionally, the application of the interpolation method to cardiac magnetic resonance images has shown that the removal of any of the main components of the algorithm results in a decrease in performance which has proven to be statistically significant.
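
    A toy analogue of the MAP formulation described above: a discrete displacement field is chosen to minimize a block-matching data term plus a smoothness prior, here with a naive iterated-conditional-modes (ICM) sweep. The sketch omits the paper's multiresolution scheme, the topology-preserving hard constraints, and the unsupervised parameter estimation; all names and constants are illustrative.

      import numpy as np

      DISPS = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)]

      def data_cost(src, dst, i, j, d, block=3):
          """Block-matching dissimilarity for displacement d at site (i, j)."""
          a = src[i:i + block, j:j + block]
          b = dst[i + d[0]:i + d[0] + block, j + d[1]:j + d[1] + block]
          return float(((a - b) ** 2).mean())

      def icm(src, dst, lam=0.5, sweeps=3, block=3):
          """Estimate a discrete displacement field by minimizing
          data cost + lam * smoothness with iterated conditional modes."""
          H, W = src.shape
          labels = np.full((H, W), DISPS.index((0, 0)), dtype=int)
          for _ in range(sweeps):
              for i in range(1, H - block - 1):
                  for j in range(1, W - block - 1):
                      def energy(k, i=i, j=j):
                          d = DISPS[k]
                          smooth = sum(abs(d[0] - DISPS[labels[n]][0]) +
                                       abs(d[1] - DISPS[labels[n]][1])
                                       for n in ((i-1, j), (i+1, j), (i, j-1), (i, j+1)))
                          return data_cost(src, dst, i, j, d, block) + lam * smooth
                      labels[i, j] = min(range(len(DISPS)), key=energy)
          return labels

      src = np.random.rand(16, 16)
      field = icm(src, np.roll(src, 1, axis=1))   # should favor a (0, 1)-like shift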

  2. The perceptual root of object-based storage: an interactive model of perception and visual working memory.

    PubMed

    Gao, Tao; Gao, Zaifeng; Li, Jie; Sun, Zhongqiang; Shen, Mowei

    2011-12-01

    Mainstream theories of visual perception assume that visual working memory (VWM) is critical for integrating online perceptual information and constructing coherent visual experiences in changing environments. Given the dynamic interaction between online perception and VWM, we propose that how visual information is processed during visual perception can directly determine how the information is going to be selected, consolidated, and maintained in VWM. We demonstrate the validity of this hypothesis by investigating what kinds of perceptual information can be stored as integrated objects in VWM. Three criteria for object-based storage are introduced: (a) automatic selection of task-irrelevant features, (b) synchronous consolidation of multiple features, and (c) stable maintenance of feature conjunctions. The results show that the outputs of parallel perception meet all three criteria, as opposed to the outputs of serial attentive processing, which fail all three criteria. These results indicate that (a) perception and VWM are not two sequential processes, but are dynamically intertwined; (b) there are dissociated mechanisms in VWM for storing information identified at different stages of perception; and (c) the integrated object representations in VWM originate from the "preattentive" or "proto" objects created by parallel perception. These results suggest how visual perception, attention, and VWM can be explained by a unified framework.

  3. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis

    NASA Astrophysics Data System (ADS)

    MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.

    2012-01-01

    Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack of precise canopy estimates has hindered quantification of these benefits in many municipalities. This problem was addressed for New York City using object-based image analysis (OBIA) to develop a comprehensive land-cover map, including tree canopy to the scale of individual trees. Mapping was performed using a rule-based expert system that relied primarily on high-resolution LIDAR, specifically its capacity for evaluating the height and texture of aboveground features. Multispectral imagery was also used, but shadowing and varying temporal conditions limited its utility. Contextual analysis was a key part of classification, distinguishing trees according to their physical and spectral properties as well as their relationships to adjacent, nonvegetated features. The automated product was extensively reviewed and edited via manual interpretation, and overall per-pixel accuracy of the final map was 96%. Although manual editing had only a marginal effect on accuracy despite requiring a majority of project effort, it maximized aesthetic quality and ensured the capture of small, isolated trees. Converting high-resolution LIDAR and imagery into usable information is a nontrivial exercise, requiring significant processing time and labor, but an expert system-based combination of OBIA and manual review was an effective method for fine-scale canopy mapping in a complex urban environment.
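
    A compact sketch of the kind of rule an expert system like the one described might contain: pixels that are tall (nDSM height), spectrally vegetated (NDVI), and texturally rough become tree canopy. The thresholds and helper names are illustrative assumptions, not the study's rule set.

      import numpy as np
      from scipy.ndimage import generic_filter

      def canopy_mask(ndsm, ndvi, min_height=2.0, min_ndvi=0.3, min_rough=0.2):
          """Boolean tree-canopy mask from LIDAR height and a vegetation index."""
          rough = generic_filter(ndsm, np.std, size=3)   # local height texture
          tall = ndsm >= min_height                      # aboveground features
          green = ndvi >= min_ndvi                       # spectrally vegetated
          return tall & green & (rough >= min_rough)     # rough tops reject flat roofs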

  4. Copula regression analysis of simultaneously recorded frontal eye field and inferotemporal spiking activity during object-based working memory.

    PubMed

    Hu, Meng; Clark, Kelsey L; Gong, Xiajing; Noudoost, Behrad; Li, Mingyao; Moore, Tirin; Liang, Hualou

    2015-06-10

    Inferotemporal (IT) neurons are known to exhibit persistent, stimulus-selective activity during the delay period of object-based working memory tasks. Frontal eye field (FEF) neurons show robust, spatially selective delay period activity during memory-guided saccade tasks. We present a copula regression paradigm to examine the neural interaction of these two types of signals between areas IT and FEF of the monkey during a working memory task. This paradigm is based on copula models that can account for both the marginal distribution over the spiking activity of individual neurons within each area and the joint distribution over the ensemble activity of neurons between areas. Taking the popular generalized linear models (GLMs) as marginal models, we developed a general and flexible likelihood framework that uses the copula to integrate separate GLMs into a joint regression analysis. Such joint analysis essentially leads to a multivariate analog of the marginal GLM theory and hence efficient model estimation. In addition, we show that Granger causality between spike trains can be readily assessed via the likelihood-ratio statistic. The performance of this method is validated by extensive simulations and compares favorably to the widely used GLMs. When applied to the spiking activity of simultaneously recorded FEF and IT neurons during the working memory task, we observed significant Granger-causal influence from FEF to IT, but not in the opposite direction, suggesting a role for the FEF in the selection and retention of visual information during working memory. The copula model has the potential to provide unique neurophysiological insights about network properties of the brain. PMID:26063909
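
    The likelihood-ratio assessment of Granger causality can be sketched with plain Poisson GLMs (deliberately omitting the paper's copula coupling): compare an IT spike-count model with and without lagged FEF terms. The data below are simulated, and the single-lag design is an illustrative simplification.

      import numpy as np
      import statsmodels.api as sm
      from scipy.stats import chi2

      rng = np.random.default_rng(1)
      T = 2000
      fef = rng.poisson(1.0, T)                          # FEF spike counts per bin
      lagged_fef = np.concatenate(([0], fef[:-1]))
      it = rng.poisson(np.exp(-0.5 + 0.2 * lagged_fef))  # IT driven by lagged FEF

      y = it[1:]                                         # predict IT from lag-1 history
      X_reduced = sm.add_constant(it[:-1])               # own history only
      X_full = sm.add_constant(np.column_stack([it[:-1], fef[:-1]]))

      ll_r = sm.GLM(y, X_reduced, family=sm.families.Poisson()).fit().llf
      ll_f = sm.GLM(y, X_full, family=sm.families.Poisson()).fit().llf

      lr = 2 * (ll_f - ll_r)                             # likelihood-ratio statistic
      print("p-value:", chi2.sf(lr, df=1))               # small p: FEF -> IT influence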

  5. Seismic zonation of Port-Au-Prince using pixel- and object-based imaging analysis methods on ASTER GDEM

    USGS Publications Warehouse

    Yong, Alan; Hough, Susan E.; Cox, Brady R.; Rathje, Ellen M.; Bachhuber, Jeff; Dulberg, Ranon; Hulslander, David; Christiansen, Lisa; and Abrams, Michael J.

    2011-01-01

    We report about a preliminary study to evaluate the use of semi-automated imaging analysis of remotely-sensed DEM and field geophysical measurements to develop a seismic-zonation map of Port-au-Prince, Haiti. For in situ data, VS30 values are derived from the MASW technique deployed in and around the city. For satellite imagery, we use an ASTER GDEM of Hispaniola. We apply both pixel- and object-based imaging methods on the ASTER GDEM to explore local topography (absolute elevation values) and classify terrain types such as mountains, alluvial fans and basins/near-shore regions. We assign NEHRP seismic site class ranges based on available VS30 values. A comparison of results from imagery-based methods to results from traditional geologic-based approaches reveals good overall correspondence. We conclude that image analysis of RS data provides reliable first-order site characterization results in the absence of local data and can be useful to refine detailed site maps with sparse local data.

  6. Seismic-zonation of Port-au-Prince using pixel- and object-based imaging analysis methods on ASTER GDEM

    USGS Publications Warehouse

    Yong, A.; Hough, S.E.; Cox, B.R.; Rathje, E.M.; Bachhuber, J.; Dulberg, R.; Hulslander, D.; Christiansen, L.; Abrams, M.J.

    2011-01-01

    We report about a preliminary study to evaluate the use of semi-automated imaging analysis of remotely-sensed DEM and field geophysical measurements to develop a seismic-zonation map of Port-au-Prince, Haiti. For in situ data, Vs30 values are derived from the MASW technique deployed in and around the city. For satellite imagery, we use an ASTER GDEM of Hispaniola. We apply both pixel- and object-based imaging methods on the ASTER GDEM to explore local topography (absolute elevation values) and classify terrain types such as mountains, alluvial fans and basins/near-shore regions. We assign NEHRP seismic site class ranges based on available Vs30 values. A comparison of results from imagery-based methods to results from traditional geologic-based approaches reveals good overall correspondence. We conclude that image analysis of RS data provides reliable first-order site characterization results in the absence of local data and can be useful to refine detailed site maps with sparse local data. © 2011 American Society for Photogrammetry and Remote Sensing.

  7. Development of an object-based classification model for mapping mountainous forest cover at high elevation using aerial photography

    NASA Astrophysics Data System (ADS)

    Lateb, Mustapha; Kalaitzidis, Chariton; Tompoulidou, Maria; Gitas, Ioannis

    2016-08-01

    Climate change and overall temperature increase result in changes in forest cover at high elevations. Due to the long life cycle of trees, these changes are very gradual and can be observed only over long periods of time. To use remote sensing imagery for this purpose, it needs to have very high spatial resolution and to have been acquired at least 50 years ago. At the moment, the only type of remote sensing imagery with these characteristics is historical black-and-white aerial photography. This study used an aerial photograph from 1945 to map the forest cover of the Olympus National Park at that date. An object-based classification (OBC) model was developed to classify forest and discriminate it from other types of vegetation. Due to the lack of near-infrared information, the model had to rely solely on the tone of the objects, as well as their geometric characteristics. The model operates on three segmentation levels, using sub-/super-object relationships and utilising vegetation density to discriminate forest from non-forest vegetation. The accuracy of the classification was assessed using 503 visually interpreted and randomly distributed points, resulting in a 92% overall accuracy. The model uses unbiased parameters that are important for differentiating between forest and non-forest vegetation and should be transferable to other study areas of mountainous forests at high elevations.

  8. COTS software selection process.

    SciTech Connect

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica

    2006-05-01

    Today's need for rapid software development has generated great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select the COTS product that best meets the software project requirements while leveraging the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with concise, structured, step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible enough to allow customization or tailoring to meet various projects' requirements.
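
    One step such a process typically includes is a weighted scoring matrix; a minimal sketch under assumed criteria and weights follows. Neither the criteria nor the scores come from the paper; they are illustrative placeholders.

      # Rank candidate COTS packages by criterion scores and weights (illustrative).
      weights = {"fit_to_requirements": 0.40, "vendor_support": 0.20,
                 "architecture_fit": 0.25, "cost": 0.15}

      candidates = {
          "Package A": {"fit_to_requirements": 8, "vendor_support": 6,
                        "architecture_fit": 9, "cost": 5},
          "Package B": {"fit_to_requirements": 7, "vendor_support": 9,
                        "architecture_fit": 6, "cost": 8},
      }

      def weighted_score(scores):
          """Sum of criterion scores scaled by their weights."""
          return sum(weights[c] * scores[c] for c in weights)

      for name in sorted(candidates, key=lambda n: weighted_score(candidates[n]),
                         reverse=True):
          print(f"{name}: {weighted_score(candidates[name]):.2f}")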

  9. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  10. Gammasphere software development

    SciTech Connect

    Piercey, R.B.

    1993-01-01

    Activities of the nuclear physics group are described. Progress was made in organizing the Gammasphere Software Working Group, establishing a nuclear computing facility, participating in software development at Lawrence Berkeley, developing a common data file format, and adapting the ORNL UPAK software to run at Gammasphere. A universal histogram object was developed that defines a file format and provides for an object-oriented programming model. An automated liquid nitrogen fill system was developed for Gammasphere (110 Ge detectors comprise the sphere).

  11. Dtest Testing Software

    NASA Technical Reports Server (NTRS)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
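
    A hedged sketch of the scanning behavior described (not the actual dtest source): walk a directory tree, find directories matching a pattern, and run whatever test command each one's configuration file declares, distributing the runs over CPU cores. The DTESTDEFS file name is a hypothetical placeholder for the simple configuration files the abstract mentions.

      import fnmatch
      import os
      import subprocess
      from concurrent.futures import ProcessPoolExecutor

      def find_test_dirs(root, pattern="test_*"):
          """Yield subdirectories whose names match the given pattern."""
          for dirpath, dirnames, _ in os.walk(root):
              for d in dirnames:
                  if fnmatch.fnmatch(d, pattern):
                      yield os.path.join(dirpath, d)

      def run_tests_in(directory):
          """Run the test command declared in a per-directory config file."""
          cfg = os.path.join(directory, "DTESTDEFS")   # hypothetical config file name
          if not os.path.isfile(cfg):
              return directory, True                   # nothing to run here
          with open(cfg) as f:
              cmd = f.read().strip()
          ok = subprocess.run(cmd, shell=True, cwd=directory).returncode == 0
          return directory, ok

      if __name__ == "__main__":
          with ProcessPoolExecutor() as pool:          # spread tests over CPU cores
              for d, ok in pool.map(run_tests_in, list(find_test_dirs("."))):
                  print(("PASS " if ok else "FAIL ") + d)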

  12. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2005-01-01

    NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.

  13. Software Reuse Issues

    NASA Technical Reports Server (NTRS)

    Voigt, Susan J. (Editor); Smith, Kathryn A. (Editor)

    1989-01-01

    NASA Langley Research Center sponsored a Workshop on NASA Research in Software Reuse on November 17-18, 1988 in Melbourne, Florida, hosted by Software Productivity Solutions, Inc. Participants came from four NASA centers and headquarters, eight NASA contractor companies, and three research institutes. Presentations were made on software reuse research at the four NASA centers; on Eli, the reusable software synthesis system designed and currently under development by SPS; on Space Station Freedom plans for reuse; and on other reuse research projects. This publication summarizes the presentations made and the issues discussed during the workshop.

  14. On Software Compatibility.

    ERIC Educational Resources Information Center

    Ershov, Andrei P.

    The problem of compatibility of software hampers the development of computer applications. One solution lies in the standardization of languages, terms, peripherals, operating systems, and computer characteristics. (AB)

  15. Architecture for Verifiable Software

    NASA Technical Reports Server (NTRS)

    Reinholtz, William; Dvorak, Daniel

    2005-01-01

    Verifiable MDS Architecture (VMA) is a software architecture that facilitates the construction of highly verifiable flight software for NASA's Mission Data System (MDS), especially for smaller missions subject to cost constraints. More specifically, the purpose served by VMA is to facilitate aggressive verification and validation of flight software while imposing a minimum of constraints on overall functionality. VMA exploits the state-based architecture of the MDS and partitions verification issues into elements susceptible to independent verification and validation, in such a manner that scaling issues are minimized, so that relatively large software systems can be aggressively verified in a cost-effective manner.

  16. 2006 XSD Scientific Software Workshop report.

    SciTech Connect

    Evans, K., Jr.; De Carlo, F.; Jemian, P.; Lang, J.; Lienert, U.; Maclean, J.; Newville, M.; Tieman, B.; Toby, B.; van Veenendaal, B.; Univ. of Chicago

    2006-01-22

    In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and the implementation of theory to software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

  17. HydroShare: Applying professional software engineering to a new NSF-funded large software project

    NASA Astrophysics Data System (ADS)

    Idaszak, R.; Tarboton, D. G.; Ames, D.; Saleem Arrigo, J. A.; Band, L. E.; Bedig, A.; Castronova, A. M.; Christopherson, L.; Coposky, J.; Couch, A.; Dash, P.; Gan, T.; Goodall, J.; Gustafson, K.; Heard, J.; Hooper, R. P.; Horsburgh, J. S.; Jackson, S.; Johnson, H.; Maidment, D. R.; Mbewe, P.; Merwade, V.; Miles, B.; Reeder, S.; Russell, T.; Song, C.; Taylor, A.; Thakur, S.; Valentine, D. W.; Whiteaker, T. L.

    2013-12-01

    HydroShare is an online, collaborative system being developed for sharing hydrologic data and models as part of the NSF's Software Infrastructure for Sustained Innovation (SI2) program (NSF collaborative award numbers 1148453 and 1148090). HydroShare involves a large software development effort requiring cooperative research and distributed software development between domain scientists, professional software engineers (here 'professional' denotes previous commercial experience in the application of modern software engineering), and university software developers. HydroShare expands upon the data sharing capabilities of the Hydrologic Information System of the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI) by broadening the classes of data accommodated, expanding capability to include the sharing of models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. With a goal of enabling better science concomitant with improved sustainable software practices, we will describe our approach, experiences, and lessons learned thus far in applying professional software engineering to a large NSF-funded software project from the project's onset.

  18. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  19. Teaching Social Software with Social Software

    ERIC Educational Resources Information Center

    Mejias, Ulises

    2006-01-01

    Ulises Mejias examines how social software--information and communications technologies that facilitate the collaboration and exchange of ideas--enables students to participate in distributed research, an approach to learning in which knowledge is collectively constructed and shared. During Fall 2005, Mejias taught a graduate seminar that provided…

  20. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that the products meet NASA requirements for reliability measurement. For the new software reliability models of the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting existing software reliability models to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.
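
    As a concrete example of the kind of model such a tool implements, the sketch below fits the classical Goel-Okumoto software reliability growth model, m(t) = a(1 - e^(-bt)), to a list of failure times by maximum likelihood. This is a generic textbook model with synthetic data, not the SMERFS'3 implementation.

      import numpy as np
      from scipy.optimize import minimize

      def neg_log_likelihood(params, times, T):
          """Negative log-likelihood of the Goel-Okumoto NHPP, m(t) = a(1 - exp(-b t)),
          for failure times observed on [0, T]."""
          a, b = params
          if a <= 0 or b <= 0:
              return np.inf
          return a * (1 - np.exp(-b * T)) - np.sum(np.log(a * b) - b * times)

      # Illustrative failure times (hours); real data would come from test logs.
      times = np.array([10., 32., 49., 80., 120., 180., 250., 330., 420., 550.])
      T = 600.0
      fit = minimize(neg_log_likelihood, x0=[len(times) * 2.0, 1.0 / T],
                     args=(times, T), method="Nelder-Mead")
      a, b = fit.x
      print("expected total faults a = %.1f, detection rate b = %.5f" % (a, b))
      print("expected remaining faults: %.1f" % (a * np.exp(-b * T)))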

  1. Object-Based Classification of Ikonos Imagery for Mapping Large-Scale Vegetation Communities in Urban Areas

    PubMed Central

    Mathieu, Renaud; Aryal, Jagannath; Chong, Albert K.

    2007-01-01

    Effective assessment of biodiversity in cities requires detailed vegetation maps. To date, most remote sensing of urban vegetation has focused on thematically coarse land cover products. Detailed habitat maps are created by manual interpretation of aerial photographs, but this is time consuming and costly at large scale. To address this issue, we tested the effectiveness of object-based classifications that use automated image segmentation to extract meaningful ground features from imagery. We applied these techniques to very high resolution multispectral Ikonos images to produce vegetation community maps in Dunedin City, New Zealand. An Ikonos image was orthorectified and a multi-scale segmentation algorithm used to produce a hierarchical network of image objects. The upper level included four coarse strata: industrial/commercial (commercial buildings), residential (houses and backyard private gardens), vegetation (vegetation patches larger than 0.8/1 ha), and water. We focused on the vegetation stratum, which was segmented at a more detailed level to extract and classify fifteen classes of vegetation communities. The first classification yielded a moderate overall classification accuracy (64%, κ = 0.52), which led us to consider a simplified classification with ten vegetation classes. The overall classification accuracy from the simplified classification was 77%, with a κ value close to the excellent range (κ = 0.74). These results compared favourably with similar studies in other environments. We conclude that this approach does not provide maps as detailed as those produced by manually interpreting aerial photographs, but it can still extract ecologically significant classes. It is an efficient way to generate accurate and detailed maps in a significantly shorter time. The final map accuracy could be improved by integrating segmentation, automated and manual classification in the mapping process, especially when considering important vegetation classes with limited
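
    The accuracy figures quoted above (overall accuracy and κ) come from a standard confusion-matrix assessment. A minimal sketch of the two computations, using an arbitrary example matrix rather than the study's data:

      import numpy as np

      def overall_accuracy_and_kappa(cm):
          """Compute overall accuracy and Cohen's kappa from a confusion matrix
          (rows = reference classes, columns = mapped classes)."""
          cm = np.asarray(cm, dtype=float)
          n = cm.sum()
          po = np.trace(cm) / n                                 # observed agreement
          pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
          return po, (po - pe) / (1 - pe)

      # Toy 3-class confusion matrix, purely illustrative.
      cm = [[50,  5,  5],
            [ 4, 40,  6],
            [ 6,  4, 30]]
      oa, kappa = overall_accuracy_and_kappa(cm)
      print("overall accuracy = %.2f, kappa = %.2f" % (oa, kappa))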

  2. Multiscale object-based drought monitoring and comparison in rainfed and irrigated agriculture from Landsat 8 OLI imagery

    NASA Astrophysics Data System (ADS)

    Ozelkan, Emre; Chen, Gang; Ustundag, Burak Berk

    2016-02-01

    Drought is a rapidly growing environmental issue that can cause damage to nature and the socio-economy that is difficult or impossible to repair. This is especially true for a region with an arid/semi-arid climate, such as Turkey's most important agricultural district, Southeast Anatolia. In this area, we examined the uncertainties of applying Landsat 8 Operational Land Imager (OLI) NDVI data to estimate meteorological drought - the Standardized Precipitation Index (SPI) - measured from 31 in-situ agro-meteorological monitoring stations during the spring and summer of 2013 and 2014. Our analysis was designed to address two important, yet under-examined questions: (i) how does the co-existence of rainfed and irrigated agriculture affect remote sensing drought monitoring in an arid/semi-arid region? (ii) what is the role of spatial scale in drought monitoring using a GEOBIA (geographic object-based image analysis) framework? Results show that spatial scale exerted a strong impact on drought monitoring, especially in the drier year 2013, during which small scales were generally found to outperform large scales. In addition, considering irrigated and rainfed areas separately ensured a better performance in drought analysis. In contrast to the positive correlations between SPI and NDVI over the rainfed areas, negative correlations were determined over the irrigated agricultural areas. Finally, the time lag effect was evident in the study, i.e., strong correlations between spring SPI and summer NDVI in both 2013 and 2014. This reflects the fact that spring watering is crucial for the growth and yield of the major crops (i.e., winter wheat, barley and lentil) cultivated in the region.
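
    The core of such an analysis is a correlation between station SPI and object-averaged NDVI, computed separately for rainfed and irrigated areas. The sketch below illustrates this with synthetic arrays standing in for the Landsat and station data; none of the numbers are from the study.

      import numpy as np
      from scipy.stats import pearsonr

      def lagged_correlation(spi_spring, ndvi_summer):
          """Pearson correlation between spring SPI and summer NDVI across stations,
          capturing the time-lag effect described above."""
          r, p = pearsonr(spi_spring, ndvi_summer)
          return r, p

      rng = np.random.default_rng(0)
      spi = rng.normal(size=31)                      # 31 stations, as in the study
      ndvi_rainfed = 0.1 + 0.05 * spi + rng.normal(scale=0.02, size=31)
      ndvi_irrigated = 0.4 - 0.03 * spi + rng.normal(scale=0.02, size=31)

      for name, ndvi in [("rainfed", ndvi_rainfed), ("irrigated", ndvi_irrigated)]:
          r, p = lagged_correlation(spi, ndvi)
          print("%s: r = %+.2f (p = %.3f)" % (name, r, p))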

  3. Land cover changes assessment using object-based image analysis in the Binah River watershed (Togo and Benin)

    NASA Astrophysics Data System (ADS)

    Badjana, Hèou Maléki; Helmschrot, Jörg; Selsam, Peter; Wala, Kpérkouma; Flügel, Wolfgang-Albert; Afouda, Abel; Akpagana, Koffi

    2015-10-01

    In this study, land cover changes between 1972 and 2013 were investigated in the Binah River watershed (North of Togo and Benin) using remote sensing and geographic information system technologies. Multitemporal satellite images—Landsat MSS (1972), TM (1987), and OLI-TIRS (2013)—were processed using object-based image analysis and post-classification comparison methods, including landscape metrics and change trajectory analysis. Land cover maps referring to five main land cover classes, namely agricultural land, forest land, savannah, settlements, and water bodies, were produced for each acquisition date. The overall accuracies were 76.64% (1972), 83.52% (1987), and 88.84% (2013), with respective Kappa statistics of 0.69, 0.78, and 0.86. The assessment of the spatiotemporal pattern of land cover changes indicates that savannah, the main vegetation type, has undergone the most dominant change, decreasing from 67% of the basin area in 1972 to 56% in 1987 and 33% in 2013. At the same time, agricultural land has significantly increased from 15% in 1972 to 24% in 1987 and 43% in 2013, while some proportions of agricultural land were converted to savannah owing to fallow agriculture. In total, more than 55% of the landscape experienced changes between 1972 and 2013. These changes are primarily due to human activities and population growth. In addition, agricultural activities significantly contributed to the increase in the number of patches, degree of division, and splitting index of forest and savannah vegetation and the decrease in their effective mesh sizes. These results indicate further fragmentation of forest and savannah vegetation between 1972 and 2013. Further research is needed to quantitatively evaluate the influences of individual factors of human activities and to separate these from the impacts of climate change-driven disturbances.
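
    Post-classification comparison of this kind rests on a change (cross-tabulation) matrix between the two classified maps. A minimal sketch, assuming the maps are available as co-registered class-label arrays (the arrays here are synthetic):

      import numpy as np

      def change_matrix(map_t1, map_t2, n_classes):
          """Cross-tabulate two co-registered classified maps: entry [i, j] is the
          number of pixels that were class i at t1 and class j at t2."""
          idx = map_t1.ravel() * n_classes + map_t2.ravel()
          return np.bincount(idx, minlength=n_classes**2).reshape(n_classes, n_classes)

      CLASSES = ["agriculture", "forest", "savannah", "settlement", "water"]
      rng = np.random.default_rng(1)
      m1972 = rng.integers(0, 5, size=(100, 100))   # stand-ins for classified maps
      m2013 = rng.integers(0, 5, size=(100, 100))

      cm = change_matrix(m1972, m2013, len(CLASSES))
      print(cm)
      print("unchanged area fraction: %.2f" % (np.trace(cm) / cm.sum()))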

  4. Mapping trees outside forests using high-resolution aerial imagery: a comparison of pixel- and object-based classification approaches.

    PubMed

    Meneguzzo, Dacia M; Liknes, Greg C; Nelson, Mark D

    2013-08-01

    Discrete trees and small groups of trees in nonforest settings are considered an essential resource around the world and are collectively referred to as trees outside forests (ToF). ToF provide important functions across the landscape, such as protecting soil and water resources, providing wildlife habitat, and improving farmstead energy efficiency and aesthetics. Despite the significance of ToF, forest and other natural resource inventory programs and geospatial land cover datasets that are available at a national scale do not include comprehensive information regarding ToF in the United States. Additional ground-based data collection and acquisition of specialized imagery to inventory these resources are expensive alternatives. As a potential solution, we identified two remote sensing-based approaches that use free high-resolution aerial imagery from the National Agriculture Imagery Program (NAIP) to map all tree cover in an agriculturally dominant landscape. We compared the results obtained using an unsupervised per-pixel classifier (independent component analysis-[ICA]) and an object-based image analysis (OBIA) procedure in Steele County, Minnesota, USA. Three types of accuracy assessments were used to evaluate how each method performed in terms of: (1) producing a county-level estimate of total tree-covered area, (2) correctly locating tree cover on the ground, and (3) how tree cover patch metrics computed from the classified outputs compared to those delineated by a human photo interpreter. Both approaches were found to be viable for mapping tree cover over a broad spatial extent and could serve to supplement ground-based inventory data. The ICA approach produced an estimate of total tree cover more similar to the photo-interpreted result, but the output from the OBIA method was more realistic in terms of describing the actual observed spatial pattern of tree cover. PMID:23255169

  5. An object-based approach to delineate wetlands across landscapes of varied disturbance with high spatial resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Mui, Amy; He, Yuhong; Weng, Qihao

    2015-11-01

    Mapping wetlands across both natural and human-altered landscapes is important for the management of these ecosystems. Though they are considered important landscape elements providing both ecological and socioeconomic benefits, accurate wetland inventories do not exist in many areas. In this study, a multi-scale geographic object-based image analysis (GEOBIA) approach was employed to segment three high spatial resolution images acquired over landscapes of varying heterogeneity due to human disturbance, to determine the robustness of this method to changing scene variability. Multispectral layers, a digital elevation layer, a normalized-difference vegetation index (NDVI) layer, and a first-order texture layer were used to segment images across three segmentation scales, with a focus on accurate delineation of wetland boundaries and wetland components. Each ancillary input layer contributed to improving segmentation at different scales. Wetlands were classified using a nearest neighbor approach across a relatively undisturbed park site and an agricultural site using GeoEye1 imagery, and an urban site using WorldView2 data. Successful wetland classification was achieved across all study sites with an accuracy above 80%, though results suggest that overall a higher degree of landscape heterogeneity may negatively affect both segmentation and classification. The agricultural site suffered from the greatest amount of over- and under-segmentation and the lowest map accuracy (kappa: 0.78), which was partially attributed to confusion among a greater proportion of mixed vegetated classes from both wetlands and uplands. Accuracy of individual wetland classes based on the Canadian Wetland Classification system varied between each site, with kappa values ranging from 0.64 for the swamp class to 0.89 for the marsh class. This research developed a unique approach to mapping wetlands of various degrees of disturbance using GEOBIA, which can be applied to study other wetlands of similar

  7. A systematic comparison of different object-based classification techniques using high spatial resolution imagery in agricultural environments

    NASA Astrophysics Data System (ADS)

    Li, Manchun; Ma, Lei; Blaschke, Thomas; Cheng, Liang; Tiede, Dirk

    2016-07-01

    Geographic Object-Based Image Analysis (GEOBIA) is becoming more prevalent in remote sensing classification, especially for high-resolution imagery. Many supervised classification approaches are applied to objects rather than pixels, and several studies have been conducted to evaluate the performance of such supervised classification techniques in GEOBIA. However, these studies did not systematically investigate all relevant factors affecting the classification (segmentation scale, training set size, feature selection, and mixed objects). In this study, statistical methods and visual inspection were used to compare these factors systematically in two agricultural case studies in China. The results indicate that Random Forest (RF) and Support Vector Machines (SVM) are highly suitable for GEOBIA classifications in agricultural areas and confirm the expected general tendency, namely that the overall accuracies decline with increasing segmentation scale. All other investigated methods except for RF and SVM were more prone to lower accuracy due to broken objects at fine scales. In contrast to some previous studies, the RF classifier yielded the best results and the k-nearest neighbor classifier the worst, in most cases. Likewise, the RF and Decision Tree classifiers were the most robust with or without feature selection. The results of the training sample analyses indicated that RF and AdaBoost.M1 possess superior generalization capability, except when dealing with small training sample sizes. Furthermore, the classification accuracies were directly related to the homogeneity/heterogeneity of the segmented objects for all classifiers. Finally, it was suggested that RF should be considered in most cases for agricultural mapping.
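
    A minimal sketch of the kind of object-based classifier comparison described here, using scikit-learn. The per-object feature vectors (e.g., segment spectral means and textures) are assumed to have been extracted already; the arrays below are synthetic placeholders, not the study's data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 12))         # 500 segments x 12 object features
      y = rng.integers(0, 4, size=500)       # 4 crop classes, synthetic labels

      classifiers = {
          "RF":  RandomForestClassifier(n_estimators=200, random_state=0),
          "SVM": SVC(kernel="rbf", C=10.0, gamma="scale"),
      }
      for name, clf in classifiers.items():
          scores = cross_val_score(clf, X, y, cv=5)  # 5-fold overall accuracy
          print("%s: %.2f +/- %.2f" % (name, scores.mean(), scores.std()))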

  8. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, as is the case for rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches has been rapidly increasing. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, GeoEye and Worldview, are increasingly being considered the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state-of-the-art image analysis tools, calling for automated methods specifically suited to mapping landslide events on VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis and a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, sensed 3 days after a damaging landslide event in Madeira Island, together with a pre-event LiDAR DEM. Our approach proved successful in the recognition of landslides over a 15 km2 study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance (true positive rate above 60% and false positive rate below 36% in both validation regions) in the internal mapping of landslide source and transport areas, in particular in the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.

  9. Optimization of a lossless object-based compression embedded on GAIA, a next-generation space telescope

    NASA Astrophysics Data System (ADS)

    Oseret, Emmanuel; Timsit, Claude

    2007-09-01

    Until now, space telescopes like Hubble did not require strong data compression. In fact, images were captured on demand and their proximity to Earth gave them sufficient downlink bandwidth. Yet the next-generation space telescopes like GAIA (ESA) and the James Webb Space Telescope (JWST, ESA & NASA) will observe even wider sky fields at even higher resolutions. Moreover, they will be dramatically farther from Earth than Hubble (1.5 million kilometers versus 600 kilometers). This will imply a poor downlink bandwidth, and thus require fast, strong on-board data compression (ratios better than 1:200). To achieve GAIA's scientific objectives, a real-time "selectively lossless" compression is needed. With standard schemes, this is simply not possible today, even without time constraints (because of the entropy limit). This paper explains why the GAIA Compression, which is based on Object-Based Compression (OBC), is efficient for stellar images. Since the baseline implementation did not meet all the ESA requirements (compression speed and ratio), we have also brought our contribution to optimizing the GAIA Compression. It consists mainly in using (i) non-rectangular regions for large objects and (ii) inter-object differential predictive coding to improve the efficiency of the final lossless compression. We have tested our algorithms on the GAIA sky generator (GIBIS), which simulates flight-realistic conditions (CCD read noise, cosmic rays, etc.). Without any loss of signal, we have obtained promising ratios of up to 1:270 for the worst-case sky.
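
    The intuition behind inter-object differential predictive coding is that extracted stellar objects are highly similar, so storing one reference object plus small object-to-object residuals yields a byte stream that a lossless entropy coder compresses far better than the raw samples. A minimal, invertible sketch under that assumption (not the GAIA codec):

      import numpy as np
      import zlib

      def delta_encode(objects):
          """Encode a list of equally-sized object windows as the first object plus
          object-to-object residuals, then deflate the byte stream. Decoding
          reverses the steps: inflate, then cumulative-sum along axis 0."""
          stack = np.stack(objects).astype(np.int16)
          residuals = np.diff(stack, axis=0)          # inter-object prediction errors
          stream = np.concatenate([stack[:1], residuals]).tobytes()
          return zlib.compress(stream, 9)

      rng = np.random.default_rng(0)
      template = 1000 * np.abs(rng.normal(size=(8, 8)))  # shared stellar structure
      objects = [(template + rng.normal(scale=2, size=(8, 8))).astype(np.int16)
                 for _ in range(100)]
      raw = np.stack(objects).tobytes()
      print("ratio 1:%.1f" % (len(raw) / len(delta_encode(objects))))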

  10. Remedial Education of Embedded Software Specialists for Working People

    NASA Astrophysics Data System (ADS)

    Yamamoto, Masaki; Kawaguchi, Nobuo; Agusa, Kiyoshi; Mase, Kenji; Takada, Hiroaki; Tomiyama, Hiroyuki; Honda, Shinya; Kaneko, Nobuyuki

    Recently, the development of embedded software has been increasing in industry. The need for training embedded software engineers within companies is rising, and expectations of universities are high. We carry out embedded software engineering courses for working people at a university. The courses are short-term and are classified by job type and technological level. We have carried out eight kinds of courses, spanning introductory, intermediate, and advanced levels.

  11. Practical quality metrics for resolution enhancement software

    NASA Astrophysics Data System (ADS)

    Boone, Robert E.; Lucas, Kevin; Wynd, Raphael; Boatright, Mike; Thompson, Matthew A.; Reich, Alfred J.

    2003-06-01

    The past few years have seen an explosion in the application of software techniques to improve lithographic printing. Techniques such as optical proximity correction (OPC) and phase shift masks (PSM) increase resolution and CD control by distorting the mask pattern data from the original designed pattern. These software techniques are becoming increasingly complicated and non-intuitive, and the rate of complexity increase appears to be accelerating [1]. The benefits of these techniques in improving CD control and lowering cost of ownership (COO) are balanced against the effort required to implement them and the additional problems they create. One severe problem for users of immature and complex software tools and methodologies is quality control [2], as it ultimately becomes a COO problem. Software quality can be defined very simply as the ability of an application to meet detailed customer requirements. Software quality practice can be defined as the adherence to proven methods for planning, developing, testing and maintaining software. Although software quality for lithographic resolution enhancement is extremely important, the understanding and recognition of good software development practices among lithographers is generally poor. We therefore start by reviewing the essential terms and concepts of software quality that impact lithography and COO. We then propose methods by which semiconductor process and design engineers can estimate and compare the quality of the software tools and vendors they are evaluating or using. We include examples from advanced process technology resolution enhancement work that highlight the need for high-quality software practices, and show how to avoid many problems. Note that, although several authors have worked in software application development, our analysis here is somewhat of a black box analysis. The black box is the software development organization of an RET software supplier. Our access to actual developers within these

  12. Supporting Source Code Comprehension during Software Evolution and Maintenance

    ERIC Educational Resources Information Center

    Alhindawi, Nouh

    2013-01-01

    This dissertation addresses the problems of program comprehension to support the evolution of large-scale software systems. The research concerns how software engineers locate features and concepts along with categorizing changes within very large bodies of source code along with their versioned histories. More specifically, advanced Information…

  13. A strategic surety roadmap for high consequence software

    SciTech Connect

    Pollock, G.M.; Dalton, L.J.

    1995-12-31

    A strategic surety roadmap for high consequence software systems developed under the High Integrity Software (HIS) Program at Sandia National Laboratories is presented. Selected research tracks are identified and described detailing current technology and outlining advancements to be pursued over the coming decade to reach HIS goals.

  14. Designing Distributed Learning Environments with Intelligent Software Agents

    ERIC Educational Resources Information Center

    Lin, Fuhua, Ed.

    2005-01-01

    "Designing Distributed Learning Environments with Intelligent Software Agents" reports on the most recent advances in agent technologies for distributed learning. Chapters are devoted to the various aspects of intelligent software agents in distributed learning, including the methodological and technical issues on where and how intelligent agents…

  15. The Case for Open Source Software: The Interactional Discourse Lab

    ERIC Educational Resources Information Center

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  16. Communications Software for Microcomputers.

    ERIC Educational Resources Information Center

    Bruman, Janet L.

    Focusing on the use of microcomputers as "smart terminals" for accessing time-sharing systems for libraries, this document discusses the communications software needed to allow the microcomputer to appear as a terminal to the remote host. The functions which communications software programs are designed to perform are defined and explained,…

  17. Selecting the Right Software.

    ERIC Educational Resources Information Center

    Shearn, Joseph

    1987-01-01

    Selection of administrative software requires analyzing present needs and, to meet future needs, choosing software that will function with a more powerful computer system. Other important factors to include are a professional system demonstration, maintenance and training, and financial considerations that allow leasing or renting alternatives.…

  18. UWB Tracking Software Development

    NASA Technical Reports Server (NTRS)

    Gross, Julia; Arndt, Dickey; Ngo, Phong; Phan, Chau; Dusl, John; Ni, Jianjun; Rafford, Melinda

    2006-01-01

    An Ultra-Wideband (UWB) two-cluster Angle of Arrival (AOA) tracking prototype system is currently being developed and tested at NASA Johnson Space Center for space exploration applications. This talk discusses the software development efforts for this UWB two-cluster AOA tracking system. The role the software plays in this system is to take waveform data from two UWB radio receivers as an input, feed this input into an AOA tracking algorithm, and generate the target position as an output. The architecture of the software (Input/Output Interface and Algorithm Core) will be introduced in this talk. The development of this software has three phases. In Phase I, the software is mostly Matlab driven and calls C++ socket functions to provide the communication links to the radios. This is beneficial in the early stage when it is necessary to frequently test changes in the algorithm. Phase II of the development is to have the software mostly C++ driven and call a Matlab function for the AOA tracking algorithm. This is beneficial in order to send the tracking results to other systems and also to improve the tracking update rate of the system. The third phase is part of future work and is to have the software completely C++ driven with a graphical user interface. This software design enables the fine resolution tracking of the UWB two-cluster AOA tracking system.

  19. A User's Software Dilemma.

    ERIC Educational Resources Information Center

    Splittgerber, Fred; Stirzaker, N. A.

    1989-01-01

    Discusses several issues associated with purchasing computer software packages: (1) continual updates; (2) lack of industrial standards for software development; and (3) expense. Many packages fail to provide technical assistance from a local dealer or the package developer. Without standards, costs to business, education, and the general public…

  20. Plating Tank Control Software

    1998-03-01

    The Plating Tank Control Software is a graphical user interface that controls and records plating process conditions for plating in high-aspect-ratio channels that require the use of low current and long times. The software is written for a Pentium II PC with an 8-channel data acquisition card and the necessary shunt resistors for measuring currents in the milliampere range.

  1. Software process assessments

    NASA Technical Reports Server (NTRS)

    Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.

    1992-01-01

    Software process assessments (SPA's) are part of an ongoing program of continuous quality improvements in AT&T. Their use was found to be very beneficial by software development organizations in identifying the issues facing the organization and the actions required to increase both quality and productivity in the organization.

  2. Reusable Software Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Timothy E.

    1995-01-01

    The objective of the Reusable Software System (RSS) is to provide NASA Langley Research Center and its contractor personnel with a reusable software technology through the Internet. The RSS is easily accessible, provides information that is extractable, and offers the capability to submit information or data for the purpose of scientific research at NASA Langley Research Center within the Atmospheric Science Division.

  3. Cartographic applications software

    USGS Publications Warehouse

    U.S. Geological Survey

    1992-01-01

    The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.

  4. Learning from Software Localization.

    ERIC Educational Resources Information Center

    Guo, She-Sen

    2003-01-01

    Localization is the process of adapting a product to meet the language, cultural and other requirements of a specific target environment or market. This article describes ways in which software localization impacts upon curriculum, and discusses what students will learn from software localization. (AEF)

  5. Measuring software technology

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Card, D. N.; Church, V. E.; Page, G.; Mcgarry, F. E.

    1983-01-01

    Results are reported from a series of investigations into the effectiveness of various methods and tools used in a software production environment. The basis for the analysis is a project data base, built through extensive data collection and process instrumentation. The project profiles become an organizational memory, serving as a reference point for an active program of measurement and experimentation on software technology.

  6. Software engineering ethics

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  7. Software measurement guidebook

    NASA Technical Reports Server (NTRS)

    Bassman, Mitchell J.; Mcgarry, Frank; Pajerski, Rose

    1994-01-01

    This Software Measurement Guidebook presents information on the purpose and importance of measurement. It discusses the specific procedures and activities of a measurement program and the roles of the people involved. The guidebook also clarifies the roles that measurement can and must play in the goal of continual, sustained improvement for all software production and maintenance efforts.

  8. Cactus: Software Priorities

    ERIC Educational Resources Information Center

    Hyde, Hartley

    2009-01-01

    The early eighties saw a period of rapid change in computing and teachers lost control of how they used computers in their classrooms. Software companies produced computer tools that looked so good that teachers forgot about writing their own classroom materials and happily purchased software--that offered much more than teachers needed--from…

  9. Fastbus software progress

    SciTech Connect

    Gustavson, D.B.

    1982-01-01

    The current status of the Fastbus software development program of the Fastbus Software Working Group is reported, and future plans are discussed. A package of Fastbus interface subroutines has been prepared as a proposed standard, language support for diagnostics and bench testing has been developed, and new documentation to help users find these resources and use them effectively is being written.

  10. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  11. ITOUGH2 software qualification

    SciTech Connect

    Finsterle, S.; Pruess, K.; Fraser, P.

    1996-10-01

    The purpose of this report is to provide all software baseline documents necessary for the software qualification of ITOUGH2. ITOUGH2 is a computer program providing inverse modeling capabilities for TOUGH2. TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media.

  12. Measuring software design

    NASA Technical Reports Server (NTRS)

    1986-01-01

    An extensive series of studies of software design measures conducted by the Software Engineering Laboratory is described. Included are the objectives and results of the studies, the method used to perform the studies, and the problems encountered. The document should be useful to researchers planning similar studies as well as to managers and designers concerned with applying quantitative design measures.

  13. PERVAPORATION PERFORMANCE PREDICTION SOFTWARE

    EPA Science Inventory

    The Pervaporation, Performance, Prediction Software and Database (PPPS&D) computer software program is currently being developed within the USEPA, NRMRL. The purpose of the PPPS&D program is to educate and assist potential users in identifying opportunities for using pervaporati...

  14. Software Solution Saves Dollars

    ERIC Educational Resources Information Center

    Trotter, Andrew

    2004-01-01

    This article discusses computer software that can give classrooms and computer labs the capabilities of costly PC's at a small fraction of the cost. A growing number of cost-conscious school districts are finding budget relief in low-cost computer software known as "open source" that can do everything from manage school Web sites to equip…

  15. Software quality in 1997

    SciTech Connect

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD) the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  16. NASA Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda

    1997-01-01

    If software is a critical element in a safety critical system, it is imperative to implement a systematic approach to software safety as an integral part of the overall system safety programs. The NASA-STD-8719.13A, "NASA Software Safety Standard", describes the activities necessary to ensure that safety is designed into software that is acquired or developed by NASA, and that safety is maintained throughout the software life cycle. A PDF version is available on the WWW from Lewis. A Guidebook that will assist in the implementation of the requirements in the Safety Standard is under development at the Lewis Research Center (LeRC). After completion, it will also be available on the WWW from Lewis.

  17. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  18. Tracker 300 Software

    2006-01-12

    The Tracker300 software is downloaded to an off-the-shelf product called the RCM3400/RCM3410, made by Rabbit Semiconductor. The software is a closed-loop control that computes the sun's position and provides stability compensation. Using the RCM3400/RCM3410 module, the software stores and retrieves parameters from the onboard flash. The software also allows for communication with a host. It will allow the parameters to be downloaded or uploaded, it will show the status of the controller, it will provide real-time feedback, and it will send command acknowledgements. The software will capture the GPS response and ensure the internal clock is set correctly.
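
    The record does not state which sun-position algorithm the controller uses. A common low-precision approach, sketched below, derives solar elevation and azimuth from the solar declination and hour angle; these are standard textbook approximations (ignoring, e.g., the equation of time and refraction), not the Tracker300's actual code.

      import math

      def sun_position(day_of_year, hour_utc, lat_deg, lon_deg):
          """Approximate solar elevation and azimuth (degrees) from standard
          low-precision formulas; adequate for illustration, not for flight code."""
          decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
          solar_time = hour_utc + lon_deg / 15.0           # ignores equation of time
          hour_angle = math.radians(15.0 * (solar_time - 12.0))
          lat = math.radians(lat_deg)
          sin_el = (math.sin(lat) * math.sin(decl)
                    + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
          el = math.asin(sin_el)
          # Azimuth measured clockwise from north.
          az = math.atan2(-math.sin(hour_angle) * math.cos(decl),
                          math.sin(decl) * math.cos(lat)
                          - math.cos(decl) * math.sin(lat) * math.cos(hour_angle))
          return math.degrees(el), (math.degrees(az) + 360.0) % 360.0

      # Example: noon UTC at Greenwich on day 172 (near the summer solstice).
      print(sun_position(172, 12.0, 51.48, 0.0))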

  19. Scientific Software Component Technology

    SciTech Connect

    Kohn, S.; Dykman, N.; Kumfert, G.; Smolinski, B.

    2000-02-16

    We are developing new software component technology for high-performance parallel scientific computing to address issues of complexity, re-use, and interoperability for laboratory software. Component technology enables cross-project code re-use, reduces software development costs, and provides additional simulation capabilities for massively parallel laboratory application codes. The success of our approach will be measured by its impact on DOE mathematical and scientific software efforts. Thus, we are collaborating closely with library developers and application scientists in the Common Component Architecture forum, the Equation Solver Interface forum, and other DOE mathematical software groups to gather requirements, write and adopt a variety of design specifications, and develop demonstration projects to validate our approach. Numerical simulation is essential to the science mission at the laboratory. However, it is becoming increasingly difficult to manage the complexity of modern simulation software. Computational scientists develop complex, three-dimensional, massively parallel, full-physics simulations that require the integration of diverse software packages written by outside development teams. Currently, the integration of a new software package, such as a new linear solver library, can require several months of effort. Current industry component technologies such as CORBA, JavaBeans, and COM have all been used successfully in the business domain to reduce software development costs and increase software quality. However, these existing industry component infrastructures will not scale to support massively parallel applications in science and engineering. In particular, they do not address issues related to high-performance parallel computing on ASCI-class machines, such as fast in-process connections between components, language interoperability for scientific languages such as Fortran, parallel data redistribution between components, and massively

  20. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    NASA Technical Reports Server (NTRS)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open, interoperable software development and software reuse.

  1. The environmental control and life support system advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with the integration of new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing key personnel with its usage.

  2. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    PubMed Central

    Rutzinger, Martin; Höfle, Bernhard; Hollaus, Markus; Pfeifer, Norbert

    2008-01-01

    Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m2) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original measurements
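
    A minimal sketch of the seeded region growing step described above: echoes are sorted by descending surface roughness to serve as seeds, and segments grow by absorbing neighbours whose echo width stays close to the running segment mean. The coordinates, widths, and roughness values below are synthetic, and the search radius and homogeneity threshold are arbitrary assumptions.

      import numpy as np
      from scipy.spatial import cKDTree

      def region_growing(xyz, echo_width, roughness, radius=1.0, max_dev=0.5):
          """Grow segments from high-roughness seeds, adding neighbours whose echo
          width stays within max_dev of the running segment mean."""
          order = np.argsort(-roughness)             # seeds: descending roughness
          labels = np.full(len(xyz), -1, dtype=int)
          tree = cKDTree(xyz)
          seg = 0
          for seed in order:
              if labels[seed] != -1:
                  continue
              labels[seed] = seg
              mean_w, n, queue = echo_width[seed], 1, [seed]
              while queue:
                  pt = queue.pop()
                  for nb in tree.query_ball_point(xyz[pt], radius):
                      if labels[nb] == -1 and abs(echo_width[nb] - mean_w) <= max_dev:
                          labels[nb] = seg
                          mean_w = (mean_w * n + echo_width[nb]) / (n + 1)
                          n += 1
                          queue.append(nb)
              seg += 1
          return labels

      rng = np.random.default_rng(0)
      xyz = rng.uniform(0, 10, size=(2000, 3))       # synthetic echo coordinates
      width = rng.normal(4.0, 1.0, size=2000)        # synthetic echo widths (ns)
      rough = rng.uniform(size=2000)                 # synthetic roughness values
      print("segments:", region_growing(xyz, width, rough).max() + 1)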

  3. 3D-Modeling of deformed halite hopper crystals: Object based image analysis and support vector machine, a first evaluation

    NASA Astrophysics Data System (ADS)

    Leitner, Christoph; Hofmann, Peter; Marschallinger, Robert

    2014-05-01

    Halite hopper crystals are thought to develop by displacive growth in unconsolidated mud (Gornitz & Schreiber, 1981). The Alpine Haselgebirge, but also e.g. the salt deposits of the Rhine graben (mined at the beginning of the 20th century), comprise hopper crystals with shapes of cuboids, parallelepipeds and rhombohedrons (Görgey, 1912). Obviously, they deformed under oriented stress, which earlier work attempted to reconstruct with respect to the sedimentary layering (Leitner et al., 2013). In the present work, deformed halite hopper crystals embedded in mudrock were reconstructed automatically. Object-based image analysis (OBIA) has been used successfully for 2D images in remote sensing before; the present study represents the first time the method was used for the reconstruction of three-dimensional geological objects. First, a reference (gold standard) was created manually by redrawing the contours of the halite crystals on each HRXCT scanning slice. Then, for OBIA, the computer program eCognition was used, and a rule set was developed for the automated reconstruction. The strength of OBIA was to recognize all objects similar to halite hopper crystals and, in particular, to eliminate cracks. In a second step, all objects unsuitable for a structural deformation analysis (clusters, polyhalite-coated crystals and spherical halites) were dismissed using a support vector machine (SVM). The SVM simultaneously drastically reduced the number of halites: of 184 OBIA objects, 67 well-shaped ones remained, which comes close to the 52 manually pre-selected objects. To assess the accuracy of the automated reconstruction, the results before and after the SVM step were compared to the reference, i.e. the gold standard. State-of-the-art per-scene statistics were extended to per-object statistics. Görgey R (1912) Zur Kenntnis der Kalisalzlager von Wittelsheim im Ober-Elsaß. Tschermaks Mineral Petrogr Mitt 31:339-468 Gornitz VM, Schreiber BC (1981) Displacive halite hoppers from the Dead Sea

  4. Building Scientific Workflows for the Geosciences with Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream data processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  5. Software Measurement Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This Software Measurement Guidebook is based on the extensive experience of several organizations that have each developed and applied significant measurement programs over a period of at least 10 years. The lessons derived from those experiences reflect not only successes but also failures. By applying those lessons, an organization can minimize, or at least reduce, the time, effort, and frustration of introducing a software measurement program. The Software Measurement Guidebook is aimed at helping organizations to begin or improve a measurement program. It does not provide guidance for the extensive application of specific measures (such as how to estimate software cost or analyze software complexity) other than by providing examples to clarify points. It does contain advice for establishing and using an effective software measurement program and for understanding some of the key lessons that other organizations have learned. Some of that advice will appear counterintuitive, but it is all based on actual experience. Although all of the information presented in this guidebook is derived from specific experiences of mature measurement programs, the reader must keep in mind that the characteristics of every organization are unique. Some degree of measurement is critical for all software development and maintenance organizations, and most of the key rules captured in this report will be generally applicable. Nevertheless, each organization must strive to understand its own environment so that the measurement program can be tailored to suit its characteristics and needs.

  6. CASE: Software design technologies

    SciTech Connect

    Kalyanov, G.N.

    1994-05-01

    CASE (Computer-Aided Software Engineering) is a set of methodologies for software design, development, and maintenance supported by a complex of interconnected automation tools. CASE is a set of tools for the programmer, analyst, and developer for the automation of software design and development. Today, CASE has become an independent discipline in software engineering that has given rise to a powerful CASE industry made up of hundreds of firms and companies of various kinds. They include companies that develop tools for software analysis and design and have a wide network of distributors and dealers, firms that develop specialized tools for narrow subject areas or for individual stages of the software life cycle, firms that organize seminars and courses for specialists, consulting firms, which demonstrate the practical power of CASE toolkits for specific applications, and companies specializing in the publication of periodicals and bulletins on CASE. The principal purchasers of CASE toolkits abroad are military organizations, data-processing centers, and commercial software developers.

  7. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  8. Object based image analysis for the classification of the growth stages of Avocado crop, in Michoacán State, Mexico

    NASA Astrophysics Data System (ADS)

    Gao, Yan; Marpu, Prashanth; Morales Manila, Luis M.

    2014-11-01

    This paper assesses the suitability of 8-band WorldView-2 (WV2) satellite data and an object-based random forest algorithm for the classification of avocado growth stages in Mexico. We tested both pixel-based classification with minimum distance (MD) and maximum likelihood (MLC) classifiers and object-based classification with the Random Forest (RF) algorithm for this task. Training samples and verification data were selected by visually interpreting the WV2 images for seven thematic classes: fully grown, middle-stage, and early-stage avocado crops, bare land, two types of natural forest, and water body. To examine the contribution of the four new spectral bands of the WV2 sensor, all tested classifications were carried out with and without those bands. Classification accuracy assessment shows that object-based classification with the RF algorithm obtained higher overall accuracy (93.06%) than the pixel-based MD (69.37%) and MLC (64.03%) methods. For both pixel-based and object-based methods, the classifications with the four new spectral bands obtained higher accuracy than those without (object-based RF: 93.06% vs. 83.59%; pixel-based MD: 69.37% vs. 67.2%; pixel-based MLC: 64.03% vs. 36.05%), suggesting that the four new spectral bands of the WV2 sensor contributed to the increase in classification accuracy.
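
    The band-contribution comparison in the record (RF with and without the four WV2-only bands) can be reproduced in outline. The sketch below is a minimal, assumption-laden illustration with scikit-learn: the per-object mean spectra are synthetic, and the "without" run simply drops the four new bands (coastal, yellow, red-edge, NIR2) and retrains.

        # Sketch: object-based Random Forest classification with and without
        # the four WV2-only bands. All data here is synthetic, not the study's.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_objects, n_classes = 600, 7          # seven thematic classes in the study
        X8 = rng.normal(size=(n_objects, 8))   # mean reflectance per object, 8 bands
        y = rng.integers(0, n_classes, n_objects)

        # WV2's four "new" bands are conventionally coastal(0), yellow(3),
        # red-edge(5), NIR2(7); the remaining four mimic a legacy 4-band sensor.
        legacy_bands = [1, 2, 4, 6]

        for name, X in [("8-band", X8), ("4-band", X8[:, legacy_bands])]:
            Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
            rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
            print(name, "overall accuracy:", accuracy_score(yte, rf.predict(Xte)))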

  9. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools, but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, depend on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system, written in the package specification language, and produces an integration program in the form of a makefile. If complex integration tools, such as remote procedure call stubs, are needed to integrate a set of components, the packager automatically infers their use and invokes the stub generation tools in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
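
    The record does not show the package specification language itself, so any concrete syntax is a guess. The toy sketch below invents a spec (program name, components, sources) and emits a makefile fragment, purely to make the "specification in, makefile out" flow concrete; the real packager's RPC stub-generation logic is not reproduced.

        # Toy spec -> makefile generator. The spec format and helper names
        # are hypothetical, not the packager's actual specification language.
        spec = {
            "program": "demo",
            "components": [
                {"name": "parser", "sources": ["parser.c"]},
                {"name": "solver", "sources": ["solver.c"]},
            ],
        }

        def emit_makefile(spec):
            objs = " ".join(src.replace(".c", ".o")
                            for comp in spec["components"]
                            for src in comp["sources"])
            lines = [f"{spec['program']}: {objs}",
                     f"\t$(CC) -o {spec['program']} {objs}", ""]
            for comp in spec["components"]:
                for src in comp["sources"]:
                    obj = src.replace(".c", ".o")
                    lines += [f"{obj}: {src}", f"\t$(CC) -c {src}", ""]
            return "\n".join(lines)

        print(emit_makefile(spec))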

  10. Ascent/Descent Software

    NASA Technical Reports Server (NTRS)

    Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben

    2012-01-01

    The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing the Ascent Flight Design (ASC)/Descent Flight Design (DESC) configuration items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in the Navigation (NAV)/Prop Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.

  11. Error Free Software

    NASA Technical Reports Server (NTRS)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  12. CNEOST Control Software System

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Zhao, Hai-bin; Xia, Yan; Lu, Hao; Li, Bin

    2016-01-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for the new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a messaging mechanism based on the WebSocket protocol and possesses good flexibility and extensibility. The user interface, based on responsive web design, enables remote observation from both desktop and mobile devices. The stable operation of the software system has greatly enhanced operating efficiency while reducing complexity, and has also served as a successful trial for the future system design of telescopes and the telescope cloud.
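
    The record says only that control traffic flows over WebSocket messages; the message schema, port, and command names below are assumptions. A minimal sketch of such a control endpoint with the Python websockets package (>= 10.1, where one-argument handlers are accepted):

        # Minimal WebSocket command endpoint in the spirit of the described
        # messaging mechanism. JSON schema, port, and commands are invented.
        import asyncio
        import json
        import websockets  # pip install websockets

        async def handle(ws):
            async for raw in ws:
                msg = json.loads(raw)
                if msg.get("cmd") == "status":   # hypothetical command
                    await ws.send(json.dumps({"camera": "idle", "dome": "closed"}))
                else:
                    await ws.send(json.dumps({"error": "unknown command"}))

        async def main():
            async with websockets.serve(handle, "0.0.0.0", 8765):
                await asyncio.Future()  # serve until cancelled

        if __name__ == "__main__":
            asyncio.run(main())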

  13. CNEOST Control Software System

    NASA Astrophysics Data System (ADS)

    Wang, X.; Zhao, H. B.; Xia, Y.; Lu, H.; Li, B.

    2015-03-01

    In 2013, CNEOST (China Near Earth Object Survey Telescope) adapted its hardware system for the new CCD camera. Based on the new system architecture, the control software was re-designed and implemented. The software system adopts a message-passing mechanism via the WebSocket protocol, improving its flexibility, extensibility, and scalability. The user interface, built with responsive web design, realizes remote operation from both desktop and mobile devices. The stable operation of the software system has greatly enhanced operating efficiency while reducing complexity, and has also served as a successful trial for the future system design of telescopes and the telescope cloud.

  14. Multiphase flow calculation software

    DOEpatents

    Fincke, James R.

    2003-04-15

    Multiphase flow calculation software and computer-readable media carrying computer executable instructions for calculating liquid and gas phase mass flow rates of high void fraction multiphase flows. The multiphase flow calculation software employs various given, or experimentally determined, parameters in conjunction with a plurality of pressure differentials of a multiphase flow, preferably supplied by a differential pressure flowmeter or the like, to determine liquid and gas phase mass flow rates of the high void fraction multiphase flows. Embodiments of the multiphase flow calculation software are suitable for use in a variety of applications, including real-time management and control of an object system.
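
    The patented method itself is not spelled out in the record; as a stand-in, the sketch below applies a textbook orifice-type relation to a measured pressure differential and splits the total mass flow by an assumed void fraction. It illustrates the kind of computation described, not the patent's algorithm, and every parameter value is an illustrative assumption.

        # Toy two-phase mass-flow estimate from a differential pressure reading.
        # Classic orifice relation: m_dot = C_d * A * sqrt(2 * rho_mix * dP).
        from math import pi, sqrt

        def mass_flow_rates(dp_pa, void_fraction, c_d=0.61, bore_m=0.05,
                            rho_gas=20.0, rho_liq=850.0):
            """Split a total orifice mass flow into gas and liquid parts."""
            area = pi * (bore_m / 2) ** 2
            rho_mix = void_fraction * rho_gas + (1 - void_fraction) * rho_liq
            m_total = c_d * area * sqrt(2 * rho_mix * dp_pa)        # kg/s
            m_gas = m_total * (void_fraction * rho_gas) / rho_mix   # gas mass share
            return m_gas, m_total - m_gas

        gas, liquid = mass_flow_rates(dp_pa=25_000.0, void_fraction=0.9)
        print(f"gas: {gas:.3f} kg/s, liquid: {liquid:.3f} kg/s")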

  15. Towards a software profession

    NASA Technical Reports Server (NTRS)

    Berard, Edward V.

    1986-01-01

    An increasing number of programmers have attempted to change their image. They have made it plain that they wish not only to be taken seriously but also to be regarded as professionals. Many programmers now wish to be referred to as software engineers. If programmers wish to be considered professionals in every sense of the word, two obstacles must be overcome: the inability to think of software as a product, and the idea that little or no skill is required to create and handle software throughout its life cycle. The steps to be taken toward professionalization are outlined, along with recommendations.

  16. Software quality assurance handbook

    SciTech Connect

    Not Available

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  17. Speakeasy software development

    NASA Astrophysics Data System (ADS)

    Baskinger, Patricia J.; Ozarow, Larry; Chruscicki, Mary C.

    1993-08-01

    The Speakeasy Software Development Project had three primary objectives. The first objective was to perform Independent Verification and Validation (IV & V) of the software and documentation associated with the signal processor being developed by Hazeltine and TRW under the Speakeasy program. The IV & V task also included an analysis and assessment of the ability of the signal processor software to provide LPI communications functions. The second objective was to assist in the enhancement and modification of an existing Rome Lab signal processor workstation. Finally, TASC developed project management support tools and provided program management support to the Speakeasy Program Office.

  18. Orbit Software Suite

    NASA Technical Reports Server (NTRS)

    Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim

    2012-01-01

    Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Integrated Planning System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in the Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.

  19. Object-Based Image Classification of Floating Ice Used as Habitat for Harbor Seals in a Tidewater Glacier Fjord in Alaska

    NASA Astrophysics Data System (ADS)

    McNabb, R. W.; Womble, J. N.; Prakash, A.; Gens, R.; Ver Hoef, J.

    2014-12-01

    Tidewater glaciers play an important role in many landscape and ecosystem processes in fjords, terminating in the sea and calving icebergs and discharging meltwater directly into the ocean. Tidewater glaciers provide floating ice for use as habitat for harbor seals (Phoca vitulina richardii) for resting, pupping, nursing, molting, and avoiding predators. Tidewater glaciers are found in high concentrations in Southeast and Southcentral Alaska; currently, many of these glaciers are retreating or have stabilized in a retracted state, raising questions about the future availability of ice in these fjords as habitat for seals. Our primary objective is to investigate the relationship between harbor seal distribution and ice availability at an advancing tidewater glacier in Johns Hopkins Inlet, Glacier Bay National Park, Alaska. To this end, we use a combination of visible and infrared aerial photographs, object-based image analysis (OBIA), and statistical modeling techniques. We have developed a workflow to automate the processing of the imagery and the classification of the fjordscape (e.g., individual icebergs, brash ice, and open water), providing quantitative information on ice coverage as well as properties not typically found in traditional pixel-based classification techniques, such as block angularity and seal density across the fjord. Reflectance variation in the red channel of the optical images has proven to be the most important first-level criterion to separate open water from floating ice. This first-level criterion works well in areas without dense brash ice, but tends to misclassify dense brash ice as single icebergs. Isolating these large misclassified regions and applying a higher reflectance threshold as a second-level criterion helps to isolate individual ice blocks surrounded by dense brash ice. We present classification results from surveys taken during June and August, 2007-2013, as well as preliminary results from statistical modeling of the
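
    The two-level threshold logic described (a red-channel reflectance cut separating water from ice, then a higher cut inside large dense-brash regions to isolate individual blocks) can be sketched with numpy and scikit-image; the threshold values, region-size cutoff, and array names below are assumptions, not the study's calibrated rule set.

        # Sketch of the described two-level red-channel classification.
        # Thresholds and the size cutoff are illustrative assumptions.
        import numpy as np
        from skimage import measure

        def classify_fjordscape(red, t1=0.35, t2=0.60, big_region_px=5000):
            """red: 2D float array of red-channel reflectance in [0, 1]."""
            ice = red > t1                    # first-level cut: ice vs. open water
            labels = measure.label(ice)
            refined = ice.copy()
            for region in measure.regionprops(labels):
                if region.area >= big_region_px:   # likely dense brash ice
                    rr, cc = region.coords[:, 0], region.coords[:, 1]
                    # second-level, higher threshold isolates individual blocks
                    refined[rr, cc] = red[rr, cc] > t2
            return refined

        red = np.random.rand(512, 512)   # placeholder for an aerial photo band
        ice_mask = classify_fjordscape(red)
        print("ice fraction:", ice_mask.mean())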

  20. ACS: ALMA Common Software

    NASA Astrophysics Data System (ADS)

    Chiozzi, Gianluca; Šekoranja, Matej

    2013-02-01

    ALMA Common Software (ACS) provides a software infrastructure common to all ALMA partners and consists of a documented collection of common patterns and components which implement those patterns. The heart of ACS is based on a distributed Component-Container model, with ACS Components implemented as CORBA objects in any of the supported programming languages. ACS provides common CORBA-based services such as logging, error and alarm management, configuration database, and lifecycle management. Although designed for ALMA, ACS can be, and is being, used in other control systems and distributed software projects, since it implements proven design patterns using state-of-the-art, reliable technology. Through the use of well-known standard constructs and components, it also allows team members who are not authors of ACS to understand the architecture of software modules easily, making maintenance affordable even on a very large project.
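
    ACS's actual Component interfaces are CORBA IDL definitions not reproduced in the record; the sketch below shows only the general Component-Container idea in plain Python (the container owns component lifecycle and hands components shared services such as logging), under invented names rather than the ACS API.

        # Generic Component-Container lifecycle sketch; names are invented
        # and this is not the ACS API, which is defined in CORBA IDL.
        import logging

        class Component:
            def __init__(self, name):
                self.name = name
            def initialize(self, services):      # container injects services
                self.log = services["logger"]
            def execute(self):
                self.log.info("%s running", self.name)
            def cleanup(self):
                self.log.info("%s cleaned up", self.name)

        class Container:
            """Owns component lifecycle and provides common services."""
            def __init__(self):
                logging.basicConfig(level=logging.INFO)
                self.services = {"logger": logging.getLogger("container")}
                self.components = []
            def activate(self, comp):
                comp.initialize(self.services)
                self.components.append(comp)
                return comp
            def shutdown(self):
                for comp in reversed(self.components):
                    comp.cleanup()

        container = Container()
        container.activate(Component("antenna_mount")).execute()
        container.shutdown()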

  1. Software Design Analyzer System

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    The CRISP80 software design analyzer system is a set of programs that supports top-down, hierarchic, modular, structured design and programming methodologies. CRISP80 allows for the expression of a design as a picture of the program.

  2. Software interface verifier

    NASA Technical Reports Server (NTRS)

    Soderstrom, Tomas J.; Krall, Laura A.; Hope, Sharon A.; Zupke, Brian S.

    1994-01-01

    A Telos study of 40 recent subsystem deliveries into the DSN at JPL found software interface testing to be the single most expensive and error-prone activity, and the study team suggested creating an automated software interface test tool. The resulting Software Interface Verifier (SIV), which was funded by NASA/JPL and created by Telos, employed 92 percent software reuse to quickly create an initial version which incorporated early user feedback. SIV is now successfully used by developers for interface prototyping and unit testing, by test engineers for formal testing, and by end users for non-intrusive data flow tests in the operational environment. Metrics, including cost, are included. Lessons learned include the need for early user training. SIV is ported to many platforms and can be successfully used or tailored by other NASA groups.

  3. HOMER® Energy Modeling Software

    2000-12-31

    The HOMER® energy modeling software is a tool for designing and analyzing hybrid power systems, which contain a mix of conventional generators, cogeneration, wind turbines, solar photovoltaics, hydropower, batteries, fuel cells, biomass, and other inputs.

  4. Writing Instructional Software.

    ERIC Educational Resources Information Center

    Lorenz, Marian; Moose, Allan

    1983-01-01

    Discusses the main categories of instructional software, including drill/practice, tutorials, simulation/problem solving, games, and management, along with factors involved in their design. (Author/MBR)

  5. Software Solutions for ICME

    NASA Astrophysics Data System (ADS)

    Schmitz, G. J.; Engstrom, A.; Bernhardt, R.; Prahl, U.; Adam, L.; Seyfarth, J.; Apel, M.; de Saracibar, C. Agelet; Korzhavyi, P.; Ågren, J.; Patzak, B.

    2016-01-01

    The Integrated Computational Materials Engineering expert group (ICMEg), a coordination activity of the European Commission, aims at developing a global and open standard for information exchange between the heterogeneous varieties of numerous simulation tools. The ICMEg consortium coordinates respective developments by a strategy of networking stakeholders in the first International Workshop on Software Solutions for ICME, compiling identified and relevant software tools into the Handbook of Software Solutions for ICME, discussing strategies for interoperability between different software tools during a second (planned) international workshop, and eventually proposing a scheme for standardized information exchange in a future book or document. The present article summarizes these respective actions to provide the ICME community with some additional insights and resources from which to help move this field forward.

  6. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation, and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  7. Economics of software utilization

    SciTech Connect

    Sidorov, N.A.

    1995-01-01

    The application of the reuse principle to software (the use of methods, concepts, or system components in a context or situation different from that originally envisaged in the development phase) requires solving many problems of a technical, economic, organizational, and legal nature. At present, it is the technical problems of reuse that are receiving the greater attention; the economic aspects of reuse, which are the subject of this paper, are only beginning to be studied. In our analysis, an integrated approach to the economics of software reuse suggests three models that can be applied to examine reusability. Section 1 characterizes the application of the reuse principle in software systems. Section 2 identifies the factors which are relevant for reuse. Section 3 briefly describes the main processes of reuse. Section 4 presents the metrics for the evaluation of reuse models. Section 5 examines the reuse models, and Section 6 presents some recommendations for reducing the development costs of reusable software.

  8. Error-Free Software

    NASA Technical Reports Server (NTRS)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultrareliable models, simulations, and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking, and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete, and has no data or control flow errors. Systems can be designed, developed, and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  9. Astronomical Software Directory Service

    NASA Technical Reports Server (NTRS)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and as a research project to investigate various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as

  10. LIGA Scanner Control Software

    1999-02-01

    The LIGA Scanner Software is a graphical user interface package that facilitates controlling the scanning operation of x-rays from a synchrotron and sample manipulation for making LIGA parts. The process requires scanning of the LIGA mask and the PMMA resist through a stationary x-ray beam to provide an evenly distributed x-ray exposure over the wafer. This software package has been written specifically to interface with Aerotech motor controllers.

  11. Public Key FPGA Software

    SciTech Connect

    Hymel, Ross

    2013-07-25

    The Public Key (PK) FPGA software performs asymmetric authentication using the 163-bit Elliptic Curve Digital Signature Algorithm (ECDSA) on an embedded FPGA platform. A digital signature is created on user-supplied data, and communication with a host system is performed via a Serial Peripheral Interface (SPI) bus. The software includes all components necessary for signing, including a custom random number generator for key creation and SHA-256 for data hashing.
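
    The record's FPGA implementation signs over a 163-bit curve with SHA-256 hashing and talks to the host over SPI; none of that hardware interface is modeled here. As a minimal software analogue, assuming a version of the Python cryptography package that still ships the binary sect163k1 curve, signing and verification look roughly like this:

        # Software sketch of ECDSA sign/verify on sect163k1 with SHA-256.
        # Assumes a cryptography release that still provides this binary
        # curve; the FPGA's SPI framing and custom RNG are not modeled.
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import ec

        private_key = ec.generate_private_key(ec.SECT163K1())
        data = b"user-supplied data to authenticate"

        signature = private_key.sign(data, ec.ECDSA(hashes.SHA256()))

        try:
            private_key.public_key().verify(signature, data,
                                            ec.ECDSA(hashes.SHA256()))
            print("signature valid")
        except InvalidSignature:
            print("signature INVALID")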

  12. Image Processing Software

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Ames digital image velocimetry technology has been incorporated in a commercially available image processing software package that allows motion measurement of images on a PC alone. The software, manufactured by Werner Frei Associates, is IMAGELAB FFT. IMAGELAB FFT is a general purpose image processing system with a variety of other applications, among them image enhancement of fingerprints and use by banks and law enforcement agencies for analysis of videos run during robberies.

  13. Engineering and Software Engineering

    NASA Astrophysics Data System (ADS)

    Jackson, Michael

    The phrase ‘software engineering’ has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering’ still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  14. Proceedings of the Eighteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The workshop provided a forum for software practitioners from around the world to exchange information on the measurement, use, and evaluation of software methods, models, and tools. This year, approximately 450 people attended the workshop, which consisted of six sessions on the following topics: the Software Engineering Laboratory, measurement, technology assessment, advanced concepts, process, and software engineering issues in NASA. Three presentations were given in each of the topic areas. The content of those presentations and the research papers detailing the work reported are included in these proceedings. The workshop concluded with a tutorial session on how to start an Experience Factory.

  15. Integration of case tools for software project management

    SciTech Connect

    Paul, R.; Shinagawa, Y.; Khan, M.F.

    1996-12-31

    Building and maintenance of high quality large software projects is a complex and difficult process. Tools employing software metrics are becoming an effective aid for management of such large projects. In this paper, we briefly trace the evolution of such tools from their beginnings up until the current trends of integrated CASE tools. We present a generic integrated CASE environment incorporating a formal set of software metrics with a suite of advanced analytic techniques. The proposed integrated CASE environment is an enhancement of currently used tools, and can enable more efficient and cost-effective management of large and complex software projects.

  16. Software for the EVLA

    NASA Astrophysics Data System (ADS)

    Butler, Bryan J.; van Moorsel, Gustaaf; Tody, Doug

    2004-09-01

    The Expanded Very Large Array (EVLA) project is the next generation instrument for high resolution long-millimeter to short-meter wavelength radio astronomy. It is currently funded by NSF, with completion scheduled for 2012. The EVLA will upgrade the VLA with new feeds, receivers, data transmission hardware, correlator, and a new software system to enable the instrument to achieve its full potential. This software includes both that required for controlling and monitoring the instrument and that involved with the scientific dataflow. We concentrate here on a portion of the dataflow software, including: proposal preparation, submission, and handling; observation preparation, scheduling, and remote monitoring; data archiving; and data post-processing, including both automated (pipeline) and manual processing. The primary goals of the software are: to maximize the scientific return of the EVLA; provide ease of use, for both novices and experts; exploit commonality amongst all NRAO telescopes where possible. This last point is both a bane and a blessing: we are not at liberty to do whatever we want in the software, but on the other hand we may borrow from other projects (notably ALMA and GBT) where appropriate. The software design methodology includes detailed initial use-cases and requirements from the scientists, intimate interaction between the scientists and the programmers during design and implementation, and a thorough testing and acceptance plan.

  17. Star Wars software debate

    SciTech Connect

    Myers, W.

    1986-02-01

    David L. Parnas, Lansdowne Professor of Computer Science at the University of Victoria, resigned from the SDI Organization's Panel on Computing in Support of Battle Management on June 28, 1985. Parnas, with 20 years of research on software engineering plus 8 years of work on military aircraft real-time software, says the software portion of SDI cannot be built error-free, and he doesn't expect the next 20 years of research to change that fact. Since Parnas resigned, there have been several public debates on Star Wars software questions. In November 1985 the SDIO panel from which Parnas resigned released a draft of its report, reflecting its effort to respond to critics of the project. While one might think that errors could be entirely eliminated with enough care and checking, most software professionals believe there will always be some residue of errors in a system of this size and complexity. The general line of the critics' argument is that the larger the amount of software in a single, unified system, the higher the percentage of errors it will contain. Proponents counter that the one very large system can be divided into a number of smaller, relatively independent pieces, thus reducing the proportionate number of errors in each separate piece. This approach is in turn countered by those who point to the intricate relations between these pieces, which themselves contribute to error.

  18. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e., workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effect of this change is that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.
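
    To give a flavor of the product metrics the record names (size, complexity, code changes), here is a small, assumption-laden sketch that computes line counts and a crude decision-point tally for Python sources; it is illustrative only and is not DSTD's actual instrumentation.

        # Crude product-metric collector: physical lines plus a decision-point
        # count as a rough complexity proxy. Purely illustrative.
        import ast

        def metrics(source: str) -> dict:
            tree = ast.parse(source)
            decisions = sum(isinstance(n, (ast.If, ast.For, ast.While, ast.Try))
                            for n in ast.walk(tree))
            functions = sum(isinstance(n, ast.FunctionDef)
                            for n in ast.walk(tree))
            return {"lines": len(source.splitlines()),
                    "decision_points": decisions,
                    "functions": functions}

        sample = """
        def f(x):
            if x > 0:
                for i in range(x):
                    print(i)
        """
        import textwrap
        print(metrics(textwrap.dedent(sample)))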

  19. Encyclopedia of software components

    NASA Technical Reports Server (NTRS)

    Vanwarren, Lloyd (Inventor); Beckman, Brian C. (Inventor)

    1991-01-01

    Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface, such as a keyboard or a mouse, for transmitting user selections. The system presents a picture of encyclopedia volumes with respective visible labels referring to types of software, in accordance with a metaphor in which each volume includes a page listing general topics under the software type of the volume and pages listing software components for each one of the general topics. The picture is altered to open one of the volumes in response to an initial user selection specifying that volume, displaying on the monitor a picture of its page of general topics; the picture is then altered to display the page listing software components under one of the general topics in response to a next user selection specifying that topic. Finally, a picture of a set of different informative plates depicting different types of information about one of the software components is presented in response to a further user selection specifying that component.

  20. Encyclopedia of Software Components

    NASA Technical Reports Server (NTRS)

    Warren, Lloyd V. (Inventor); Beckman, Brian C. (Inventor)

    1997-01-01

    Intelligent browsing through a collection of reusable software components is facilitated with a computer having a video monitor and a user input interface, such as a keyboard or a mouse, for transmitting user selections. The system presents a picture of encyclopedia volumes with respective visible labels referring to types of software, in accordance with a metaphor in which each volume includes a page listing general topics under the software type of the volume and pages listing software components for each one of the general topics. The picture is altered to open one of the volumes in response to an initial user selection specifying that volume, displaying on the monitor a picture of its page of general topics; the picture is then altered to display the page listing software components under one of the general topics in response to a next user selection specifying that topic. Finally, a picture of a set of different informative plates depicting different types of information about one of the software components is presented in response to a further user selection specifying that component.